> As of 2025, the time needed to earn $1 is 63 minutes in the US.
Confused, I clicked one of the links and tried to understand. Found this:
> The time to get $1 refers to a day of life for anyone at any age and in any circumstance, not just the hours worked by someone with a job.
Clicking another link took me to the abstract at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4785458 but that didn't answer any questions either.
I can't find anything of real substance in this, other than someone trying to redefine a lot of terms in confusing ways.
$1 every 63 minutes would be $8343/year. I cannot think of any way to reconcile that with the US average household income or any other related figure.
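For what it's worth, the conversion itself is easy to check; a minimal sketch:

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes, counting every hour of the year

minutes_per_dollar = 63
dollars_per_year = MINUTES_PER_YEAR / minutes_per_dollar
print(round(dollars_per_year))  # 8343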
So let's say you're Elon Musk and it takes you a negligible amount of time to earn $1, so we can say that t_Elon = 0.
Now say you are way below the poverty line and earn $6,000/year. This means t_Poor ≈ 87.6 mins.
If we average a population of 80 t_Poor and 20 t_Elon, we get 0.8 x 87.6 mins ≈ 70 mins, even though the average income in this case would be 0.2 x income_Elon, something like $7 billion/year.
I hope this shows why you can't just take the inverse to get the average income. The only way that would be true is if everyone earned exactly the same income.
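A quick sketch of that arithmetic (income_Elon = $35B/year is an assumed figure here, chosen so that 0.2 x income_Elon lands near the $7B mentioned):

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes, counting every hour of life

income_poor = 6_000            # $/year, from the example above
income_elon = 35_000_000_000   # $/year, assumed for illustration

t_poor = MINUTES_PER_YEAR / income_poor  # ~87.6 minutes per dollar
t_elon = MINUTES_PER_YEAR / income_elon  # effectively zero

avg_time = 0.8 * t_poor + 0.2 * t_elon              # ~70 minutes per dollar
avg_income = 0.8 * income_poor + 0.2 * income_elon  # ~$7 billion/year

print(round(avg_time), round(avg_income))  # 70 7000004800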
Why is this a better metric?
The average income is biased towards big earners, while this metric is more centered around the mode of the distribution (poor people).
It captures the income distribution much better than average income.
If you do want to use an average, you'd at least need to remove 10% from both the top and bottom before calculating it, but it's still going to be super untrustworthy, as the sketch below shows.
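A minimal sketch of that trimmed mean, reusing the hypothetical 80/20 population from upthread (the $35B figure is an assumption there as well):

def trimmed_mean(values, trim=0.10):
    """Mean after dropping the top and bottom `trim` fraction of values."""
    ordered = sorted(values)
    k = int(len(ordered) * trim)
    kept = ordered[k:len(ordered) - k]
    return sum(kept) / len(kept)

incomes = [6_000] * 80 + [35_000_000_000] * 20
print(f"{trimmed_mean(incomes):,.0f}")  # ~4,375,005,250 -- still dominated by the top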
Not sure what to take away from your comment, I'm still unsure what kind of metric you're pitching and why it'd be a valuable thing to track
It's focused on the very poorest, who are not the mode. (Income distribution is approximately lognormal; see https://www.researchgate.net/figure/The-lognormal-distributi...).
Say you have 10 people: one making $800/year, 8 making $80k/year, and one evil billionaire making $800 million. Their times to earn $1 are respectively 10 hours, 0.1 hours, and essentially zero. If you take the arithmetic mean of that you get 1.08 hours, and that's dominated by the single poor person. If you double that person's income to $1600, then they're at 5 hours to earn $1, and the overall average is nearly cut in half to 0.58. Meanwhile you can reduce the income of all the middle-class people to $40k and not much changes; the average time to $1 would be (5 + 8(0.2) + 0)/10 = 0.66.
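A quick check of those numbers, assuming ~8,000 hours per year for round figures (which is what the 10-hour and 0.1-hour values imply):

HOURS_PER_YEAR = 8_000

incomes = [800] + [80_000] * 8 + [800_000_000]
hours_per_dollar = [HOURS_PER_YEAR / x for x in incomes]
print(round(sum(hours_per_dollar) / len(hours_per_dollar), 2))  # 1.08

# Double the poorest person's income and the average nearly halves:
incomes[0] = 1_600
hours_per_dollar = [HOURS_PER_YEAR / x for x in incomes]
print(round(sum(hours_per_dollar) / len(hours_per_dollar), 2))  # 0.58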
> It captures the income distribution much better than average income.
Not really, and certainly not better than median income which is what people typically use. It tries to measure exactly how little income the very poor make, which is not normally what people mean when they talk about inequality or poverty, and also hard to measure at the accuracy that you need when small changes produce huge swings in the result. In particular I don't believe he's correctly accounted for government benefits; hardly anyone in the US is consuming less than $8000/year.
In particular, it seems weird that only the US had a massive change during COVID.
Also seems a little odd that Germany was always better than the US, even in the 90’s when things were pretty good here.
Putting it together, we need to have COVID all the time here, so we can match the economic development of Germany immediately post-reunification.
It is not weird if you were old enough to be aware of the news during that time. Poor people in the US suddenly coming into money and being lifted out of poverty thanks to COVID stimulus checks was front and center in the news cycle as it was happening. The other countries noted did not follow the same "hand out free money" approach. Their safety nets were built around maintaining continuity during COVID.
A lot was written about the stimulus checks, but they were so small as to hardly matter. A $1200 check isn't going to suddenly lift a lot of people out of poverty and keep them there, even though it could be make-or-break in a few selected cases.
The bigger change was that the American economy was basically turbocharged by all of the interventions going on. Remember "The Great Resignation" when everyone was changing jobs because all the companies were hiring as fast as they could? It was an ideal time to move your way into a better position in the job market.
"A lot" is subjective, I suppose. Concretely, it lifted 11.7 million Americans[1] out of poverty. That makes up approximately 30% of those who were in poverty prior to the stimulus.
[1] https://www.commondreams.org/news/2021/09/14/incredible-covi...
A lot of things changed during that time, notably the job market. Getting a new job that paid $1/hour more would be more impactful than a $1200 stimulus check. People were getting raises much bigger than that.
The checks were not the primary driver of the economic changes.
The article doesn't do anything other than quote the US Census Bureau.
Obviously you will have already read the citation in full, but for everyone else here is the full quote: "Stimulus payments, enacted as part of economic relief legislation related to the COVID-19 pandemic, moved 11.7 million individuals out of poverty. Unemployment insurance benefits, also expanded during 2020, prevented 5.5 million individuals from falling into poverty."
Again, this is from the US Census Bureau. It is being asserted in an official government capacity, from a governmental organization that has access to all the relevant data. If you think they got something wrong, you're going to have to offer something more compelling than some random theory you made up on the spot.
This one needs a little common sense. A one-time $1200 stimulus check is not going to lift 11.7 million individuals out of poverty in any meaningful sense, unless you're literally just looking at people within $1200 of an arbitrary cutoff and saying you "lifted them out of poverty" by bumping them over that threshold for the year.
I wish this sparked a conversation on how we can do better instead of national dick-measuring contests. Those don't help.
Median workers in the US have some of the highest hourly wages at PPP in the rich world and they have been increasing, but they are pretty similar to those in Germany. The big difference in annual pay at PPP is down to hours worked.
For 2022, average annual hours worked per worker in the US was 1,790, while in Germany it was 1,340 [1]. Meanwhile, average hourly wages at PPP were $34.9 in the US vs $34.6 in Germany [2].
[1] https://ourworldindata.org/grapher/annual-working-hours-per-...
[2] https://ourworldindata.org/grapher/average-hourly-earnings
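Multiplying those figures out shows where the annual gap comes from:

# Annual earnings at PPP ~ average hourly wage x average annual hours (2022 figures above)
us_hours, us_wage = 1790, 34.9
de_hours, de_wage = 1340, 34.6
print(round(us_hours * us_wage))  # ~62,471 international $/year in the US
print(round(de_hours * de_wage))  # ~46,364 international $/year in Germany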
This means using PPP doesn't actually show where the level of precarity is.
https://fred.stlouisfed.org/series/LES1252881600Q
There is a huge mismatch between perception and data. I wonder whether some costs are just more pertinent?
So IIUC this "average poverty" (measured in time per international dollar) includes people living off social welfare? Otherwise, if it only included the working population, wouldn't we have
average poverty ≝ (average yearly income* of the working population / 1yr)⁻¹
and so it should be inversely proportional to the average yearly income* metric mentioned in the article?

*) Adjusted for purchasing power, i.e. measured in international dollars.
>For these purposes, income includes earnings from work, government benefits and other sources of money, and it is averaged among all family members.
Yes, it is supposed to include income from all sources.
https://theconversation.com/measuring-poverty-on-a-spectrum-...
>>> import statistics
>>> 1/statistics.mean([10,30,100])
0.02142857142857143
>>> statistics.mean([1/10, 1/30, 1/100])
0.04777777777777778

average poverty ≝ average(1 / annual income)
So it's inversely proportional to the harmonic mean of yearly income, not to the arithmetic mean.
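The same statistics module confirms the identity:

>>> import statistics
>>> round(statistics.harmonic_mean([10, 30, 100]), 6)
20.930233
>>> round(1 / statistics.mean([1/10, 1/30, 1/100]), 6)
20.930233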
I would be very interested to find out how those stats are related to things like Gini or old pre-GDP economic measures of raw production.
The "old" way was to measure median net PPP per capita, which makes more sense to me:
https://upload.wikimedia.org/wikipedia/commons/8/85/Annual_m...
[1] https://en.wikipedia.org/wiki/List_of_countries_by_average_a...
The goal of increasing work productivity must be to produce the same by working less, not to work the same in order to make higher profits for a negligible part of society.
People in the US are so close to financial disaster that, to avert it, the US had to heavily subsidize those out of work. Many people got healthcare and unemployment benefits that would not otherwise have been available. This meant money for zero hours of work. When you average in dollars earned over zero hours of work, it does crazy things to the graph.
The reality is: during COVID the US rapidly adopted safety nets similar to those of EU countries and, in effect, aligned with their levels of poverty. Once the emergency measures ended, we snapped back to our previous, precarious poverty level.
Just my theory.
In addition, anecdotally, everyone I know in the EU who had a job pre-COVID has a job today. I can't say the same about folks in the US.
I.e. making the economy more like the US.
You are not even responding to anything in my post. Please try again.
That seems like a complicated way to "talk about median income without talking about median income". By the end, they do describe the basic situation: the US has greater total wealth and total income, but that wealth and income are so unequally distributed that more people are poor.
I get the "international" part - purchasing power. The number still seems way off, though.
In a time when minimum wage is $7/hr, how is the average American earning $1/hr?
Can anyone make that number make any sense?
International dollars are normalized to USD, so there’s no conversion necessary. The figure he quotes of 63 min per dollar converts to $8343/year. However, his original paper states that he created this measure by inverting income, so the number 8343 is his starting point.
The closest guess I have is that it is derived from the poverty line for a family of four, $32,150 (which divided by four is $8,037.50).
If that is the case, what he is really doing is comparing poverty line definitions between countries.
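For reference, the arithmetic behind that guess:

# "63 minutes per dollar" implies roughly this annual income:
implied_income = 365 * 24 * 60 / 63  # ~8,343 international $/year
# The family-of-four poverty line quoted above, per person:
per_capita_line = 32_150 / 4         # 8,037.50 $/year
print(round(implied_income), per_capita_line)  # 8343 8037.5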
So it's how much you earn per day divided by 24, or maybe yearly earnings divided by the hours in a year?
Think of it this way: it's like the difference between median and average income. The larger the inequality, the larger the gap between median and average.
> Virtually everyone would agree that a 20-meter tree is twice as tall as a 10-meter tree. Conversely, everyone would agree that the 10-meter tree is twice as short as the 20-meter tree. There is no threshold or “shortness line” above or under which these relationships cease to hold: a 5-meter tree is twice as short as a 10-meter tree, a 1-meter tree is twice as short as a 2-meter tree, and so on. This reasoning remains valid when considering other multiples: a 1-meter tree is three times shorter than a 3-meter tree. To be sure, when assessing the height of a single tree, different people may disagree whether it is short or tall, as their judgment will depend on the benchmark they use for their assessment. However, when comparing two different trees, virtually everyone would make similar cardinal comparisons. In mathematical terms, shortness is the reciprocal of tallness. [...] In this paper, I apply the same logic to define a new poverty measure
I'm still trying to figure out how he reached the conclusion that it takes 63 minutes to earn $1 in the US