What compound interest is to finance, automation is to programming. Time spent upfront to automate pays off with interest.
I sat down to run the math on how productive you could become by investing a bit of time in automation. Let's start with a conservative calculation of the time savings.
If writing a script saves you 5 minutes a day and you work an 8-hour day, you save roughly 1% of your work time (5 of 480 minutes, about 1.04%). If the next day you improve the script so it saves an extra 5 minutes on top of the 5 it already saves, the script now saves 10 minutes daily and the savings start to compound exponentially. The base of the exponential is small, only about 1.01, but it grows fast as time goes by. If you work only 40 weeks a year, 5 days a week, after 200 workdays this 1% of work time saved daily compounds to a productivity of 7.9x.
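The arithmetic is easy to check. Here's a minimal sketch, assuming each day's savings carry forward and another 5 minutes of an 8-hour day gets shaved off, so productivity compounds multiplicatively:

```python
# Each day the script saves an extra 5 minutes of an 8-hour (480-minute) day,
# and yesterday's gains carry forward, so productivity compounds.
MINUTES_PER_WORKDAY = 8 * 60  # 480

def productivity(minutes_saved_per_day, days):
    """Productivity multiplier after `days` workdays of compounding savings."""
    daily_rate = minutes_saved_per_day / MINUTES_PER_WORKDAY
    return (1 + daily_rate) ** days

# 5 minutes is just over 1% of the workday (5/480 ~ 1.04%).
# Over 200 workdays (40 weeks x 5 days) it compounds to roughly 7.9x.
print(round(productivity(5, 200), 1))  # 7.9
```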
This jump to 7.9x isn't as impressive once you account for the effort needed to produce it. It's exhausting to constantly automate. To add a new feature, most of the time you can't just call the script from the outside and be done with it; you have to change its internals to work differently. Automate for only a couple of months and productivity hasn't even doubled (it's only 1.86x), while the risk has gone up.
Because there is risk involved too. When something that moves forward at exponential speed breaks, it sets you back exponentially too. That means you must drop everything you're doing to go fix it immediately, however long it takes, possibly losing all the extra productivity gained so far. At high exponential gain, leaving a broken script unfixed is not an option. And the more you automate, the more there is to break. Time is both on your side and against you: the more of it passes, the higher the productivity, and the higher the chance that something breaks.
The power of automation is more obvious if one manages to save 10 minutes a day. Whatever the risk, after 200 days you become 61x more productive and it's probably worth it.
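Same formula as before, bigger daily increment; assuming 8-hour workdays again:

```python
# 10 minutes is 10/480 of the workday. Over 200 workdays it compounds
# to about 61x (61.8, truncated in the text above).
print(int((1 + 10 / 480) ** 200))  # 61
```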
It gets harder to argue against automation if one saves as much as 30 minutes a day, even if it comes at the cost of completely neglecting every other responsibility. If all you do is continue to automate the same script, and somehow the script continues to grow and eat up more of your time, in just a couple of months you are 38x more productive. Keep the pace of saving a whole 30 minutes each day and in 200 days you're 184,411x more productive which roughly equals the productivity of a small city. 
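Under the same compounding assumption, 30 minutes is 30/480 = 1/16 of the workday, so each workday multiplies productivity by 1.0625. Both figures check out:

```python
# 30 minutes saved per day is 1/16 of a 480-minute workday,
# so each workday multiplies productivity by 1.0625.
rate = 30 / 480  # 0.0625

print(round((1 + rate) ** 60))   # 38     -- after a couple of months (60 workdays)
print(round((1 + rate) ** 200))  # 184411 -- after 200 workdays
```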
This is an increase of five orders of magnitude in productivity: 100,000x. I doubt most people are ready to gauge if such a big jump is within reach. Especially if they've done no hard automation before. But I know it's a mistake to dismiss the possibility. A less frightening question to ask is: which productivity multiplier might be within reach? Some number must be.
The limits of productivity aren't a complete mystery. Some claim 100x productivity as the lower bound. Probably no one knows where the upper bound is today, or where it will be in the future. If I had to guess, I'd say it depends not just on the person but also on the nature of the problem being solved. The bottleneck often isn't a hard technical problem being unsolvable but some human getting in the way.
Note how conservative the calculation is. It excludes working weekends, and it comes with 12 weeks of not working at all. It's not working for two more days per week that will make a difference. It's automating that script to continually save more minutes each day. If you had to pick between continually automating like this for 200 days, or not automating but working a full 365 days, automating yields higher gains.
Because the key to productivity is achieving exponential savings, not working long hours. If the savings were merely additive, meaning you only ever saved 30 minutes per day, and you worked 365 days a year with no break, you'd save 22.8 days. That's a paltry 1.0625x in productivity gains. (A script that compounds 10 minutes a day surpasses this in just four days, at 1.09x.) It's better to work smarter than harder.
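The additive-versus-compounding contrast is quick to verify; this sketch again assumes 8-hour workdays:

```python
WORKDAY_MIN = 8 * 60  # 480 minutes

# Additive: a flat 30 minutes saved per day, 365 days a year.
saved_minutes = 30 * 365
print(round(saved_minutes / WORKDAY_MIN, 1))  # 22.8 workdays saved

additive_gain = 1 + saved_minutes / (365 * WORKDAY_MIN)
print(additive_gain)  # 1.0625

# Compounding: a script saving an extra 10 minutes each day
# passes that same multiplier by the fourth day.
print(round((1 + 10 / WORKDAY_MIN) ** 4, 2))  # 1.09
```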
After a certain point you could stop working entirely and still be productive. If you saved an extra 15 minutes a day for 5 months, losing 1x of productivity daily from that point on by not working doesn't hurt much in the bigger picture. You're on cruise control. There's a surplus of more than 1x of productivity coming out of the exponent, so the process is now self-sustaining. Just make sure you don't break anything.
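Here's a minimal sketch of where that point lies, assuming "self-sustaining" means the day-over-day gain from compounding exceeds the 1x of productivity given up by not working:

```python
# Find the first workday on which the compounding gain from the script
# exceeds the 1x of productivity lost by taking that day off.
rate = 15 / 480  # 15 minutes of an 8-hour day, 0.03125

day = 0
productivity = 1.0
while productivity * rate < 1:  # tomorrow's increment is productivity * rate
    productivity *= 1 + rate
    day += 1

print(day)  # 113 workdays, roughly five months at ~22 workdays a month
```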
The flurry of work involved in writing new software is partly due to not yet having reached the point on the exponential curve where computers do everything for you.
So if you're looking to become more productive with software, don't just try to do a lot. Also try to do nothing. Shoot for the point in the curve where exponential savings do all of the work for you.
It's rare to keep saving 30 minutes a day for 200 days straight from the same script. At some point the work the script automates stops eating up your time, and something else takes its place. When that happens, switch focus and start automating that instead.
What makes the limits hard to gauge is that while we know a lot about the productivity of the people who came before us, and can compare our productivity to theirs, we don't know as much about the nature of the problems ahead. Technological innovation in the past century erupted unlike in any other era. The nature of the problems solved now is unlike that of those solved centuries ago.
When asked to do something manually, I used to find ways to give myself two or three days as a buffer. I'd finish stuff and not tell anyone, so I could say "sure, as soon as I finish x" to buy myself that buffer. Or I'd pick a task that takes at least three days if done manually, to have enough time to automate it.
I didn't know why three days worked so well until I ran the math here. Save 10 minutes a day, compounding, and by the fourth day the automation catches up.