Probably any scientist who went through college starting sometime in the early 2000s knows how to program to some degree. They may have learned to write code to do data analysis, to implement simulations, or simply for fun. In my own case, I have been interested in computer programming since middle school, when I first taught myself Visual Basic and HTML (I know, I know, HTML is not a programming language, but it did get me interested in programming).
Now, as a researcher, I often have to temper my enthusiasm for writing programs. It's true that creating scripts to automate data analysis and processing will save me a lot of time in the long run, but I also think that tweaking my code too much or trying to engineer it to do too many things may be counter-productive. After all, code is a means to an end, and my goal is not to write as many programs as possible.
So what's the right amount of time that should be invested in writing code to help with everyday laboratory tasks? This is a tough but important question for scientists to ask themselves, since putting in just the right amount of time should maximize their productivity.
And beyond maximizing productivity, scientists should also dedicate time to writing code that makes it easier to document and reproduce what they did. For example, I have recently written scripts to take the output of a curve fit routine and write a report that includes the fitting bounds, initial guesses, fit functions, and other data that is relevant to reproducing a fit. Hopefully, a future grad student can read my automatically-generated reports and figure out exactly what was done.
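As a rough illustration of the idea, here is a minimal sketch of what such a reporting script could look like in Python. The use of scipy.optimize.curve_fit, the function names, and the report fields are my assumptions for the example, not a description of the actual scripts mentioned above.

```python
# Sketch: fit data and write a plain-text report recording everything
# needed to reproduce the fit (bounds, initial guesses, fit function, results).
import inspect
from datetime import datetime

import numpy as np
from scipy.optimize import curve_fit


def fit_and_report(fit_func, xdata, ydata, p0, bounds, report_path):
    """Run a curve fit and write a report documenting inputs and outputs."""
    popt, pcov = curve_fit(fit_func, xdata, ydata, p0=p0, bounds=bounds)
    perr = np.sqrt(np.diag(pcov))  # 1-sigma parameter uncertainties

    with open(report_path, "w") as report:
        report.write(f"Curve fit report ({datetime.now().isoformat()})\n\n")
        report.write("Fit function source:\n")
        report.write(inspect.getsource(fit_func) + "\n")
        report.write(f"Initial guesses: {list(p0)}\n")
        report.write(f"Fitting bounds: lower={list(bounds[0])}, "
                     f"upper={list(bounds[1])}\n")
        report.write(f"Best-fit parameters: {popt.tolist()}\n")
        report.write(f"Parameter uncertainties: {perr.tolist()}\n")

    return popt, perr


# Example usage with a hypothetical exponential-decay model
def decay(t, amplitude, rate):
    return amplitude * np.exp(-rate * t)


t = np.linspace(0, 10, 50)
y = decay(t, 2.0, 0.5) + 0.05 * np.random.default_rng(0).normal(size=t.size)
fit_and_report(decay, t, y, p0=[1.0, 1.0], bounds=([0, 0], [10, 10]),
               report_path="fit_report.txt")
```

The point isn't the particular fields in the report; it's that a few extra lines turn a throwaway analysis into something a future reader can retrace.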
So, in the end, time spent coding is time well invested when it significantly cuts down on repetitive tasks and when it streamlines documentation and note taking that might not happen otherwise.