I once read a heartbreaking story about a young woman who finished her MFA (Master of Fine Arts) in classical composition by listening to her master's thesis work, a full-length 40-minute symphony, played by a professional orchestra. It was the high point of her life to date.

In her essay, she wrote about the profound sadness she felt knowing she'd never hear it played onstage again. She'd probably never write another symphony, and even if she did, the only way she'd hear it would be through a synthesizer, in the privacy of her own home. Perhaps shared on SoundCloud or YouTube or even Spotify.

But she'd never hear it again the way orchestral music is meant to be heard. She was a scholarship student; she didn't have rich parents who could afford to put her on the international composition circuit, sponsor her at expensive music festivals, and get her in front of influential symphony directors.

Software development is undergoing an equally troubling phase change. As a programming language dilettante, I have the luxury of sitting around all day, trying out my ideas in Haskell or Racket, and then implementing them in some language that's more tractable for others, like Rust or C. And that's been true more or less since UCSD Pascal and the p-System were released in the late 1970s. All you needed was a computer. Just one. The one on your desktop.

But today, if you want to do anything substantial in the fundamental up-and-coming fields, such as artificial intelligence, data science, or cloud computing, you need far more money. You need to be able to rent several farms' worth of compute, and if you're doing anything at all interesting with AI, the cost of a single experiment can easily run into the hundreds, if not thousands, of dollars. Nobody except the wealthy can afford to do that, or to let their kids specialize in it.

Companies looking for that kind of experience just can't get it off the street. The only way startups get people with cloud experience is by tempting them away from an established company that put in the time and effort to train them. It's not available in schools, and you're certainly not going to teach yourself site reliability engineering or Kubernetes cloud deployments. Those skills come only from spending time with very large systems while someone else foots the bill.

What this means for the future of software development is anyone's guess, but I'm already starting to see an awareness that the educational pipeline needs some kind of revamping so that people can acquire these skills without wealthy sponsors.