The NJI development team recently participated in DrupalCon NA in Baltimore. It was a great, well-attended event, and provided a snapshot of the general state and tone of the open source web development industry at large. The stand-out subjects were those that cleverly conceal complexity underneath: Virtual Reality, Artificial Intelligence, and DevOps.
With big releases from Oculus, HTC, Samsung, and Google, VR is buzzing. I attended a great demo and talk by Wes Ruvalcaba and David Burns from Lullabot about WebVR — an open standard designed to process input from a headset or hand controllers in order to create an immersive experience in a standard browser. Mozilla is also working on MozVR, which is very promising and accessible.
A-Frame is a WebVR framework for building VR and AR web applications in markup. There are lots of really fun possibilities here, pushing web development into new spaces. I can’t wait to dive into this further.
I also enjoyed seeing what some teams are doing already with Artificial Intelligence — how it can be integrated with Drupal to create better, more personalized experiences on the web. Drupal can even be integrated with Amazon’s Alexa voice assistant.
Both AI and VR share a common thread — these technologies take highly complex logical algorithms or three-dimensional processing and attempt to disguise them behind simple, intuitive, natural interfaces. Note that we aren’t removing complexity — rather, we’re building interfaces on top of inherently very complex things, using automation and design to engineer the experience of simplicity for the user.
Of course, Artificial Intelligence is the attempt to pull humanity out of a machine, but what of the reverse? Will we one day be able to digitize human consciousness — to put humanity into a machine?
In a way, I think we already have. Let me explain, by a diversion into Web Development history.
“– Last Updated By: Dave, Dec 3, 1997”
Recall the Age of Webmasters. A webmaster was the person who could adapt a document for presentation on a website. You sent them text and pictures, and they would “mark it up” in HTML.
There was just one more thing: the webmaster needed to make those edits “live” somehow. The method was FTP: File Transfer Protocol. There were various ways to do this — most professionals preferred a desktop program called Dreamweaver, though one could FTP directly from Windows File Explorer or the Mac Finder.
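For the curious, the entire 1997 “deploy pipeline” amounted to something like this sketch, here written with Python’s standard `ftplib` (the hostname, credentials, and `/public_html` directory are hypothetical placeholders, not anything from a real setup):

```python
from ftplib import FTP
from pathlib import Path

def remote_path_for(local_file: str, remote_root: str = "/public_html") -> str:
    """Map a local file to its destination path on the web server."""
    return f"{remote_root}/{Path(local_file).name}"

def publish(local_file: str, host: str, user: str, password: str) -> None:
    """Upload one HTML file so the edit goes 'live' -- the whole deploy."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_file, "rb") as fh:
            ftp.storbinary(f"STOR {remote_path_for(local_file)}", fh)

# Usage (hypothetical server and credentials):
# publish("site/index.html", "ftp.example.com", "dave", "hunter2")
```

Overwrite the file in place, and the site is updated. No staging, no history, no rollback — which is exactly why this paradigm didn’t last.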
“I know Kung Fu”
Of course, this paradigm is long dead. You probably can’t FTP HTML files up to your website anymore. But the weird thing is — your website is still the same stuff: HTML, CSS, and JS. What gives?
Content producers wanted freedom and webmasters got lazy. Webmasters created ways for people to modify some of the content, which evolved into the modern Content Management System. But now there’s a whole new component — the CMS itself, the “back end”. This is Real Code™. This is a program that has to be installed and executed on the server. Now more things can go wrong, and the server has to work harder. Those are new problems that have to be solved with more layers. As more conveniences and interfaces were created, more security considerations became necessary. At this point, the webpage isn’t an HTML file anymore. Instead, there’s a program generating what used to be an “HTML file”. This is a good thing. It’s dynamic now, and we wanted that.
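To make that concrete, here is a toy sketch (not any real CMS — the content and structure are invented for illustration) of a program generating, at request time, what used to be a static HTML file:

```python
# Toy stand-in for the CMS content database; the entries are hypothetical.
CONTENT = {
    "about": {"title": "About Us", "body": "We make websites."},
}

def render_page(slug: str) -> str:
    """Generate the 'HTML file' on demand from stored content."""
    page = CONTENT[slug]
    return (
        "<html><head><title>{title}</title></head>"
        "<body><h1>{title}</h1><p>{body}</p></body></html>"
    ).format(**page)
```

An editor changes a row in the data store, and every future request reflects it — no webmaster, no FTP. That convenience is exactly the new layer of Real Code™ the server now has to run.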
Notice: our webmasters achieved total digitization. Look around your organization. Have a website? Yep. See any webmasters? Nope. Your webmasters have become incorporeal techno-entities. Your CMS is your webmaster now — you just give it your content, images, and instructions, and it produces your webpage for you.
Ever curse a website or CMS? Probably. That’s personification, and it hints at what I’m getting at here.
Are all the human webmasters out of work? Nope. They became Web Developers. They author the Real Code™ that is the webmaster. Somebody has to program the webmaster to do exactly The Right Things.
It’s not just the website that got more complicated. We now expect to have private, synchronized staging environments, detailed code histories, automated tests, feature deployments impervious to human error, performance optimization, and more. The entire development ecosystem is more complex.
“The server” probably isn’t just a server anymore. It could be an indeterminate number of dynamically-allocated-app-server-containers connected to a highly-available-database-cluster behind load-balancers and reverse-proxy-edge-servers and a globally-distributed-CDN plus however CloudFlare works, to describe a fairly common setup. Files are transferred by a git-push post-commit-hook continuous-integration build rsync feature-toggle dark-launched “deploy operation”. Or something. That’s more-or-less true, deliberately convoluted to sound impressive.
But seriously, every little part serves a good purpose — something you would miss. Environments have matured. Relatively speaking, they can be pretty complicated. A developer (probably) can’t just open “the file” and press save anymore. They may have to fork the project code from a repository, build a local development stack using the production server “recipe” (this invention overcame the “well, it worked on my computer” woe). They might have to copy some “downstream” production content and assets, so they can see what they’re doing. They have to fire up development task runners, do the work, commit it, tag it, push it, test it, merge it, make it stronger.
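The steps above could be sketched as a setup script. Everything here is a hypothetical placeholder (the repo URL, the content-pull script, the choice of Docker and npm) — the point is only that “open the file and press save” has become an ordered sequence of tooling steps:

```python
import subprocess

# Hypothetical project setup steps; each list is a command to run in order.
SETUP_STEPS = [
    ["git", "clone", "git@example.com:acme/site.git", "site"],  # fork/clone the project code
    ["docker", "compose", "up", "-d"],          # build a local stack from the production "recipe"
    ["scripts/pull-content.sh", "production"],  # copy downstream content and assets (hypothetical script)
    ["npm", "run", "watch"],                    # fire up the development task runners
]

def run_steps(steps, runner=subprocess.run):
    """Run each step in order; check=True stops on the first failure."""
    for cmd in steps:
        runner(cmd, check=True)
```

The `runner` parameter is just dependency injection so the sequence can be exercised without real tools installed — which is itself a small taste of the test-minded habits this whole ecosystem encourages.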
There are, of course, great tools that get out of the way while in service of more advanced needs. These allow us to stay focused on creating great stuff and making clients happy. Usually once you get everything happily whirring away, it’s pretty phenomenal. Except when it isn’t, because sometimes computers. Inevitably, developers will get stuck wrenching on the wrenches themselves.
There isn’t just a barrier to getting initialized on a new project. The bar has also been raised on entering the profession. Unless each developer is a cog in a perfectly tuned machine, that developer probably can’t merely know HTML and CSS. They probably have to know Git. And Sass. And Node. Grunt. Gulp. Webpack. ES6. Babel. Vagrant. Docker.
DevOps tunes that machine to reduce the burden. DevOps is the work that makes other work more efficient and more reliable. Like AI and VR, DevOps tries to conceal complexity through automation; it makes intuitively simple tasks actually feel simple again. There’s a ton of fascinating stuff going on here.
DevOps at DrupalCon NA 2017
DrupalCon 2017 in Baltimore featured 150 sessions by the best and brightest in our industry. Here are a select few of the session subjects:
- Continuous Integration x 3
- Visual Regression Testing Automation x 2
- Full Stack Test Automation
- Build and Launch Tools
- Predictable Continuous Delivery
- DevOps Transformation
- Improving Dev Workflow
- Dev Workflow Tools
- Containers (Docksal, Wiring, Kontena) x 4
- Composer (Package Management)
- Deployment Automation
- Basic DevOps
There is a hefty DevOps presence, and for good reason: it’s important.
There’s an old saying, “Fast, Cheap, and Good: Pick Two”. Well-executed DevOps is like magic dust that gives you all three. It’s fast and good because it reclaims lost time and improves quality. It can be affordable too, if you take advantage of the open source tools available, and leverage good products from a hosting partner.
DrupalCon 2017 also attracted scores of vendors offering solutions to the pressure points developers all recognize. Strolling around the exhibit hall, one could see that the competition has become pretty tight. DevOps and opinionated, integrated development workflow tools are perhaps the best way for competitors to distinguish themselves in this space.
At NJI, we’re constantly striving to make uncommonly great things, and to do it as efficiently as we can. We’re looking forward to mastering and contributing to the next generation of tools and processes to make science feel like magic. We think when we’re doing it right, most people won’t even know it’s there.