Time to Remodel the Kitchen?

Although determining a full and realistic corporate valuation is a task I’ll leave to people of sterner stuff than I (since Facebook went public, not many could begin to speculate on the bigger picture of even small-enterprise valuation), I’ve recently been working with a few clients who have reminded me of why one sometimes needs to remodel.

Nowadays, information technology is often seen as a means to an end. It’s a necessary evil. It’s overhead to your real business. You joined the technological revolution, and your competitors who didn’t, well… sank. Or… you entered the market with the proper technology in place and, seatbelt fastened, have taken your place in the market. Good for you. You’ve got this… right?

I’m a software system architect. I envision and build out information technology. I often like to model ideas around analogies to communicate them, because it takes the tech jargon out of it. Now that I’ve painted the picture, let’s think about what’s cooking behind the office doors.

It’s been said that the kitchen is the heart of the home. When it comes to the enterprise (big and small), your company’s production might get done in the shop, but sooner or later, everyone gets fed business processes, which are often cooked in the kitchen of technology. In fact, technology is so integral to what many companies do nowadays that it’s usually hard to tell where, in your technology stack, business and production processes begin. Indeed, processes all cycle back around, and they almost certainly end with information technology again.

Truly, we’ve come a long way since the ’70s, when implementing any form of “revolutionary” information technology was the basis of a competitive advantage. Nowadays, if you don’t have information technology in the process somewhere, you’re probably only toying with a hobby. It’s not news. Technology graduated from a revolutionary competitive advantage to the realm of commoditized overhead well over a decade ago.

Ok… ok… You have the obligatory kitchen in your home. So what?

If you think of the kitchen in your home as commoditized overhead, you’re probably missing out on the even bigger value an update could bring you at appraisal time. Like a home assessment, due diligence as part of corporate valuation will turn up the rusty mouse traps behind the avocado refrigerator and under the porcelain sink:

  • Still rocking 2000 Server with ActiveX?
  • ColdFusion skills are becoming a specialty; the local talent pool is probably thin, and finding resources to maintain those components could get expensive.
  • Did you say you can spell iSeries? Great, can you administer it?
  • No one’s even touched the SharePoint Team Services server since it was installed by folks from overseas.
  • The community that supported your Open Source components… dried up?
  • Cloud SLAs, Serviceability?
  • Compliance?
  • Disaster Management?
  • Scalability?
  • Security?
  • Documentation…?
    • Don’t even go there.

As you can see… “Everything but the kitchen sink” no longer applies. The kitchen sink is transparently accounted for as well. A well-designed information technology infrastructure needs to go beyond hardware and software. It considers redundancy and disaster management, security, operating conditions (such as room to operate and grow), and, of course, whether any undue risks or burdens are placed on particular technologies, vendors, or even employees. Full valuation goes further, looking outside the walls to cloud providers and social media outlets. Finally, no inspection would be complete without a look at compliance.

If your information technology does not serve your investors’ needs, your CEO’s needs, your VP of Marketing and Sales’ needs, as well as production’s… but most importantly your customers’, your information technology is detracting from the valuation of your company.

If the work has been done, due diligence will show off the working utility, maintainability, security, scalability, and superior added value of the well-designed enterprise IT infrastructure refresh.

To elaborate on that, a good information technology infrastructure provides a superior customer experience no matter how a customer chooses to interact with your company. Whether it’s at the concierge’s counter, in the drive-through, at a kiosk, on the phone, at your reseller’s office, in a browser or mobile app, your customers should be satisfied with their experience.

Don’t stop with simply tossing dated appliances and replacing them. Really think about how the technologies work together, and how people work with them. This is key… if you take replacement appliances off the shelf and simply plug them in, you are (at best) merely keeping up with your competitors. If you want the full value add, you need to specialize. You need to bend the components to your processes. It’s not just what you’ve got.  It’s how you use it.  It’s the critical difference between overhead and advantage.

Maybe the Augmented Reality Kitchen won’t provide a good return on investment (yet), but… there’s probably a lot that will.

Paying Too Much for Custom Application Implementation

Face it. Even if you have a team of entry-level coders implementing custom application software, you’re probably still paying too much.

Here’s what I mean:

You already pay upfront for foolproof design and detailed requirements. If you leverage more technology to implement your application, rather than spending more on coders, your ROI can go up significantly.

In order for entry-level coders to implement software, they need extra-detailed designs. Such designs typically must be detailed enough that a coder can simply repeat patterns and fill in blanks from reasonably structured requirements. Coders make mistakes, suffer misunderstandings and other costly failures, and take months to complete the work (assuming nothing in the requirements changes during that time).

But, again… if you have requirements and designs that are already sufficiently structured and detailed… how much more effort is it to get a computer to repeat the patterns and fill in the blanks instead? Leveraging technology through code generation can help a lot.

Code generation becomes a much less expensive option in cases like that because:

  • There’s dramatically less human error and misunderstanding.
  • Generators can do the work of a team of offshored implementers in moments… and repeat the performance over and over again at the whim of business analysts.
  • Quality Assurance gets much easier… it’s just a matter of testing each pattern, rather than each detail. (And while you’re at it, you can generate unit tests as well.)
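
To make “repeat the patterns and fill in the blanks” concrete, here’s a minimal sketch of template-driven generation. The data dictionary format, templates, and entity names below are hypothetical and purely for illustration; real generators work from much richer metadata:

```python
# Minimal illustration of "repeat the pattern, fill in the blanks."
# The data dictionary format and templates here are hypothetical --
# a sketch of the idea, not any particular project's tooling.

DATA_DICTIONARY = [
    {"entity": "Customer", "fields": [("Id", "int"), ("Name", "str"), ("Email", "str")]},
    {"entity": "Order",    "fields": [("Id", "int"), ("CustomerId", "int"), ("Total", "float")]},
]

CLASS_TEMPLATE = """class {entity}:
    def __init__(self, {args}):
{assignments}
"""

def generate_class(item):
    """Fill the class template in from one data dictionary entry."""
    args = ", ".join(name.lower() for name, _ in item["fields"])
    assignments = "\n".join(
        f"        self.{name.lower()} = {name.lower()}" for name, _ in item["fields"]
    )
    return CLASS_TEMPLATE.format(entity=item["entity"], args=args, assignments=assignments)

def generate_test(item):
    """Same pattern, different template: emit a matching unit test per entity."""
    names = [name.lower() for name, _ in item["fields"]]
    args = ", ".join(f"{n}={i}" for i, n in enumerate(names))
    checks = "\n".join(f"    assert obj.{n} == {i}" for i, n in enumerate(names))
    return f"def test_{item['entity'].lower()}():\n    obj = {item['entity']}({args})\n{checks}\n"

if __name__ == "__main__":
    for entry in DATA_DICTIONARY:
        print(generate_class(entry))
        print(generate_test(entry))
```

Change an entry in the data dictionary and the classes and their tests regenerate in moments; that’s the whole economic argument in miniature.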

Code generation is not perfect: it requires very experienced developers to architect and implement an intelligent code generation solution. Naturally, such solutions also tend to require experienced people to maintain them, because in sufficiently dynamic systems there will always be implementation pattern changes. There’s also the one-off stuff that just doesn’t make sense to generate… (but that all has to be done anyway).

Actual savings will vary (and in some cases may not be realized until a later iteration of the application), but they typically depend on how large your metadata (data dictionary) is, how well it’s structured, and how well your designs lend themselves to code generation. If you plan for code generation early on, you’ll probably get more out of the experience. Trying to retrofit generation can definitely be done (been there, done that, too), but it can be painful.

Projects I’ve worked on that used code generation happened to focus generation techniques mostly on database and data access layer components and/or UI.  Within those components, we were able to achieve 75-80% generated code in the target assemblies.  This meant that from a data dictionary, we were able to generate, for example, all of our database schema and most of our stored procedures, in one case.  In that case, for every item in our data dictionary, we estimated that we were generating about 250 lines of compilable, tested code.  In our data dictionary of about 170 items, that translated into over 400,000 lines of  code.
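
As a rough sketch of what that schema-and-procedure generation can look like (the data dictionary entries, type mapping, and naming conventions below are hypothetical and far simpler than a real project’s metadata):

```python
# Rough sketch: generating database DDL and a simple stored procedure
# from a data dictionary. Entries, type mapping, and naming conventions
# are hypothetical and greatly simplified for illustration.

TYPE_MAP = {"int": "INT", "str": "NVARCHAR(255)", "float": "DECIMAL(18,2)", "date": "DATETIME"}

DATA_DICTIONARY = {
    "Customer": [("Id", "int"), ("Name", "str"), ("Email", "str")],
    "Order": [("Id", "int"), ("CustomerId", "int"), ("Total", "float"), ("Placed", "date")],
}

def generate_table(name, fields):
    """Emit a CREATE TABLE statement for one data dictionary item."""
    cols = ",\n".join(f"    [{col}] {TYPE_MAP[typ]} NOT NULL" for col, typ in fields)
    return f"CREATE TABLE [{name}] (\n{cols},\n    CONSTRAINT [PK_{name}] PRIMARY KEY ([Id])\n);\n"

def generate_get_proc(name, fields):
    """Emit a 'get by id' stored procedure for the same item (assumes an Id column)."""
    select_list = ", ".join(f"[{col}]" for col, _ in fields)
    return (
        f"CREATE PROCEDURE [usp_Get{name}] @Id INT AS\n"
        f"    SELECT {select_list} FROM [{name}] WHERE [Id] = @Id;\n"
    )

if __name__ == "__main__":
    for entity, fields in DATA_DICTIONARY.items():
        print(generate_table(entity, fields))
        print(generate_get_proc(entity, fields))
```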

By contrast, projects where code generation was not used generally took longer to build, especially in cases where the data dictionaries changed during the development process. There’s no solid apples-to-apples comparison, but consider hand-writing about 300,000 lines of UI code while the requirements are changing. Trying to nail down every detail (and change) by hand was a painstaking process, and the changes forced us to adjust the QA cycle accordingly, as well.

Code generation is not a new concept. There are tons of tools out there, as demonstrated by Wikipedia’s comparison of code generation tools. Interestingly, some of the best tools for code generation can be as simple as XSL transforms (which opens the tool set up even more). Code generation may also already be built into your favorite dev tools. For example, Microsoft’s Visual Studio has had a code generation utility known as T4 built into it for the past few versions now. That’s just scratching the surface.
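
To give a taste of the XSL-transform approach, here’s a tiny hypothetical example: an XML data dictionary run through an XSLT stylesheet to emit property stubs. Python’s lxml library is used only to keep the example self-contained; any XSLT processor does the same job, and the XML format and output are made up for illustration:

```python
from lxml import etree  # third-party: pip install lxml

# A tiny, hypothetical XML "data dictionary"...
DICTIONARY_XML = """\
<entities>
  <entity name="Customer">
    <field name="Id" type="int"/>
    <field name="Name" type="string"/>
  </entity>
</entities>
"""

# ...and an XSLT stylesheet that turns it into property stubs as plain text.
TRANSFORM_XSL = """\
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/entities">
    <xsl:for-each select="entity">
      <xsl:text>public class </xsl:text><xsl:value-of select="@name"/><xsl:text> {&#10;</xsl:text>
      <xsl:for-each select="field">
        <xsl:text>    public </xsl:text><xsl:value-of select="@type"/>
        <xsl:text> </xsl:text><xsl:value-of select="@name"/>
        <xsl:text> { get; set; }&#10;</xsl:text>
      </xsl:for-each>
      <xsl:text>}&#10;</xsl:text>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>
"""

transform = etree.XSLT(etree.XML(TRANSFORM_XSL.encode()))
result = transform(etree.XML(DICTIONARY_XML.encode()))
print(str(result))
```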

So it’s true… code generation is not for every project, but any project that has a large data dictionary (that might need to change midstream) is an immediate candidate in my mind. It’s especially great for user interfaces, database schemas and access layers, and even a lot of transform code, among others.

It’s definitely a thought worth considering.

Best Practice: Cloud Computing

Red sky at morning, sailor take warning.

Here’s a forecast: clouds are rolling in. Architecting for cloud computing will, very soon, become a conscious best practice.

There are lots of handy objections to cloud computing: regulatory compliance, geographic containment requirements, taxes, liability, vendor lock-in, and lack of standards. Many are brushing off cloud technologies as a result, and maybe rightly so… for about another minute, anyway.

Last year, I was involved in a client’s effort to re-provision an application from an in-house infrastructure to a SaaS vendor. All told, the effort was risky and enormous. The administration of it took a year. It took a team of talented engineers from several different companies over six months to implement the transfer. When it was done, everyone breathed a sigh of relief.

The amazing part was that it wasn’t about changing applications. It was just changing who hosted the application. Simply put, no one had the foresight to architect for a transition of this nature, and so the ROI was heavily diluted.

Market fluctuations, re-focused specialization, business units changing hands, economic right-sizing, disaster recovery: there are many reasons agile infrastructures can be useful. Cloud computing technology is evolving quickly and has the very real potential to offer agility at a dramatically lower cost, if you’re prepared to leverage it. You don’t have to go into the cloud to see what you might gain from it. The important part is preparing for it so you can use it when it makes sense for you. And you could even go green at the same time.
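
One concrete way to prepare is to keep “who hosts it” behind a seam your application controls. The sketch below is hypothetical (the interface, names, and backends are illustrative, not a prescription), but it shows the kind of abstraction that turns re-provisioning into a configuration change rather than a year-long project:

```python
# Hypothetical sketch: isolate "who hosts it" behind an interface so the
# application doesn't care whether storage lives in-house or in the cloud.
from abc import ABC, abstractmethod
from pathlib import Path


class BlobStore(ABC):
    """The seam the application depends on, instead of a specific host."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class LocalBlobStore(BlobStore):
    """In-house implementation: plain files on a local or SAN path."""

    def __init__(self, root: str) -> None:
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def put(self, key: str, data: bytes) -> None:
        path = self.root / key
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_bytes(data)

    def get(self, key: str) -> bytes:
        return (self.root / key).read_bytes()


class CloudBlobStore(BlobStore):
    """Cloud implementation: wire up your provider's SDK here. Left as a
    stub because the provider and its API are exactly the variable part."""

    def put(self, key: str, data: bytes) -> None:
        raise NotImplementedError("call the provider's object storage API here")

    def get(self, key: str) -> bytes:
        raise NotImplementedError("call the provider's object storage API here")


def save_invoice(store: BlobStore, invoice_id: str, pdf_bytes: bytes) -> None:
    """Application code only ever sees the BlobStore interface, so changing
    hosts means swapping one constructor call, not a migration project."""
    store.put(f"invoices/{invoice_id}.pdf", pdf_bytes)


if __name__ == "__main__":
    store = LocalBlobStore("/tmp/blobs")  # swap for CloudBlobStore when it makes sense
    save_invoice(store, "2012-0001", b"%PDF-1.4 ...")
```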

I won’t try to predict what your organization has to gain by architecting around cloud technology. It’s more about what your organization is at risk of losing if you don’t.