Software Engineering is all about Modelling


There is a great article over at O’Reilly entitled “Striking parallels between mathematics and software engineering”.  I had never really thought about the parallels between Mathematics and Software Engineering.  I’ve thought about Civil Engineering, Medicine, and the Law; but not Mathematics.  To summarize, the author argues that Mathematics is really about modelling, and modelling is what we do continually in software engineering, especially when following object-oriented paradigms.  It is striking and has opened my eyes to a whole other avenue to explore when it comes to Software Engineering.  Just thought I’d share :)
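To make the parallel concrete, here is a tiny sketch of my own (not from the article): the mathematical idea of a rational number, modelled directly as a C# type, the same way a mathematician defines it as a pair of integers with rules for combining them.

```csharp
using System;

// Illustrative only: a rational number a/b modelled as a type.
// The class encodes the mathematical definition directly.
public class Fraction
{
    public int Numerator { get; private set; }
    public int Denominator { get; private set; }

    public Fraction(int numerator, int denominator)
    {
        if (denominator == 0)
            throw new ArgumentException("Denominator cannot be zero.");
        Numerator = numerator;
        Denominator = denominator;
    }

    // a/b + c/d = (ad + cb) / bd -- the model and the math coincide.
    public static Fraction operator +(Fraction left, Fraction right)
    {
        return new Fraction(
            left.Numerator * right.Denominator + right.Numerator * left.Denominator,
            left.Denominator * right.Denominator);
    }

    public override string ToString()
    {
        return Numerator + "/" + Denominator;
    }
}
```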

Turing Software


I have founded a new Software Engineering company, Turing Software, LLC.  Please head over there to obtain services such as consulting, training, and custom application development.  I’m also going to start blogging over there, so if you enjoy these articles please continue to read them over there.  I’ll still post here, but it will be more personal in nature and probably less frequent.  Thanks for reading my blog!

www.TuringSoftware.net

Azure Development Virtual Machines


So I tried the new Visual Studio virtual machines on Azure.  I’ve always dreamed of doing my development on a VM, but never really trusted it because Visual Studio (VS) is such a performance hog.  Well, here are my results.  I downloaded “ImageResizer” from CodePlex, a popular C# program, and then built it on my local machine and on the Visual Studio VM.  My local machine runs 64-bit Windows 8.1 Pro with an Intel i5 4670K CPU @ 3.4 GHz and 8 GB of RAM, running VS Ultimate 2012.  The Azure VM has an AMD Opteron Processor 4171 HE @ 2.10 GHz and 3.5 GB of RAM on 64-bit Windows Server 2012 R2 Datacenter, running VS Pro 14 CTP (the latest and greatest).
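For anyone who wants to try a similar comparison, here is one rough way to time a rebuild from C#.  The solution path is a placeholder, and msbuild.exe needs to be on your PATH (e.g., in a Developer Command Prompt); this is a sketch, not the exact procedure I used.

```csharp
using System;
using System.Diagnostics;

class BuildTimer
{
    static void Main()
    {
        // Placeholder path -- substitute the solution you want to build.
        const string solution = @"C:\src\ImageResizer\ImageResizer.sln";

        var startInfo = new ProcessStartInfo("msbuild.exe", "\"" + solution + "\" /t:Rebuild")
        {
            UseShellExecute = false
        };

        var timer = Stopwatch.StartNew();
        using (var build = Process.Start(startInfo))
        {
            build.WaitForExit();
        }
        timer.Stop();

        Console.WriteLine("Build took {0:F1} seconds", timer.Elapsed.TotalSeconds);
    }
}
```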

Now, the results.

My local machine built it in ~1.1 seconds.  

The VM built it in ~3.1 seconds.

A factor of about 3.  Not great, but not that bad either.  I could see myself doing it, maybe… there are lots of advantages (a clean machine, always running the latest and greatest, etc.).  But it still feels like it’s on the cusp of prime time.

Latest ACM Turing Award Winner Announced!


Leslie Lamport, a Principal Researcher at Microsoft Research, has been announced by the Association for Computing Machinery (ACM) as the winner of the 2013 Turing Award.  The Turing Award is the equivalent of the Nobel Prize in computing.  I always look at these prize winners as “gods” of the computing world, and I think it’s important that we remember and honor our history if we are to progress as a profession.  He won it for his work in distributed computing, which I can tell you from my graduate course is some very difficult stuff.  He also created LaTeX, among other things.  Congratulations!
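To give a flavor of that work, one of his best-known ideas is the logical clock: a simple counter rule that lets machines with no shared clock agree on an ordering of events.  Here is a toy sketch of the idea (my own illustration, not Lamport’s code):

```csharp
using System;

// A toy illustration of Lamport's logical clocks: each process keeps a
// counter, increments it on every local event, stamps outgoing messages
// with it, and on receive jumps ahead of the sender's stamp.
class LamportClock
{
    private long _time;

    // Called for any local event, including sending a message.
    public long Tick()
    {
        return ++_time;
    }

    // Called when a message stamped with the sender's clock arrives.
    public long Receive(long senderTime)
    {
        _time = Math.Max(_time, senderTime) + 1;
        return _time;
    }
}

class Demo
{
    static void Main()
    {
        var processA = new LamportClock();
        var processB = new LamportClock();

        long sent = processA.Tick();            // A sends at logical time 1
        long received = processB.Receive(sent); // B receives at logical time 2
        Console.WriteLine("sent at {0}, received at {1}", sent, received);
    }
}
```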

Official News Release

What is Software Engineering in the 21st Century?


In February of 2001, 17 people met at the Snowbird resort in Utah to discuss how to move the Software Engineering discipline forward.  They were frustrated that their more lightweight, adaptive methods were not being tried while heavier methodologies continued to fail with over-budget and behind-schedule projects.  They correctly surmised that a revolution would be needed to make the Software Engineering community hear and, more importantly, implement their solutions.

They could have left in disagreement and disarray over non-essential questions like: What’s better, Scrum or Feature Driven Development?  Fortunately for the Software Engineering field, they didn’t.  They wrote a Manifesto which is shown below:

We are uncovering better ways of developing software by doing it and helping others do it.
Through this work we have come to value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more.

Their Manifesto worked beyond their wildest dreams!  Like any revolution, there was resistance, especially from the establishment.  But after 10 years, it was clear that they had won the war.  The establishment could not resist any longer, and all discussions of Software Engineering now have to include Agile practices.

After the American Revolution, the American founding fathers were faced with a whole new set of problems that were in many respects bigger and more complex than winning the revolution in the first place.  The Agile Revolution finds itself in a similar situation today.  Questions such as “What is the best way to scale Agile to an entire enterprise?” are what need to be answered now.  The Agile movement would be wise not to throw away everything that came before it in Software Engineering, but rather to mix and modify it to set our discipline on a new course.  This is what the Americans did when they took the concept of a king, mixed it with democracy, and came up with the modern presidency.  I’m looking forward to the next 10 years to see where this mixture leads us!

Why is Healthcare.gov not working? (from a technical perspective)


Most web applications have an architecture like the one sketched below.  There are of course nuances and exceptions, but for the layperson, this will suffice.
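Here is a minimal C# sketch of the three layers discussed below.  Every name in it (IApplicantRepository, EligibilityService, the income cutoff) is invented for illustration; none of this is Healthcare.gov’s actual code.

```csharp
// Illustrative only -- every name here is invented, not Healthcare.gov's code.

// Data Access Layer / Data Source: the only code that touches stored data.
public interface IApplicantRepository
{
    Applicant FindById(string applicantId);
}

public class Applicant
{
    public string Id { get; set; }
    public decimal Income { get; set; }
}

// Business Logic layer: applies the rules (and, in the real system,
// coordinates external services such as the SSA and IRS).
public class EligibilityService
{
    private readonly IApplicantRepository _repository;

    public EligibilityService(IApplicantRepository repository)
    {
        _repository = repository;
    }

    public bool IsEligible(string applicantId)
    {
        Applicant applicant = _repository.FindById(applicantId);
        return applicant != null && applicant.Income < 50000m; // invented rule
    }
}

// Presentation Layer: packages the answer as a web page for the user.
public class EligibilityPage
{
    private readonly EligibilityService _service;

    public EligibilityPage(EligibilityService service)
    {
        _service = service;
    }

    public string Render(string applicantId)
    {
        return _service.IsEligible(applicantId)
            ? "<p>You may qualify for coverage.</p>"
            : "<p>We could not verify your eligibility.</p>";
    }
}
```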

The “Presentation Layer” handles all the graphical packaging of content in the web pages presented back to the user.  This article in The Atlantic has a good description of the Presentation Layer for Healthcare.gov.  This is definitely NOT the problem, as pages of plain content come back very fast without issues; 90% of the links in the Healthcare.gov sitemap come back in under a second.  BUT, a web application with just a good Presentation Layer is like a book with a nice cover design and nice pictures inside; no one will care if the text is not good.  The rest of the Healthcare.gov web application is the text, and it seems to be horribly bad at the moment.

The next part of the system is the “Business Logic” layer.  In Healthcare.gov this is called the “Data Hub” and is described here.  There is a tremendous amount of coordination between different web services (Social Security Administration, IRS, insurers, etc.) to make sure you get the insurance you’re entitled to.  Unfortunately, this is where the software engineers for Healthcare.gov have the least control over what happens, because they are dependent on these other services to relay data back to them quickly.
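To make that coordination problem concrete, here is a toy sketch of the fan-out pattern the Data Hub has to perform.  The class name and URLs are invented for illustration; they are not the real endpoints.

```csharp
using System.Net.Http;
using System.Threading.Tasks;

// Illustrative only: fanning out to several external services and
// waiting for all of them before answering the user.
public class DataHubClient
{
    private static readonly HttpClient Http = new HttpClient();

    public static async Task<string[]> VerifyApplicantAsync(string applicantId)
    {
        // Fire the verification requests concurrently...
        Task<string> ssa = Http.GetStringAsync("https://example.gov/ssa/" + applicantId);
        Task<string> irs = Http.GetStringAsync("https://example.gov/irs/" + applicantId);
        Task<string> insurer = Http.GetStringAsync("https://example.gov/insurer/" + applicantId);

        // ...but the user cannot get an answer until the slowest one
        // replies, so one slow partner service slows the whole site.
        return await Task.WhenAll(ssa, irs, insurer);
    }
}
```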

Finally, we have the “Data Access Layer” and “Data Source”.  This is where all the data is stored (e.g. your name, address, age, etc.).  The data that Healthcare.gov has to collect and then connect to other relevant pieces is tremendously complex and it is very possible that many of the problems lie here as well.  Fortunately, this is one place where you can “throw more servers at the problem” to alleviate performance problems somewhat.

While the answer to why Healthcare.gov is failing is not entirely clear, I hope you have gained an appreciation for the complexity of this very important web application and how one problem in any of its parts can slow the whole application down.  Unfortunately, I predict that many of these problems will not be fixed quickly because of their logical complexity.  Throwing servers at the system will only alleviate a small percentage of the problems and ultimately is no substitute for quality software.  Throwing more people at the problem violates one of the few laws we have in Software Engineering, Brooks’s Law: “adding manpower to a late software project makes it later”.

So, What is Requirements Work?


Interesting article from the Spring issue of IEEE Software entitled “So, What is Requirements Work?”.  The author, an obvious expert in the field, concludes that Requirements Work is basically helping people help themselves.  I like this definition.  As a Requirements Analyst, you cannot know every single domain you will be parachuted into as your career progresses.  No one can be an expert in Medicine, Law, Accounting, Botany, etc.  Instead, “smart” Requirements Analysts rely on the domain experts that already exist to guide them to the problems that need to be solved.  Then they can draw the specifications of those problems out of the domain experts and stakeholders.  Finally, the Requirements Analyst can express those requirements in formats that are decipherable by multiple stakeholders (developers, customers, etc.).

I’m not sure I agree that Requirements Work involves coming up with solutions, though.  It seems to violate a central tenet of Requirements as it was taught to me: “Concentrate on the problem, not the solution.”  But I’ll admit that, as a former software developer, I can’t help but think of solutions when I hear problems.  I have to be judicious, though, about when I bring them up.

Check the article out yourself when you get the chance!
