Differentiating Between the Cost and Value of SharePoint

In class today, we had a heated debate regarding trustworthy, commercial collaborative software. Not surprisingly, I took the side of SharePoint and joined the group preparing the argument for the Microsoft stack (it was a small-group breakout session). Interestingly, the debate ended in a stalemate after both sides presented their cases, with each side agreeing that the other stack has its inherent benefits (it’s god damn hard to come up with a compelling argument against the benefits of Open Source) and its innate faults. From that preliminary argument, however, a new dialogue was produced: regardless of the stack either group was defending, management-level sponsors of collaborative environment efforts typically have difficulty pinning down the value of standing up a collaborative software instance.

The celebrated dramatist Oscar Wilde once said:

A cynic is a man who knows the price of everything and the value of nothing.

This quote means numerous things to me (since it is so open to interpretation), but none more exceptional than the simplest reading. A person can usually put an empirical cost on an arbitrary object, since cost is, well, pragmatic. Getting at something less tangible, such as Realized Business Value (RBV) or Return On Investment (ROI) for a collaborative effort, is remarkably tricky. Although cost is certainly going to be a parameter when generating your Realized Business Value Docket (RBVD), it is only that: one piece of input in the presence of other inputs that hold more weight (this is one of the strengths of RBV, since it provides a basic level of cost abstraction, and for those living in the small-to-medium vertical, the purchase of SharePoint can be quite expensive).

I want to stress that, in my opinion, measuring things like monetized ROI on collaborative software efforts using legacy business value approaches is a façade, a mirage, an illusion (this might not apply to some of its subsets). Just don’t even try it; it’s self-defeating and will only provide a depressing, dismal night where you are fudging numbers more than harvesting anything from the actual project.

This isn’t to say that you can’t derive value from output metrics on collaborative software efforts, but it is problematic to present your findings to management, since they can prove abstract and often contain a buttload of inference. At the most basic level, a manager expects to acquire an object for a cost from someone, somewhere, and from those parameters, plus insight into the owning organization, an ROI can be determined. In the end, managers tend to be good at looking at a particular project through a pair of monetization spectacles, compensating for human factors like apathy toward technology. Taking this monetizing approach when attempting to appreciate the value of SharePoint, or of collaborative software in general, is not practical, however.

Why? The outputs are different. Plain and simple. Whereas with traditional software development efforts we can survey project results as described previously, this is unquestionably not the case with SharePoint. Collaborative software has outputs such as:

Augmented Information Worker Knowledge
Improved Employee Productivity
Enhanced Customer Service

And while I can name these easily, and there may be methods to garner metrics from some of them (notably the Enhanced Customer Service entry), how does one place a value on something like augmented information worker knowledge? While we can certainly see whether productivity varies, this reporting is in essence circumstantial, because we are compiling a collection of facts rather than a conclusive series. Too many factors could introduce discrepancies into any of these measures, as well as into the aggregate output.

So, measurement like this is somewhat conjectural at the moment. I think the best way to measure something like this would be rolling, time-adjusted monitoring of collaborative software utilization growth, paired with visualization of the stored and delivered data. However, I don’t think that is enough, because that is only a very small part of the measurement. It would also have to include the more human factors, such as how often a person is involved in face-to-face interactions, movement patterns in the workspace, etc. This type of research would be even further constrained because it would have to compensate for dependencies on legacy collaborative tools, such as email. I am not going to go into the approach I would use here; I will save that for another post.

I guess what to take away from this, particularly if you are a manager, is that this is a new concept requiring a new type of recognition. And while the benefits might not be immediately visible to conventional brick-and-mortar industries, they are there. There just haven’t been any mature, intelligent tools built that comprehensively deliver those types of reports.

Yet… (and the gears start turning). Stay tuned!
:)


1 Step To Better SharePoint Project Management – Say No

There is one thing I consistently observe about project managers in relation to their project success rate, even though my project management observations habitually occur from the SharePoint side. It seems, without fail, that the cumulative number of successful projects a PM owns correlates with the maturity of their ability to elegantly push clients back from making decisions that appear unhealthy. In other words, they can look a client in the face and, while thinking yeah… that’s actually bullshit and we’re not going to do it, spin the "no" response so that it doesn’t come across so negative. In fact, they can get a client downright excited about getting turned down for a feature that’s bubbled up, essentially acknowledging that the client requirement makes sense, hell, it’s a great idea, but that it increases risk and cost right now. Coupled with this distinct skill, these project managers can often also step back from a client-given requirement and gauge the most apposite response, regardless of the size of the decision. While at first glance the cost of immediately accepting such poor input from the client is not instantly evident (it is somewhat masked by the requirements timeline), even smaller decisions, when coupled together, can swiftly disassemble what would otherwise prove to be a successful project.

The inability to say no can easily produce chaos in any type of project, regardless of industry and target; however, it is especially prevalent within software development projects, and more specifically within SharePoint projects (since project management on SharePoint will often, like many software products, mutate into some type of Software Product Strategy [SPS]). The reason scope creep and requirements overflow are so common with collaboration systems is that their very nature is to touch and enable so many information workers. Once people get a taste of the functionality you are offering them, the recommendations start coming in on how to enhance the system. Those recommendations might get the right management ear and become requirements, which tend to be out of the original project scope. Since internal customer management might not be technically experienced, and is not required to be for their job role, it is often difficult for them to put a filter in place that would otherwise sieve through those requirements. Those requirements are then characteristically shotgunned at the consultants.

I am guilty of this as well; it’s not just an internal problem on the client side. I often bring up additions that I am certain will greatly benefit the entire SharePoint environment. A good SharePoint project manager, though, while recognizing that I might have good ideas for the platform, knows that meeting the baseline requirements is what pays the bills at the end of the day.

I am in no way proposing that every requirement that comes down the pipeline should be immediately rejected. That would hinder and constrain the overall project deliverables, while making you appear rigid to both the client and your internal project team. I have, however, known and worked for companies that handled requirements as they came down with instantaneous dismissal, since that is admittedly the safest route to take. On the other side of the coin, you have requirements fanatics, whom I hate working with even more than the former zealots. These personalities become aware of a requirement and it becomes holy doctrine for the rest of the team; they tend to be too god damn excited about delivering such random, unmeasured features to the client. Their passion for developing components for the customer is often dampened by the failure that ensues when they become overwhelmed with out-of-context actions and mismanage basic organizational tasks.

Some of the reasons that a project manager, in the context of SharePoint, should say no are:

1) While this feature will extend our SharePoint environment, it is more of a nice-to-have, and not really a baseline requirement for this project.

2) The calculated effort required from the development and/or operations staff doesn’t justify producing the feature.

3) Under the current contract constraints, the subproject doesn’t provide a practical baseline for reallocation (or allocation) of resources to make it a tangible deliverable.

4) While this feature is awesome from the development or operations end, it is something the client doesn’t immediately realize the need for, or will never even notice.

5) Holy crap, this is so experimental it doesn’t even qualify, and that isn’t funny (kind of like when a client asked me to build neural networks for forecasting models off SharePoint lists; I probably should have nixed that one).

So what to do? It’s all about promoting and maintaining a careful balance, supplemented with a project manager’s largest tool: a buttload of documentation. In this case, the particular documentation that becomes your best friend is all that good stuff involved in the Change Control Process. I don’t mean go download and implement 50 templates that almost certainly include a bunch of fluff documents not required for your project. Each project is different, and while Change Control Processes often prescribe an orthodox set of documentation templates, that doesn’t mean the whole set suddenly becomes project creed. The Change Control Process should be treated as a supplement to the contract, placing both the client and the consultant in a position to be subject to it. While it is inflexible in this regard, it ensures that requirements acknowledged and accounted for in the project plan are readily available to the client, so that you can guarantee familiarity with the current project state, even while taking in the changes that may have come down the pipeline.

All I am attempting to emphasize through this, I suppose, is that it is important for a project manager to be able to say no to a client. This is particularly evident in SharePoint projects. While you shouldn’t say no to everything, the client is going to be a lot more partial to your organization when you deliver a complete project with all the contracted features, rather than what is essentially a failed project with several half-baked features.


The Degradation of Empirical Software Development Management Techniques

Today I went to the last lecture of my foundational MBA course at USC (actually, I am still in the lecture hall as I write this). The title of the lecture sounded intimidating: The Degradation of Empirical Software Development Management Techniques. Yikes. However extensive and unapproachable the title made the talk sound, it actually focused on one major issue: as software development techniques are birthed and evolved, there has been a distinct de-emphasis on pragmatic metrics gathering. More and more, it seems people are concerned with the rhetoric and semantics revolving around an SDLC, rather than focusing on managing the SDLC and building practical reports from its garnered aggregate output. This loss is epic, since it will result in the same mistakes being made twice.

I guess I can kind of see this happening in my own industry around SharePoint development. There has been so much stress placed on implementing one of the "contemporary" SDLC techniques like Agile and Scrum that people simply enter an auto-cliché mode once one is decided upon. Many companies with no previous development experience are somewhat forced to adopt them; they never needed such rubbish before because they had no need for custom software in general. However, SharePoint as a collaboration platform can often work its way into such companies, because information worker collaboration is a fairly consistent enterprise need regardless of industry. As a side note, I actually don’t know whether these are considered contemporary techniques, ergo the quotation marks around the term; in comparison to the waterfall methodologies that seem to be ingrained into management brains, I suppose they are, so I will make that statement.

While I bring this point up, I am reminded of a short rant I had with a fellow MVP, and this seems a quasi-proper place to insert it. In the realm of all these new processes, we see even more to-the-metal software development movements, such as ALT.NET. Quite honestly, I still follow my own paradigm, DBAGDI: Don’t Be a God Damn Idiot. In this methodology, you program how you want and deliver what your client expects of you. You solve a business requirement and move on. While some of these ideas and concepts spark interesting debate whose output might result in a new way of doing things, and that’s neither here nor there, whatever happened to just programming well and within what a client expects, you know, being that metal-bender programming guy who gets it done? Every day I swear I hear something new: ALT.NET, blah, blah, blah. I mean, I work on a military installation; all I do all day is hear acronyms. Then, when I sit down for the therapeutic development time that I enjoy so much I made a fricking career out of it, more acronyms. Invading my space. Making me all sorts of mad. But ah, I digress from my original point, which was tiered more around the project management space. Thanks for hanging on through that.

Back to the project management end. While embracing and applying these SDLCs in an organization, there is sometimes a distinct loss of what are often required project management attributes. What’s the biggest one? Well, Earned Value Management (EVM), of course!

I am going to fly through this, but the work by Fleming and Koppelman pretty much defined EVM (it was in late 1998 if my notes serve me correctly; the professor wasn’t a historian :) ). EVM is a simple technique revolving around basic arithmetic, and it likewise provides uncomplicated, concrete metric output. At a mathematical level, EVM is fairly easy to define; it is composed of several smaller formulas, which are bloated enough that I will cover how to simplify them in a separate post.

Stepping back, let’s take a very basic, simplified look at the Agile SDLC and pull some of the major points out of it:

Agile is composed of short iterations, as opposed to large ones

Agile team members are self-organizing, and the software development process generally includes the entire team, as opposed to chess-piece management processes

Agile is adaptive and expects change

Agile focuses on the highest business value first

Agile lends itself well to Test Driven Development and Continuous Integration

There is a lot more, but you get the idea; there are more resources than you can shake a stick at regarding Agile methodologies, so I don’t want to cover them all. The question remains: how do we integrate an Agile framework with some of these more formal project performance metric outputs? Furthermore, how can I complement an adaptive SDLC that is an organizational preference with PM attributes that have been proven to provide concrete results?

It’s actually not that hard. Earned Value Management revolves around the Earned Value (Actual Percentage Complete times Total Budget, simply APC% * TB = EV) and the Planned Value (Expected Percentage Complete times Total Budget, simply EPC% * TB = PV). In other words, the Earned Value measures the technical output you have actually delivered, while the Planned Value measures the technical output you expected to have delivered by a specific date.
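
To make the arithmetic concrete, here is a minimal sketch of just those two formulas in plain Python (the function and parameter names are my own illustrative choices, not terms from the lecture):

def earned_value(actual_pct_complete, total_budget):
    # EV = APC% * TB: the value of the work actually completed to date.
    return actual_pct_complete * total_budget

def planned_value(expected_pct_complete, total_budget):
    # PV = EPC% * TB: the value of the work we expected to complete by this date.
    return expected_pct_complete * total_budget

# With the $100,000 budget used below, 60% actually complete against
# 33.3% expected complete:
print(earned_value(0.60, 100_000))    # 60000.0
print(planned_value(0.333, 100_000))  # 33300.0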

Now, this part is important, because managers think about one thing: dollars and cents. A lot of this value nonsense is, in essence, intangible (I say technical output because I didn’t note the exact term the professor used). You can’t put it into monetary terms, and garnering business value and trying to equate it that way is kind of self-defeating and, in essence, fictitious.

Now, let’s put these in some SharePoint terms to make some sense of it :) SharePoint always makes things less complicated (sarcasm intended).

So, we have a typical SharePoint rollout at a small-to-medium enterprise whose initial project budget is $100,000 (hey, as consultants we always pad the cost a lil). Let’s not overcomplicate it and introduce varying development tasks and other nonsense.

In the Agile framework, we would separate this into short iterations, though for the sake of the example these are going to be a little broad. So, we are just installing SharePoint, then provisioning out some architected collections.

SharePoint Task                  Estimate (iterations)   Completed (iterations)   AC (Actual Cost, $)
Install SharePoint               5                       5                        5,000
Provision Initial Collections    10                      10                       10,000
Architect Site Structure         10                      0                        0
Metrics (totals)                 25                      15                       15,000

So, we have some of the basics down. Now, let’s calculate some project health indicators, which can later be fed into higher-level indicators such as a Cost Performance Index (CPI).

First, let’s start with the Estimated Percentage Complete. We are going to use our current iteration metric as the argument, even though there could have been iterations before or after this index. So, I am going to start with the installation iteration and see where we are at.

EPC (Estimated Percentage Complete) = CI (Completed Iterations) / TI (Total Iterations) = 5 / 15 = 33.3%

PV (Planned Value) = EPC (Estimated Percentage Complete) * TB (Total Budget) = 33.3% * 100,000 = 33,300

APC (Actual Percentage Complete) = TIC (Total Iterations Completed) / TIP (Total Iterations Planned) = 15 / 25 = 60%

EV (Earned Value) = APC (Actual Percentage Complete) * TB (Total Budget) = 60% * 100,000 = 60,000

Now, even though we are only performing basic arithmetic operations, we can already tell several things about our project. Most importantly, you are awesome, because your Earned Value is above your Planned Value. You rock! If your Earned Value were less than your Planned Value, you would be in trouble.
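
To tie the walkthrough together, here is a minimal sketch of the same arithmetic in plain Python. The figures come straight from the table above; the CPI at the end is my own addition, using the standard EVM definition (EV / AC) against the table’s actual cost total:

# Figures from the worked example above.
TOTAL_BUDGET = 100_000      # initial project budget ($)
EXPECTED_COMPLETE = 5 / 15  # EPC: expected percentage complete (~33.3%)
ACTUAL_COMPLETE = 15 / 25   # APC: 15 of 25 planned iterations done (60%)
ACTUAL_COST = 15_000        # AC, from the totals row of the table ($)

planned_value = EXPECTED_COMPLETE * TOTAL_BUDGET  # PV ~= 33,333 (the prose rounds 33.3% to 33,300)
earned_value = ACTUAL_COMPLETE * TOTAL_BUDGET     # EV = 60,000

print(f"PV = {planned_value:,.0f}, EV = {earned_value:,.0f}")

# EV above PV means you are ahead of plan; EV below PV means trouble.
print("You rock!" if earned_value >= planned_value else "You are in trouble.")

# The Cost Performance Index mentioned earlier: EV / AC, where > 1.0 is under budget.
print(f"CPI = {earned_value / ACTUAL_COST:.2f}")  # 4.00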

Well, I think that basically covers what we were discussing as far as empirical metrics are concerned. The point I am trying to drive home is that implementing an SDLC is not a nice-to-have on a SharePoint project; it is required. And while that may be the case, it does not discount the tried-and-true project metric harvesting methods that have been around since the dawn of man. While producing client deliverables is always the focus, generating valid project metrics can help you manage your project better, and it can make sure that iteration problems you have on one project don’t get repeated on another.

Whew :)
