1 Step To Better SharePoint Project Management – Say No

There is one thing I consistently observe about project managers in relation to their project success rate, even though my observations usually come from the SharePoint side. Without fail, the number of successful projects a PM owns correlates with how elegantly they can push clients back from decisions that look unhealthy. In other words, they can look a client in the face, and while thinking yeah….that’s actually bullshit and we’re not going to do it, spin the “no” so that it doesn’t come out so negative. In fact, they can get a client downright excited about being turned down for a feature that’s bubbled up, by acknowledging that the requirement makes sense, hell, it’s a great idea, but it increases risk and cost right now. Coupled with this skill, these project managers can also step back from a client-given requirement and gauge the most appropriate response, regardless of the size of the decision. While the cost of immediately accepting poor client input is not instantly evident (it’s somewhat masked by the requirements timeline), even small decisions, taken together, can swiftly dismantle what would otherwise be a successful project.

The inability to say no can easily produce chaos on any type of project, regardless of industry and target; it is, however, especially prevalent on software development projects, and more specifically on SharePoint projects (since project management on SharePoint will often, as with several software products, mutate into some type of Software Product Strategy [SPS]). The reason scope creep and requirements overflow are so common with collaboration systems is that their very nature is to touch and enable so many information workers. Once people get a taste of the functionality you are offering them, the recommendations start to come in on how to enhance the system. The recommendations get the right management ear, and then they become requirements, which tend to be outside the original project scope. Since internal customer management might not be technically experienced (nor required to be for their job role), it is often difficult for them to put a filter in place that would otherwise sieve through those requirements. So those requirements are typically shotgunned at the consultants.

I am guilty of this as well; it’s not just an internal problem on the client side. I will often bring up additions that I am certain will greatly benefit the entire SharePoint environment. A good SharePoint project manager, though, while recognizing that I might have good ideas for the platform, knows that meeting the baseline requirements is what pays the bills at the end of the day.

I am in no way proposing that every requirement that comes down the pipeline should be immediately rejected. That would hinder and constrain the overall project deliverables, while making you appear rigid to both the client and your internal project team. I have, however, known and worked for companies that handled incoming requirements with instantaneous dismissal, since that is admittedly the safest route to take. On the other side of the coin, you have requirements fanatics, whom I hate working with even more than the former zealots. These personalities become aware of a requirement and it becomes holy doctrine for the rest of the team; they tend to be too goddamn excited about delivering such random, unmeasured features to the client. Their passion for developing components for the customer is often dampened by the failure that ensues from mismanaging basic organizational tasks once they become overwhelmed with out-of-context work.

Some of the reasons that a project manager, in the context of SharePoint, should say no are:

1) While this feature would extend our SharePoint environment, it is more of a nice-to-have, and not really a baseline requirement for this project

2) The estimated effort from the development and/or operations staff doesn’t justify building the feature

3) The subproject, under the current contract constraints, doesn’t provide a practical basis for allocating (or reallocating) the resources needed to actually produce it

4) While this feature is awesome from the development or operations end, it is something the client doesn’t immediately realize the need for, or will never notice

5) Holy crap, this doesn’t even qualify because it is so experimental it isn’t funny (kind of like when a client asked me to build neural networks for forecasting models off SharePoint lists. Probably should have nixed that one).

So what to do? It’s all about promoting and maintaining a careful balance, supplemented with a project manager’s largest tool: a butt load of documentation. In this case, the particular documentation that becomes your best friend is everything involved in the Change Control Process. I don’t mean go download and implement 50 templates that almost certainly include a bunch of fluff documents not required for your project. Each project is different, and while Change Control Processes often prescribe an orthodox set of documentation templates, that doesn’t mean the whole set suddenly becomes project creed. The Change Control Process should be treated as a supplement to the contract, making both the client and the consultant subject to it. While it is inflexible in this regard, it ensures that the requirements acknowledged and accounted for in the project plan are readily available to the client, so you can guarantee familiarity with the current project state, even while absorbing the changes that come down the pipeline.

All I am attempting to emphasize, I suppose, is that it is important for a project manager to say no to a client. This is particularly true on SharePoint projects. While you shouldn’t say no to everything, the client is going to favor your organization a lot more when you deliver a complete project with all the contracted features than when you hand over what is essentially a failed project with several half-baked features.


SharePoint Back-Propagation Neural Network Problem

Yeah, I know what you are thinking, but I’m not full of shit, and I know I often take SharePoint to levels it probably shouldn’t be taken to, but whatever. This is actually a side project I am working on that aggregates several sets of data into a forecasting-model type of environment, since SharePoint lends itself pretty well to the data aggregation part, and partially well to the data mining part; at least it exposes, through the API, the objects that would otherwise be required to do it.

Ok, so for people who haven’t worked with AI before, here’s the highest-level introduction possible…

There are basically two types of artificial intelligence: weak artificial intelligence and strong artificial intelligence. Weak artificial intelligence doesn’t really have the capability to evolve, so it can be argued whether it qualifies as AI at all. It doesn’t really constitute a pattern that mimics human behavior and the concept of evolved choice, but instead relies on clever programming and raw computing power to represent behavior that may be considered “human”.

On the other hand, there is the concept of strong artificial intelligence, which is a lot different, since it implies that the behavior and choice patterns of humans can be logically represented. In essence, your patterned programming becomes representative of the human mind. I haven’t really seen anything in practice that has done this, but in theory this is what an expert system targeting a business application, something like SharePoint, should adhere to; weak AI, however, might be a stepping stone toward it.

Regardless, if SharePoint, as a primary business application platform, were coupled with an AI system, it would draw on three main concepts:

Expert Systems

Neural Networks (or Artificial Neural Networks [ANN])

Evolutionary Algorithms

OK, so there are several parts and concepts that make it up. The problem I was running into was building a Back-Propagation Neural Network; if I can get the rudimentary concept to work, I plan on extending it to hopefully work with Dynamic Link Matching (neuronal modeling), which is my real interest. What’s this? Well, I am not very adept at its concepts, but I have studied it for a wee bit; it is basically how one could theoretically use pre-defined neural systems for the recognition of external objects, which is neato cheato.

Dynamic link matching is one of the most robust mechanisms known in the realm of physical pattern recognition (or, in a broader sense, translation-invariant object recognition), as it doesn’t leave much room for error from distortion of the inputted objects (which generally occurs because expressions change so much during the templating process [also known as topographic mapping] and depth skews). Dynamic link matching is heavily dependent on the concept of wavelets, the Gabor wavelet transform more specifically (which is responsible for the grey-value distributions). The most notable thing about DLM is its low error rate, because it compensates well for depth and deformation within the template scan.
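To make the wavelet talk a little more concrete, here is a quick sketch of a 2-D Gabor kernel in Python (not C#, just to keep it short). This is the textbook cosine-carrier-times-Gaussian form; the function name and parameters are my own and aren’t pulled from any DLM implementation:

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """2-D Gabor filter: a cosine plane wave restricted by a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # rotate the coordinate frame so the carrier wave runs along theta
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t ** 2 + y_t ** 2) / (2.0 * sigma ** 2))
    carrier = np.cos(2.0 * np.pi * x_t / wavelength)
    return envelope * carrier
```

Convolving an image with a bank of these kernels at different orientations and wavelengths is what produces the “jet” of responses at each point that DLM matches against.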

After the template scan has occurred, the fun stuff starts happening.

You can generally see something like a human face (represented by the circular object) with several little dotted nodes across it (the plane the image is mapped onto is a neural sheet of hypercolumns). Each node is representative of a neuron which, going back to the wavelet talk, also has an associated jet value that orchestrates the grey-value distribution.

When the actual matching of the inputted object against the stored template is performed, it leverages network self-organization. I will maybe talk about this in a later post, because I still haven’t gotten to posting my problematic code, which is starting to annoy me.

Anyhoo, I don’t remember what I was writing about now. Oh yeah, Back-Propagation. So I was working on that for a client, and my god, what a pain in the butt getting some of it to work with SharePoint was. My main problem was getting the god damn weights to update correctly. What I finally settled on was this:


[csharp]
private readonly Unit[][] neuralNet;

public double[] neuralData;

// Derivative of the sigmoid activation, expressed in terms of the unit's output
public static double PrimaryDerivationOfActivation(double Argument)
{
    return (Argument * (1 - Argument));
}

protected void UpdateWeights(double learningCount, double influence, double decayRate)
{
    for (int i = 1; i < neuralNet.Length; i++)
    {
        for (int j = 0; j < neuralNet[i].Length; j++)
        {
            Unit unit = neuralNet[i][j];
            foreach (Link link in unit.InputLinks)
            {
                // gradient step (error signal * activation derivative) plus momentum term
                double lr = (((learningCount * link.Source.GetOutput()) * unit.neuralData[0])
                    * PrimaryDerivationOfActivation(unit.GetOutput()))
                    + (influence * unit.neuralData[1]);
                unit.neuralData[1] = lr;
                // apply the delta, then shrink the weight by the decay rate
                link.Weight = (link.Weight + lr) - (decayRate * link.Weight);
            }
        }
    }
}
[/csharp]

Whew, I am glad I finally got the mother to work. Anyways, I will hopefully be releasing the forecasting system if the client is hip to it, and hopefully an API that allows other developers to extend other AI applications into SharePoint to maybe build other applications. Or I may be the only person interested in it. Meh. :)
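If you want to sanity-check the update rule outside of SharePoint and C#, the same math fits in a few lines of Python. Everything here (the names, the sample numbers) is mine for illustration, not part of the client code:

```python
def d_sigmoid_from_output(y):
    # derivative of the logistic activation, written in terms of its output
    return y * (1.0 - y)

def update_weight(weight, prev_delta, eta, momentum, decay,
                  src_output, error_signal, unit_output):
    # one gradient-descent step with momentum and weight decay,
    # mirroring the shape of the C# UpdateWeights loop body
    delta = (eta * src_output * error_signal
             * d_sigmoid_from_output(unit_output)
             + momentum * prev_delta)
    new_weight = (weight + delta) - decay * weight
    return new_weight, delta

# e.g. update_weight(0.5, 0.0, eta=0.25, momentum=0.9, decay=0.01,
#                    src_output=1.0, error_signal=0.2, unit_output=0.8)
# yields a delta of 0.008 and a new weight of 0.503
```

Being able to hand-compute one link like this was how I convinced myself the C# version was finally updating the weights correctly.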


Formation and Elicitation of Knowledge Management


Knowledge Management (KM), from a process perspective, is concerned with the creation, dissemination, and use of knowledge within the company. A well-structured process needs to be in place for managing knowledge successfully. The process can be separated into steps.

Starting with knowledge creation or elicitation, following with its capture and storage, then its transfer and dissemination, and lastly its exploitation, we can trace the various stages of the process.

Creation and Elicitation

Knowledge is created or elicited in reaction to something, as a response to a stimulus of some sort. Thus, knowledge must be created or elicited from resources in order to function as input to the knowledge management process.

In the first scenario, where knowledge must be created, we start with the roots: data. Relevant data must be gathered from various sources, including sales, billing, and transaction collection systems. Once the applicable data is gathered, it must be processed to produce meaningful information. Transaction processing systems take care of this task in many businesses today. Like data, information coming from various sources likewise requires gathering.

One of the important aspects of this technique is vigilance about the information, since it can come from external as well as internal sources. Industry publications, government sources, market surveys, regulations and laws, etc. make up the external sources. The gathered information then requires integration. When all the necessary information is gathered and at our disposal, we start analyzing it for patterns, trends, and associations, generating knowledge as a result.

Knowledge creation tasks can be delegated to dedicated personnel, such as marketing and financial analysts. Alternatively, an organization can employ AI-based computerized techniques for these tasks: genetic algorithms, intelligent agents, and artificial neural networks.

Data mining and knowledge discovery in databases (KDD) refer to the process of extracting valid, previously unknown, and potentially useful patterns and information from raw data in large databases. The analogy behind data mining is sifting through huge amounts of low-grade ore (data) to find something of value. It is a multi-step, iterative, inductive process. The steps include data extraction, problem analysis, data preparation, cleaning, data reduction, output analysis, review, and rule development. Because data mining involves retrospective analysis of data, experimental design falls outside its scope. Data mining and KDD are generally used as synonyms referring to the entire process of evolving from data to knowledge. The goal of data mining is to extract pertinent information from data, with the ultimate goal of discovering knowledge.
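As a toy illustration of just the pattern-discovery step (the preparation, cleaning, and reduction stages are skipped entirely), here is a minimal co-occurrence miner in Python. The function name, data, and threshold are made up for the example:

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_support):
    # one pass of the rule-development step: find item pairs that co-occur
    # in at least `min_support` transactions (a toy association-rule miner)
    counts = Counter()
    for items in transactions:
        for pair in combinations(sorted(set(items)), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

# frequent_pairs([["bread", "milk"], ["bread", "milk", "eggs"], ["milk", "eggs"]], 2)
# surfaces the pairs ("bread", "milk") and ("eggs", "milk"), each with support 2
```

Real data mining tools obviously do far more, but the shape is the same: sift a large pile of raw records and keep only the patterns that clear a significance threshold.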

Knowledge Managing Processing

Knowledge also resides in the minds of employees in the form of know-how. Knowledge residing in the human mind, however, is often in tacit form. Before it can be shared across an organization, it must be converted into explicit form.

An inviting organizational atmosphere is central to knowledge elicitation. Sharing of know-how with colleagues is essential, especially when it can happen without fear of losing personal value or job security. Knowledge Management is all about sharing. People at the workplace are likely to communicate more freely in informal atmospheres, speaking with peers rather than with mandated managers.

Capture and Storage

In order to enable storage and distribution, gathered knowledge must be codified in machine-readable formats. Codification converts explicit knowledge held in paper reports or manuals into electronic documentation, while tacit knowledge is first articulated and then represented in electronic form. The documents require search capabilities in order to facilitate easy retrieval of knowledge.

The notion behind codification is that knowledge can in fact be codified, stored, and reused at a later time. The knowledge is extracted from the person(s) who developed it and made independent of that person, so the information can then be reused for various purposes. This approach lets individuals search for and retrieve knowledge without contacting the original developer of the information.

Codification of knowledge, while beneficial for sharing purposes, has associated costs. It makes it easy to transfer strategic know-how outside of the company for unscrupulous reasons. It is costly to codify knowledge and build repositories. Furthermore, we can witness information overload, where large directories of codified knowledge are never used because the volume of gathered information is overwhelming.

Codified knowledge should be gathered from a variety of sources and made centrally accessible to all organizational affiliates. Exploiting centralized repositories facilitates trouble-free and speedy retrieval of knowledge, while eliminating duplication of effort at the departmental or organizational level, and for this reason cost is saved.
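Since the search capability is what makes a codified repository usable at all, here is the smallest possible sketch of the idea behind it: an inverted index mapping terms to documents. This is an illustration in Python with invented document data, not a description of any real repository product:

```python
def build_index(docs):
    # map each term to the set of document ids containing it
    index = {}
    for doc_id, text in docs.items():
        for term in set(text.lower().split()):
            index.setdefault(term, set()).add(doc_id)
    return index

def search(index, query):
    # return ids of documents containing every query term
    sets = [index.get(t.lower(), set()) for t in query.split()]
    return set.intersection(*sets) if sets else set()
```

Real repository search adds ranking, stemming, and access control on top, but every one of them starts from this term-to-document mapping.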

Transfer and Dissemination

One of the most prevalent barriers to organizational knowledge usage is an unproductive channel between knowledge supplier and knowledge seeker.

Logjams arise from causes such as physical or temporal distance, or the lack of incentives for knowledge sharing.


Ruggles (1998) conducted a study of 431 US and European companies, which showed that creating networks of knowledge workers and mapping internal knowledge are the two top missions for effective knowledge management.

Nowadays nearly every knowledge repository is web-enabled, providing the broadest dissemination over the World Wide Web or via intranets. Group Support Systems are also utilized to provide useful knowledge sharing; two of the prominent products are IBM’s Lotus Notes and Microsoft Exchange. The security of data sources is as important as user-friendliness, and both are considered when providing accessible knowledge repositories. Password usage and servers on secure platforms are important when the knowledge being made accessible is of a sensitive nature. Access mechanisms require user-friendly functions so that people actually utilize the knowledge repositories.

The exchange of explicit knowledge is comparatively straightforward via the electronic community. On the other hand, the exchange of tacit knowledge is easiest when we have a shared context, a coalition, and a common language of verbal and non-verbal cues. This enables high levels of understanding among members of the organizational atmosphere.

In 1995, Nonaka and Takeuchi identified the processes of socialization and externalization as methods of transferring tacit knowledge. Socialization keeps the knowledge tacit throughout the transfer, whereas externalization converts tacit knowledge into more explicit knowledge. Examples of socialization include on-the-job training and apprenticeships. Externalization, which comprises the use of metaphors and analogies to trigger dialogue between individuals, communicates knowledge, though portions of the knowledge are lost during the transfer. To promote such knowledge sharing, businesses ought to consider video and desktop conferencing as practical alternatives for knowledge dissemination.

Exploitation and Application of Knowledge

Employee use of knowledge repositories in everyday organizational performance is a key gauge of the system’s success. Unless people learn from knowledge and apply it, knowledge will never turn into innovation. The enhanced capability to collect and process data, or to communicate through electronic devices, does not on its own necessarily lead to improvement of human communication or action.

The notion of communities of practice fostering knowledge sharing and exploitation has recently attracted interest around the world. Brown and Duguid, in 1991, argued that a significant task for organizations is to recognize and support existing or emerging communities. A great deal of knowledge exploitation and application occurs within team environments, including workgroups in organizations; this support is necessary for success.

Davis and Botkin in 1994 summarized six traits of knowledge-based businesses. The traits include:

  1. The more customers employ knowledge-based offerings, the more intelligent they become.
  2. Knowledge-based products and services adjust to changing circumstances.
  3. Knowledge-based businesses can customize their offerings.
  4. Knowledge-based products and services have relatively short life cycles.
  5. Knowledge-based businesses react to customers in real time.