SharePoint and SOA Architectures

SharePoint, as a communications and collaboration solution, provides robust facilities for building out scalable, extensible architectures that conform to SOA. We will begin by defining how we use the information architecture terms information management, document management, and content management. Because these terms are often treated as interchangeable, it is best to clarify the definitions as follows:

  • Document Content Typically, the human mind separates a document from its actual contents. For example, instead of saying, "I read what you wrote," we say, "We read the document." We are conveying that we not only saw the document but also read its contents.
  • Document Containers Because of computers, the human mind has begun to regard documents and files as two different items. Instead of saying "The data is in that file," we say "The required data is contained within that document." We have begun to distinguish the container from the document itself.
  • Document Vehicles The document has become a mode of transportation for its content. In the age of electronic files, documents travel easily via email, so instead of extracting a specific piece of information to send electronically, you can send the entire document.

The three descriptions above present three different document concepts. A document can hold digital images or information; however, without context it is nothing more than a collection of data. For this reason we define a document as a container that serves as an information vehicle, stored in a SharePoint document library. When building an information architecture with SharePoint that leverages XML web services, a document is, on its surface, a river of characters; it is the internal structures that turn those characters into an information vehicle for the data they contain.

The main issue is that the human mind tends to be far too restrictive when defining a document. We tend to understand a document as something written on a piece of paper, or its electronic equivalent. In fact, a document is not so limited; it can be a wide variety of things, such as instant messages, forum messages, e-mails, processes, graphics, and many other forms of data.

You see, only a small portion of the information we gather comes from the typical document. We tend to gather and learn more through other means, such as meetings, discussions, and email. Yet we tend to learn only what those sources give us; we rarely step outside their context to gather information. Items such as a sales report, an email thread, or a meeting can all contain knowledge that is extremely valuable.

In general, companies offer a variety of manuals, guidelines, and visual guides written to steer users in the direction the company would like to go. This helps ensure that nothing is forgotten or left out, because the documentation is always present. It is true that people tend to retain knowledge better with visual aids; this holds not only for printed or written words but for images as well. We develop visual cues that guide us as we read and help us retain and understand the information. Cues act as signals for understanding the information the way its creator intended. For example, if the creator of a document emphasizes a word using bold, italics, or underlining, it draws the reader's attention to that specific area. If a writer uses all three at once, the reader knows to pay particular attention to that sentence.

When we use visual aids within the structure of a document, we enable the mind to process and understand the content more readily. However, it is important to use these aids with caution: the format requires that the reader be able to recognize the cue and understand it. With written words, people generally make use of captions, emphasis, titles, and headings, things the mind typically understands. For other readers, understanding depends on knowing the vocabulary used within the document. Take a computer, for example: if you give it a workflow diagram, it will not understand the cues you are giving it. It needs to know how to recognize that vocabulary and what to do when it encounters it. This is the same scenario as markup: you are giving the processor a cue to the action or actions you need it to perform on the content.
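To make the markup analogy concrete, here is a minimal sketch in Python; the tag names and rendering rules are illustrative assumptions, not part of any SharePoint schema. The tags are the cues, and the processor recognizes each one and performs the action it implies:

```python
import xml.etree.ElementTree as ET

# Hypothetical markup: the tags cue the processor to act on the content,
# just as bold or italics cue a human reader.
doc = """<report>
  <title>Quarterly Sales</title>
  <para>Revenue grew in <emphasis>every</emphasis> region.</para>
</report>"""

def render(elem):
    """Walk the tree and act on each cue the markup vocabulary provides."""
    if elem.tag == "title":
        return (elem.text or "").upper()      # a title cue: set it apart
    if elem.tag == "emphasis":
        return f"*{elem.text or ''}*"         # an emphasis cue: mark it
    text = elem.text or ""
    for child in elem:
        text += render(child) + (child.tail or "")
    return text

print(render(ET.fromstring(doc)))
```

The processor does nothing clever with the characters themselves; all of its behavior is driven by the cues the markup vocabulary provides, which is exactly the point of the paragraph above.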

Management of content requires information to be organized upstream. This has to happen in a way that lets the individual ingredients of the content keep their identities until the time of use. Doing so allows the client to select them individually or combine them as necessary. This type of organization gives the user a clear, precise choice, and it allows the information to be configured differently for reuse.

We will now look at a few closely related topics, beginning with upstream content organization and then the impact it has on authors.

Structure plays an important role in all types of documentation. For instance, when sending an e-mail to a colleague or client, we structure it in a way that is appealing to the reader. We spend time thinking of a catchy subject line and give great thought to the recipients. Whether you know it or not, this is structuring.

Looking at it in terms of content management, we begin to understand that the text within a document cannot be just a group of characters; instead, the content is made up of the following:

  • Specific content building blocks, or pieces of information
  • A reference or references to other pieces of information, within the same content or outside of it
  • Specified rules that state how the blocks can be created and how the content is configured
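As a sketch, these three ingredients (blocks, references, and configuration rules) might be modeled as follows; the class names, rule, and block ids are hypothetical, not part of SharePoint:

```python
from dataclasses import dataclass, field

@dataclass
class ContentBlock:
    """A building block of content that keeps its identity until use."""
    block_id: str
    body: str
    references: list = field(default_factory=list)  # ids of related blocks

# A simple rule stating how blocks may be combined into a configuration.
ALLOWED_ORDER = ["title", "summary", "detail"]

def assemble(blocks):
    """Combine blocks only in the order the rule allows; missing ones are skipped."""
    return "\n".join(blocks[i].body for i in ALLOWED_ORDER if i in blocks)

blocks = {
    "title": ContentBlock("title", "Q3 Sales Report"),
    "detail": ContentBlock("detail", "Revenue grew 4%.", references=["title"]),
}
print(assemble(blocks))
```

Because each block keeps its identity, the same blocks can be selected individually or recombined under a different rule for reuse, which is the upstream organization the paragraph describes.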

Packaging is just as important: each block of information must have a package that allows it to be stored, delivered, and moved without losing or damaging any of the content. In general, every block of information must be able to stand on its own; if the information is part of a larger whole, the relationships among all the parts need to be known.

The packages we create typically share properties and characteristics. It is important to maintain predictability in any type of content processing, so you must be able to understand and describe these characteristics in order to offer that predictability. To better understand this, consider a word processing template: to maintain a consistent model of use, you must pay close attention to every detail. By leveraging SharePoint and the SharePoint XML web services, you enable exactly this kind of predictability.

This type of packaging and content is useful in many ways today. Many people create items for users such as form letters, templates, and models. These are typically bundled with an application, or the user can create their own structure using tools built into the application, for example programming languages or macro editors. When users conform to these molds they establish structure, similar to database form fields, and all of the content becomes managed within the mold. No matter which tools or approach are used, the goal remains the same:

  • The ability to eliminate repetitive tasks when creating text of similar structure
  • The ability to maintain control over the way the information is provided.

The molds are typically created for use with presentations; however, many people feel they exist to receive content that is predefined and of an expected type. In general, they are created before the texts that will use them. If you find yourself using a structure feature provided by a commercial word processing package, be certain to use its design features rather than the defaults. Specifically, try to map their use to the structure within your content.

One of the first steps in content management is to identify the knowledge molds your enterprise uses or creates. They must be understandable, because if the content is not created correctly the first time, time is wasted repairing the damage.

Here are some important things you should remember:

  • Be certain the content works well with the mold. If it does not, do not break the mold; search for another mold that fits the content. For example, a mold created for an email will not fit a sales report.
  • Be certain the content's ingredients can still be identified; a user should be able to pinpoint the specific area of text that describes your process or holds your sales forecast.
  • Knowledge molds should create an environment that allows an author to concentrate on the content instead of the structure. Additionally, they should provide support for validating the content and adding any reference information.

Proper labeling allows a package to be identified correctly. Note that even when the labeling and packaging are clear, they say nothing about the quality of the content. It is important to understand and manage how your content finds its way into the package. You will need to manage who is allowed to put content in, how quality controls are applied, a required sell-by date, and what tells you that content processing is complete.


4 Basic Steps To SharePoint Business Compliance

One of the hotter topics that comes up when rolling out SharePoint in sensitive industry environments is the concept of business and regulatory compliance (it is something I am pretty passionate about anyway, and if you are in a vertical that is subject to a regulation, you should be too, whether you are an architect or a developer). This becomes a concern for organizations that want to meet some benchmark for operational and legal efficiency and excellence, and as a result need to implement a certain set of defined, empirical standards within the collaboration and communication framework that will achieve and maintain said standards. In the realm of SharePoint as a web framework this becomes a huge concern, since many organizations leverage it as an ECM (Enterprise Content Management) system, and the process of becoming compliant with an arbitrary standard is a relatively simple one.

Now, I bring up the concept of an ECM for an important reason. I am not restricting SharePoint to this very specific function; rather, I am pointing out where SharePoint functionality related to regulations becomes most ingrained. ECM within an organization is complicated, very complicated, as it integrates likely hundreds of preexisting processes which may or may not have their risks already defined. Therefore, when choosing to extract and exploit the ECM functionality of SharePoint for your business to leverage, you may be implementing only the basics of SharePoint; however, several sister controls will have to be developed to compensate for these ingrained processes.

1) Define Your Compliance Goals In The Realm Of SharePoint

The first action you have to take when implementing compliance standards in SharePoint is to understand the benchmarks you want to meet before, during, and following your deployment. This is when you look at the legal and business regulations your organization must adhere to within your specific industry vertical. These can include regulations like the Sarbanes-Oxley Act, which is geared towards fiscal accountability for companies listed on the United States stock exchanges; the United States Patriot Act (Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act), which requires regulations so that businesses are aware of whom they are doing business with (among several other required actions); and HIPAA (Health Insurance Portability and Accountability Act), which among several components protects the privacy of patient and insurance information. These are not the only regulations that can affect a SharePoint environment; there are several others. Among those I have run into on past SharePoint projects are:

– Sarbanes-Oxley Act (SOX Compliance)
– Healthcare Services (HIPAA)
– California Senate Bill No. 1386
– NERC Cyber Security Standards
– Financial Services (GLBA)
– Visa Cardholder Information Security Program
– MasterCard Site Data Protection Program
– American Express Data Security Standard

And I am sure there are others that people have run into, since the verticals I consult in are relatively small. Regardless, within this phase you really have to look at the purpose of your SharePoint environment and how it is going to be used within the realm of your business operations. You also have to step back and look not only at the business operations you are targeting for optimization, but also at the processes that could be affected while optimizing other processes (also known as "optimization ripples" within efficiency theories). When you set these goals, you have to ensure that there are actually buildable standards that will work within the SharePoint environment so that you stay within legal boundaries. Sometimes strict regulations will kill the option of having a collaboration framework, or make it fiscally unreasonable to even implement one. You have to consider all the ramifications of your organizational regulations, and then consider the limits and purpose of the technology. Start out with the limitations, then work your way towards optimizations, not the other way around. Flashy technology, however catchy, shouldn't be more appealing than sparing your organization a load of money and industry embarrassment.

2) Commit To Your Regulation, And Begin The SharePoint Pilot

In the first step, the actual compliance target was defined and the technology was studied at a high level for integration, so that a cohesive framework could actually be constructed. Now the rubber hits the road, and the assumptions made previously about SharePoint begin to assimilate and adhere to the regulations you found your organization was subject to. This portion of compliance planning involves a lot of people's fingers in the pot, because you have to involve people from several divisions, backgrounds, and talents. For example, it would be prudent to bring in the part of your legal department most familiar with your regulations, along with network security, so that operations can be verified at many levels, from security on the wire to visible surface information. If you don't have one, hire a consultant; it is worth the money compared to the legal costs.

The pilot is generally run in a network enclave or silo on a closed ring, so that each aspect of the relationship between SharePoint and the compliance rule you are examining can be scrutinized, studied, and disseminated. Throughout this phase, a heavy commitment to the technology can breed a lot of development effort from your programming team in order to tailor the framework to your regulation. This is important to realize: you can tailor a framework to a regulation, but you can never tailor a regulation to a framework; that is totally out of your control (unless you are the governing body of the regulation, I suppose). And why would you want to? You could be risking a huge financial obligation for your organization if you chose that route, so it is important to make the best technology baseline decisions that you can.

The most important thing to take from this is to use an enclave. If you aren't harnessing production data, you aren't at risk of breaching regulation, since you are on a closed segment. This pilot environment should remain throughout the lifecycle of SharePoint at your organization, to ensure that custom development efforts don't affect any legal regulations.

3) Pushing SharePoint From Pilot To Staging, Staging To Production

In the pilot phase you examined the framework at a more detailed level, got all the required business units involved (such as your legal and security teams), and pushed out a compliant pilot. Now that you are assured you are adhering to your regulation, you can begin to move the compliant framework into staging and, finally, into the production environment. Why is this a two-step process? There are several reasons beyond the scope of this particular blog post. Most importantly, you first need to assimilate the new regulation-bound SharePoint environment into the production network without touching the complete organizational user base and other sensitive systems on the same wire. While in staging, it is common to choose a small user base to pound the machine and attempt to break the compliance procedure. I can't stress how important this is, because this is when you define "responses", such as "scrubbing methods." A scrubbing method is a term pulled from military computing tactics; it basically means a piece of classified data has been posted to an unclassified network where it can be accessed by those who aren't appropriately cleared, and therefore the machine environment must be scrubbed. Several frameworks are subject to this type of activity, so it either has to be guarded against, or procedures must be defined to carry out clean-up activities when this illegal action is performed. Basically, you want a user base that will try their hardest to break the operations of the environment, to catch problems before it goes into production, and also to define all the cleanup procedures.
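As a toy illustration of the kind of response check a staging team might define (the labels, levels, and item format are assumptions for the sketch, not a SharePoint API), a spill check could flag any item whose classification exceeds what the site segment is cleared for:

```python
# Hypothetical classification levels, lowest to highest.
LEVELS = {"public": 0, "internal": 1, "confidential": 2, "classified": 3}

def find_spills(items, site_level):
    """Return names of items that must trigger a scrub response on this site."""
    limit = LEVELS[site_level]
    return [i["name"] for i in items if LEVELS[i["label"]] > limit]

staged = [
    {"name": "newsletter.docx", "label": "public"},
    {"name": "merger-memo.docx", "label": "classified"},
]
print(find_spills(staged, "internal"))  # → ['merger-memo.docx']
```

A real response would pair a detector like this with the cleanup procedure itself, but even the detection rule is worth defining in staging, before production data is at risk.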

After the selected user base has finished testing and you have satisfactory benchmarks of compliance for your system to go live (along with all the necessary response actions described above), the SharePoint environment can be pushed into production, where the final architecture is defined. This is generally when you move off the pilot, which is typically set up in a single-server configuration, into a production environment where you begin to use predefined network assets, such as attaching to the organizational SQL cluster. This migration can take many forms, depending on the development put forward in the staging and development phases and the other actions you took to maintain compliance in your SharePoint environment.

4) Maintaining Compliance In Your SharePoint Environment

Irrevocably tied to step 3, you have to maintain and administer your compliant SharePoint environment. This means more than the general administrative tasks you will encounter with any SharePoint environment. Just because you have a regulation-compliant SharePoint environment now does not mean you will still have one after 6 months of activity. This process is "living", in the sense that you will constantly have to compensate for new revisions of the software (such as Service Packs), arbitrary user activity, and extensions to the framework from your own custom development. Furthermore, achieving regulatory compliance one month does not mean the regulation will be the same in 2 days, 2 weeks, or 2 years. Regulations are always shifting, and therefore your SharePoint environment must also evolve and adapt to the changes they require. This type of shift can be partly compensated for by defining processes to tackle these adjustments in step 3; however, it is very difficult to know what changes lawmakers are going to put forward.

A lot of companies approach this effort by simply defining change templates that help to build up the change process and make the transition as smooth as possible as regulations are put forward.

This part also ties back into your pilot when developing extensions to the SharePoint framework, doing Line of Business (LOB) integration into your environment, or just doing simple custom component development (i.e. WebParts). Each of these efforts should be put through the same cycles the actual SharePoint environment went through, so that the integrity of the system can be maintained, tested thoroughly by the appropriate parties, and documented as much as possible in order to bring the customizations into objects like the previously defined change templates.

I know this is not a comprehensive guide to SharePoint compliance; that was not the aim of this post. It was solely meant to bring to light a simple process so that, before doing your SharePoint deployment, you have some foundational building blocks to think about and stay in line with business and legal regulations.


Native Cryptography in SP Doc Libs and Lists (RSA WebPart Test)

Privacy is important, especially in a collaborative environment such as SharePoint, where users are heavily empowered with actions that can directly affect sensitive files and file containers. The most typical way to protect privacy for file types is to implement a form of Multi-Level Access Control and/or native encryption components that are easy for users to understand, ones preferably built directly into familiar toolsets such as Office (as is the case with IRM).

End-to-end encryption solutions aren't native to SharePoint (outside of WRM / IRM, which aren't "native" in the sense that they aren't built into the product COTS; they require further configuration). Furthermore, there are a lot of side effects of going down a standard encryption road when examining a collaborative cryptographic solution, most of which we see when using IRM encryption envelopes. By a standard road, I mean using a client-level encryption solution (something running on a user desktop) with an arbitrary scramble method / encryption algorithm, by which I mean something like:

Symmetric Encryption (Block and Stream Ciphers):

– Rijndael
– Twofish
– Blowfish
– DES
– TripleDES
– RC2
– Vectors and Salts

Asymmetric Encryption:

– Diffie-Hellman
– ElGamal
– Elliptic Curve Techniques
– Paillier Cryptosystem
– PGP

and then re-uploading the document to a document library, whereby you are responsible for key maintenance between an unknown number of parties. The most obvious problem this spawns is that encrypted content cannot be read by the SharePoint gatherer, since it supports nothing besides plain-text reads.

For my final paper for my Masters (I am doing my degree in Applied Mathematics / Cryptography), I received approval from my professor to move forward with creating and verifying an end-to-end cryptography solution that solves all these issues and stays within the native SharePoint framework. I am not sure yet what it will look like, how it will be invoked, or any of its architectural considerations. My only stipulation is that it be easy to use, and directly available from within the SharePoint interface.

The aggregate objective of the solution, eventually (this is probably down the road a little bit), is to exhibit the feasibility of extending a native Microsoft Windows Server operating system and Microsoft Office SharePoint Server (MOSS) environment so that it enhances the global security of federal Multi-Level Security (MLS) systems, which are historically built on the Bell-LaPadula model. This technique should eventually leverage lattice-based MAC (Mandatory Access Control) Multi-Policy Access Control (MACM) object labeling controls, regulated by a comprehensive security policy implementation. The standard notion of Discretionary Access Control (DAC) that the Microsoft Server System currently provides is incontestably bound to native Windows identities, allowing users management of individual objects, expressed as P = S × O × A, where S translates to the set of subjects, O to an arbitrary set of objects, and A to the access modes. Using MAC/MACM instead promotes M = (S, O, A, SA, f, R), such that S is the set of subjects, O is the set of objects, A is the set of access modes when a subject accesses an object, S × O → {∅, {r}, {w}, {e}, {a}, …}, SA is the set of security attributes, f is the set of security functions, and R is the set of security rules that prescribe the constraint conditions on how subjects access objects. Using this model, it is possible to procure the federal notions of no-read-up and no-write-down.

The security attribute can be defined in tuple format, SA = {L, A}, such that L is the security level of an entity, denoting the security rights of the entity. Different security levels are comparable: defining ≤ as a partial order on the set of security levels, the security level of one entity can be compared with that of another through the security function f.

This promotes a no-read-up rule, denoting that a subject s ∈ S is granted read access to object o iff f(o) ≤ f(s). The no-write-down rule denotes that a subject s ∈ S is granted write access to object o iff f(s) ≤ f(o).
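A minimal sketch of the no-read-up and no-write-down rules in Python (the level names and numeric ordering are illustrative assumptions):

```python
# Partially ordered security levels; here a simple total order for clarity.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top-secret": 3}

def f(entity):
    """Security function: map a subject or object to its security level."""
    return LEVELS[entity["level"]]

def can_read(subject, obj):
    """No read up: read access is granted iff f(o) <= f(s)."""
    return f(obj) <= f(subject)

def can_write(subject, obj):
    """No write down: write access is granted iff f(s) <= f(o)."""
    return f(subject) <= f(obj)

analyst = {"level": "secret"}
memo = {"level": "top-secret"}
print(can_read(analyst, memo), can_write(analyst, memo))  # → False True
```

The asymmetry is the whole point of the model: a secret-level subject may write up into a top-secret object but may not read it, so information can only flow upward through the lattice.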

I started toying with the idea last night, just working on getting an RSA encryption engine to take a sample plain-text string from a TextBox child control. Most importantly, I spent some time building foundational abstract base classes to provide a multiple-precision shared library (for multiple-precision floating-point computations) and a library implementing Barrett's modular reduction method for RSA. Although the encryption exponent is pretty tiny for the test, the plain-text conversion using my shared components was successful; however, this is obviously only 10% of the battle (I was mainly interested in the web-based encryption performance, and I would like to eventually integrate instruction-level parallelism). What I ultimately plan on doing is building this concept into a complete multi-level system with the native encryption components. In aggregate, the solution should facilitate Enterprise Content Management (ECM) and Team Virtualization (eTV) using Microsoft technologies as the institutional foundation, using partitioned layers of classification as well as native encryption features. The purpose of the encryption functions within the Microsoft Office Server System will be to provide practical applied cryptographic functions (again, taking into account instruction-level parallelism and memory bandwidth) while maintaining the two most important concepts in cryptography: privacy and correctness. Below is my example SharePoint WebPart that I have been playing around with.

[Screenshot: the RSA encryption test WebPart]
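Separately from the WebPart itself, the Barrett reduction step mentioned above can be sketched with ordinary Python integers (my real component is a multiple-precision C# shared library, and the toy modulus here is nowhere near RSA size):

```python
def barrett_setup(m):
    """Precompute mu = floor(4^k / m) for a k-bit modulus m."""
    k = m.bit_length()
    return k, (1 << (2 * k)) // m

def barrett_reduce(x, m, k, mu):
    """Compute x mod m for 0 <= x < m^2 without dividing by m each time."""
    q = (x * mu) >> (2 * k)   # estimate of x // m, never too large
    r = x - q * m
    while r >= m:             # at most a couple of correction subtractions
        r -= m
    return r

m = 2957                      # toy modulus; real RSA moduli are thousands of bits
k, mu = barrett_setup(m)
x = 1234 * 2233
print(barrett_reduce(x, m, k, mu) == x % m)  # → True
```

The point of the method is that the expensive division by m is replaced by a precomputed multiplication and a shift, which is why it pays off inside an RSA exponentiation loop where the same modulus is reused thousands of times.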

I started to push the data back and forth between list items and, as I expected, encountered the universal issue of the gatherer not indexing encrypted items. I am pretty sure I can get around this with floated carrying key tokens, so that the gatherer will basically have a skeleton key to the documents it wishes to full-text index. Although this will need to be examined for security vulnerabilities, it is really the only feasible solution that comes to mind, and I think for most organizations it will be a good one. Although I was just pushing the data back and forth programmatically, I plan on making the encryption option an item-level context menu or something along those lines.

As always, good security software is open-source. When this project (which, for the paper / research thesis, is taking on the name CryptoCollaboration) reaches production and has been somewhat tested for my grading, it will be released to the public at no cost, probably under a GNU license.
