Monday, May 14, 2007

What are the Differences Between SQL Server 2000 and SQL Server 2005?

I've been asked this question every time there's a new version, and yet I've never been able to give what I think is a nice, concise, logical answer that satisfies the asker. It's probably my inability to easily form words and get them out in the proper order, so I decided it might make sense to do this on paper (metaphorically speaking) and help others out.

Like many of you, I usually get this question from someone outside of the SQL Server world: a Windows admin, a network guy, etc., someone who has little contact with SQL Server. Or maybe it's someone who's been stuck admin'ing a SQL Server instance.

In any case, I wanted to try to explain this concisely for the non-DBAs. As I began this project, however, I soon realized that it's not easy to give a good general answer. As with everything else in SQL Server, it seems that "it depends" is the best general answer, so I broke this up into a few areas. This part will look at the administrative differences and the next will cover more of the development differences.

The Administrative Differences

To me, administering a SQL Server instance means making sure the server service runs efficiently, remains stable, and allows clients to access the data. The instance should keep data intact and function according to the rules of the code implemented, while being well maintained.

Or for the non-DBAs, it means that you are the sysadmin and it just works.

The overall differences are few. Sure, we use Management Studio instead of Enterprise Manager, but that's not really a big deal. Really, many of the changes, like being able to change connections for a query, are superficial improvements that don't present a substantial change. If you think they do, you might be in the wrong job.

Security is one area that is a very nice improvement. The separation of the schema from the owner makes administrative changes easier, and that is a big deal: it greatly increases the chance that you won't keep an old account active just because it's a pain to change owners on objects. There's also more granularity and easier administration when using the schema as another level for assigning permissions.
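
Here's a minimal sketch of what this looks like in T-SQL, using hypothetical schema, table, and role names. Permissions are granted at the schema level, and ownership can be reassigned without touching the individual objects:

    CREATE SCHEMA Sales AUTHORIZATION dbo;
    GO
    CREATE TABLE Sales.Orders (OrderID int PRIMARY KEY, Amount money);
    GO
    -- One grant covers every object in the schema, now and in the future.
    GRANT SELECT, INSERT ON SCHEMA::Sales TO SalesReaders;
    GO
    -- If the schema's owner leaves, ownership moves in a single statement,
    -- so the old account can be dropped without reworking every object.
    ALTER AUTHORIZATION ON SCHEMA::Sales TO dbo;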

Another big security change is the ability to secure your web services using certificates instead of requiring authentication with a name and password. Add to that the capability to encrypt data and manage the keys, and you can make a big difference in the overall security of your data. You still have to carefully ensure your application and access are properly secured, but just the marketing value of encryption when you have credit card, financial, or medical data is huge. SQL Server 2000 had no real security features for data, allowing an administrator to see all data. You could purchase a third-party add-on, but it was expensive and required staff training. Not that you don't need to learn about SQL Server 2005, but encryption should be a skill that most DBAs will learn and be able to bring to your organization over time.
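
As a rough sketch of the built-in key management (the names, password, table, and columns here are all hypothetical), a column can be protected with a symmetric key that is itself protected by a certificate:

    CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Str0ngPassw0rd!';
    CREATE CERTIFICATE CardCert WITH SUBJECT = 'Credit card protection';
    CREATE SYMMETRIC KEY CardKey
        WITH ALGORITHM = TRIPLE_DES
        ENCRYPTION BY CERTIFICATE CardCert;
    GO
    OPEN SYMMETRIC KEY CardKey DECRYPTION BY CERTIFICATE CardCert;
    -- Only principals with access to the key and certificate can read the plain text.
    UPDATE dbo.Customers
       SET CardNumberEnc = EncryptByKey(Key_GUID('CardKey'), CardNumber);
    SELECT CONVERT(varchar(25), DecryptByKey(CardNumberEnc)) AS CardNumber
      FROM dbo.Customers;
    CLOSE SYMMETRIC KEY CardKey;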

High availability is becoming more and more important to businesses of all sizes. In the past, clustering or log shipping were your main choices, but both were expensive and required the Enterprise Edition. This put these features out of the reach of many companies, or at least out of many DBAs' budgets. With SQL Server 2005, you can now implement clustering, log shipping, or the new Database Mirroring with the Standard Edition. With the ability of Database Mirroring to use commodity hardware, even disparate hardware between the principal and mirror databases, this is a very reasonably priced solution for almost any enterprise.
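
To give a sense of how little is involved, here is a minimal sketch of establishing a mirroring session, assuming hypothetical server names, that the mirroring endpoints already exist, and that the mirror copy has been restored WITH NORECOVERY:

    -- On the mirror server:
    ALTER DATABASE Sales SET PARTNER = 'TCP://principal.example.com:5022';
    -- On the principal server:
    ALTER DATABASE Sales SET PARTNER = 'TCP://mirror.example.com:5022';
    -- Optional witness server, which enables automatic failover:
    ALTER DATABASE Sales SET WITNESS = 'TCP://witness.example.com:5022';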

There are also online index operations, online restores, and fast recovery in the Enterprise Edition that can help ensure you take less downtime. Fast recovery especially can be an important feature, allowing the database to be accessed as soon as the undo operations start. With a lot of open transactions when a database is restarted, this can add up to significant amounts of time. In SQL Server 2000, you had to have a complete, intact database before anyone could access it. With redo/undo operations sometimes taking a significant amount of time, this could delay the time from Windows startup to database availability by minutes.
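
As a small illustration (the table and index names are hypothetical), an Enterprise Edition index rebuild can now leave the underlying table available for reads and writes while it runs:

    ALTER INDEX IX_Orders_CustomerID ON dbo.Orders
        REBUILD WITH (ONLINE = ON);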

Data sizes always grow, and for most companies performance is always an issue on some server. With SQL Server 2000, you were limited to using 2GB of RAM and 4 CPUs on the Standard Edition. The number of CPUs hasn't changed, but you can now use as much RAM as the OS allows. There also is no limit on database size, not that the 1,048,516 TB limit in SQL Server 2000 was much of a restriction. Since RAM is usually a limiting factor in the performance of many databases, this is something you can take advantage of by upgrading to SQL Server 2005. SQL Server 2005 also has more options and capabilities on the 64-bit platform than SQL Server 2000.
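
A quick sketch of letting a Standard Edition instance take advantage of the extra RAM after an upgrade (the 16000 MB cap is only an illustrative value; pick one that suits your server):

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 16000;
    RECONFIGURE;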

Why Upgrade?

This is an interesting question and one I've been asked quite a bit over the last 18 months since SQL Server 2005 has been released. The short answer is that if SQL Server 2000 meets your needs, then there's no reason to upgrade. SQL Server 2000 is a strong, stable platform that has worked well for millions of installations. If it meets your needs, you are not running up against the limits of the platform, and you are happy with your system, then don't upgrade.

However, there is a caveat to this. First, the support timeline for SQL Server 2000 shows mainstream support ending next year, in April 2008. I can't imagine that Microsoft wouldn't extend that given the large number of installations of SQL Server 2000, but with the next version of SQL Server likely to come out next year, I can see this being the point at which you cannot call for regular support. The extended support timeline continues through 2013, but that's an expensive option.

The other consideration is that with a new version coming out next year, you might want to just start making plans to upgrade to that version even if you're happy with SQL Server 2000. If the plan is to release a new version every 2-3 years, you'll need to upgrade at least every 5-6 years to maintain support options.

In any case, if the application you are upgrading is from a third party, be sure it is supported on SQL Server 2005.

Lastly, if you have multiple servers and are considering new hardware for more than one of them, it might make sense to look at buying one large 64-bit server and performing some consolidation. I might recommend that you wait for the next version of SQL Server if you are worried about conflicts, as I have heard rumors of switches to help govern resource usage in Katmai (SQL Server 2008).

A quick summary of the differences:

Feature | SQL Server 2000 | SQL Server 2005
Security | Owner = schema; hard to remove old users at times | Schema is separate; better granularity in easily controlling security; logins can be authenticated by certificates
Encryption | No options built in; expensive third-party options with proprietary skills required to implement properly | Encryption and key management built in
High Availability | Clustering or log shipping require Enterprise Edition; expensive hardware | Clustering, Database Mirroring, or log shipping available in Standard Edition; Database Mirroring can use cheap hardware
Scalability | Limited to 2GB RAM, 4 CPUs in Standard Edition; limited 64-bit support | 4 CPUs, no RAM limit in Standard Edition; more 64-bit options offer chances for consolidation

Conclusion

These seem to be the major highlights from my perspective as an administrator. While there are other improvements, such as the schema changes flowing through replication, I'm not sure that they represent compelling changes for the non-DBA.

In the next article, I'll examine some of the changes from a developer perspective and see if any of those give you a reason to upgrade.

And I welcome your comments and thoughts on this as well. Perhaps there are some features I've missed in my short summary.

The Secrets of Great Due Diligence

This is a continuation of the self-study material I have gathered during my exploration of new ideas. My intention is just to compile related material in one place for everyone.
Food for thought.



With Thanks:
A Harvard Business Review excerpt.



Sealing the deal is the easy part. But first comes due diligence. Here's how to calculate your target's stand-alone value.


Deal making is glamorous; due diligence is not. That simple statement goes a long way toward explaining why so many companies have made so many acquisitions that have produced so little value. Although big companies often make a show of carefully analyzing the size and scope of a deal in question—assembling large teams and spending pots of money—the fact is, the momentum of the transaction is hard to resist once senior management has the target in its sights. Due diligence all too often becomes an exercise in verifying the target's financial statements rather than conducting a fair analysis of the deal's strategic logic and the acquirer's ability to realize value from it. Seldom does the process lead managers to kill potential acquisitions, even when the deals are deeply flawed. [...]

What can companies do to improve their due diligence? To answer that question, we've taken a close look at twenty companies—both public and private—whose transactions have demonstrated high-quality due diligence. We calibrated our findings against our experiences in 2,000-odd deals we've screened over the past ten years. We've found that successful acquirers view due diligence as much more than an exercise in verifying data. While they go through the numbers deeply and thoroughly, they also put the broader, strategic rationale for their acquisitions under the microscope. They look at the business case in its entirety, probing for strengths and weaknesses and searching for unreliable assumptions and other flaws in the logic. They take a highly disciplined and objective approach to the process, and their senior executives pay close heed to the results of the investigations and analyses—to the extent that they are prepared to walk away from a deal, even in the very late stages of negotiations. For these companies, due diligence acts as a counterweight to the excitement that builds when managers begin to pursue a target.

The successful acquirers we studied were all consistent in their approach to due diligence. Although there were idiosyncrasies and differences in emphasis placed on their inquiries, all of them built their due diligence process as an investigation into four basic questions:

  • What are we really buying?

  • What is the target's stand-alone value?

  • Where are the synergies—and the skeletons?

  • What's our walk-away price?

[Here] we'll examine each of these questions in depth, demonstrating how they can provide any company with a solid framework for effective due diligence. [...]

What is the target's stand-alone value?
Once the wheels of an acquisition are turning, it becomes difficult for senior managers to step on the brakes; they become too invested in the deal's success. Here, again, due diligence should play a critical role by imposing objective discipline on the financial side of the process. What you find in your bottom-up assessment of the target and its industry must translate into concrete benefits in revenue, cost and earnings, and, ultimately, cash flow. At the same time, the target's books should be rigorously analyzed not just to verify reported numbers and assumptions but also to determine the business's true value as a stand-alone concern. The vast majority of the price you pay reflects the business as is, not as it might be once you've won it. Too often the reverse is true: The fundamentals of the business for sale are unattractive relative to its price, so the search begins for synergies to justify the deal.

Of course, determining a company's true value is easier said than done. Ever since the old days of the barter economy, when farmers would exaggerate the health and understate the age of the livestock they were trading, sellers have always tried to dress up their assets to make them look more appealing than they really are. That's certainly true in business today, when companies can use a wide range of accounting tricks to buff their numbers. Here are just a few of the most common examples of financial trickery used:

  • Stuffing distribution channels to inflate sales projections. For instance, a company may treat as market sales many of the products it sells to distributors—which may not represent recurring sales.

  • Using overoptimistic projections to inflate the expected returns from investments in new technologies and other capital expenditures. A company might, for example, assume that a major uptick in its cross selling will enable it to recoup its large investment in customer relationship management software.

  • Disguising the head count of cost centers by decentralizing functions so you never see the full picture. For instance, some companies scatter the marketing function among field offices and maintain just a coordinating crew at headquarters, which hides the true overhead.

  • Treating recurring items as extraordinary costs to get them off the P&L. A company might, for example, use the restructuring of a sales network as a way to declare bad receivables as a onetime expense.

  • Exaggerating a Web site's potential for being an effective, cheap sales channel.

  • Underfunding capital expenditures or sales, general, and administrative costs in the periods leading up to a sale to make cash flow look healthier. For example, a manufacturer may decide to postpone its machine renewals a year or two so those figures won't be immediately visible in the books. But the manufacturer will overstate free cash flow—and possibly mislead the investor about how much regular capital a plant needs.

  • Encouraging the sales force to boost sales while hiding costs. A company looking for a buyer might, for example, offer advantageous terms and conditions on postsale service to boost current sales. The product revenues will show up immediately in the P&L, but the lower profit margin on service revenues will not be apparent until much later.

To arrive at a business's true stand-alone value, all these accounting tricks must be stripped away to reveal the historical and prospective cash flows. Often, the only way to do this is to look beyond the reported numbers—to send a due diligence team into the field to see what's really happening with costs and sales.

That's what Cinven, a leading European private equity company, did before acquiring Odeon Cinemas, a UK theater chain, in 2000. Instead of looking at the aggregate revenues and costs, as Odeon reported them, Cinven's analysts combed through the numbers of every individual cinema in order to understand the P&L dynamics at each location. They were able to paint a rich picture of local demand patterns and competitor activities, including data on attendance, revenues, operating costs, and capital expenditures that would be required over the next five years. This microexamination of the company revealed that the initial market valuation was flawed; estimates of sales growth at the national level were not justified by local trends. Armed with the findings, Cinven negotiated to pay £45 million less than the original asking price.

Getting ground-level numbers usually requires the close cooperation of the acquisition target's top brass. An adversarial posture almost always backfires. Cinven, for example, took pains to explain to Odeon's executives that a deep understanding of Odeon's business would help ensure the ultimate success of the merger. Cinven and Odeon executives worked as a team to examine the results of each cinema and to test the assumptions of Odeon's business model. They held four daylong meetings in which they went through each of the sites and agreed on the most important levers for revenue and profit growth in the local markets. Although the process may strike the target company as excessively intrusive, target managers will find there are a number of benefits to going along with it beyond pleasing a potential acquirer. Even if the deal with Cinven had fallen apart, Odeon would have emerged from the deal's due diligence process with a much better understanding of its own economics.

Of course, no matter how friendly the approach, many targets will be prickly. The company may have something to hide. Or the target's managers may just want to retain their independence; people who believe that knowledge is power naturally like to hold on to that knowledge. But innocent or not, a target's hesitancy or outright hostility during due diligence is a sign that a deal's value will be more difficult to realize than originally expected. As Joe Trustey, managing partner of private equity firm Summit Partners, says: "We walk away from a target whose management is uncooperative in due diligence. For us, that's a deal breaker."

Principles of Negotiating

27 Principles of Negotiating

Feb 1, 2003 12:00 PM, RCM Staff Report

Basics to Keep in Mind

  1. If you ask for something before a contract is signed, it's called “negotiating.” If you ask for something after a contract is signed, it's called “begging.” It's better to be a good negotiator than an expert beggar.

  2. From negotiator Chester Karrass: “You don't get what you deserve, you get what you negotiate.”

  3. From motivational expert Zig Ziglar: “You can get anything in life, if you help enough other people get what they want.”

  4. Everything is negotiable, but everything has a price.

  5. Quoted prices are invitations to buy, but not statements of value.

Important Fundamentals

  1. Terms are just as important as dollars. Many people focus on rates, dates, and space (the big three of meeting planning), but the other fine print — such as liability and attrition — can have just as much importance. These things will translate into dollars.

  2. Negotiate at the proper authority level. Negotiate with the person who can say “yes.” Don't let your negotiation get lost in the translation. You don't want to have to negotiate it more than once. Ask to negotiate with someone who has the authority to go “off the script” or the rate card. Refuse to negotiate with someone who doesn't have that authority.

  3. If you want something, ask for it. Good negotiators do not put their best terms on the table first.

  4. Focus on the relationship. It's important that the relationship is still there once you're through with the negotiations. You don't want to get to the end of an agreement and never want to see each other again.

The Four Unwritten Rules

In every negotiation, there are four unwritten variables. All four exist in every negotiation, whether or not you know or understand them.

  1. Power

    This is the ability to get the other side to do things in the way you see favorable. The top two power sources are competition and the printed word. If a hotel knows that four other hotels in town want your business, then that hotel likely will want your business, too. Hotels play that game, too. They try to get more than one group interested in the hotel. And remember: Always question the printed word. Printed rates are not final rates.

  2. Time

    Ninety percent of the negotiating happens in the last 10 percent of the time allotted. Negotiating will go on forever unless one side imposes a deadline. The corollary is that time works against the person who doesn't have it. Never reveal your real deadline, and never negotiate when you're in a hurry.

  3. Knowledge

    Knowledge is a combination of expertise and information-gathering regarding the wants and needs of the other side. How and when is the person you're dealing with evaluated? How experienced is the person? What's the hotel's average daily rate, its peak season, and does it have other customers who want the same dates?

  4. Leverage

    Leverage is your ability to get the hotel to want your business and to give you favorable terms.

Negotiating Gambits

Beginning Gambits occur at the start of negotiations.

  1. The Flinch

    Most religious meeting planners are born with this: the ability to express shock and dismay at what the other side is presenting. This technique forces the other side to adjust.

  2. Feel/Felt/Found Technique

    This is a way of acknowledging another person's feelings without giving any ground. It's also a way to disagree without being disagreeable. Here's the script: “I understand how you feel. Others have felt the same way, but when they have found out more about us, they have come around.”

  3. First Offers

    The general rule is to never accept the first offer.

  4. The Vise

    The purpose of the vise is to squeeze the price range up or down in your favor. When someone names a price, you say: “You'll have to do better than that.” But be prepared for the response: “How much better do I have to do?”

Middle Gambits occur during the middle of negotiations, the point at which most negotiations begin to stall. Middle gambits are used to keep things going, assuming that you want to do business with this party. There are two basic techniques.

  1. The Trade-Off

    Never give a concession without getting a concession. This is the secret to keeping a negotiation balanced. It keeps the other side from nibbling you to death. They know they'll have to give up something for everything they get.

  2. The Set-Aside

    When you're deadlocked on an issue, set it aside and come back to it after you've reached agreement on the easier issues. Why leave the toughest issues for last? Because by the end of negotiations, the process has momentum and both sides will have the motivation to be flexible.

Ending Gambits are the end games.

  1. BATNA

    When you reach the end and are asking yourself if you should go through with what you've negotiated, ask yourself: “What's my Best Alternative To a Negotiated Agreement?”

  2. The Walk-Away

    Your ability to negotiate is tied to your ability to walk away from the deal. This is why you want to give yourself options.

Requirements Elicitation

Summary

An overview of the Requirements Elicitation problem, emphasising its difficulties, is followed by a description of the Active Structure approach to generating and maintaining complex specifications.

Introduction

Requirements Elicitation might be described as eliciting a specification of what is required by allowing experts in the problem domain to describe the goals to be reached during the problem resolution. On this view, we may be limited to having a vague desire at the beginning of the process, such as "We want a new aircraft carrier", and at the end of the process having a detailed description of the goals and some clear idea of the steps necessary to reach them.

There are several things wrong with this description. Where does the logical model reside - in people’s heads? Is there an expert with sufficient breadth and depth of domain knowledge to ensure the goal and all its subgoals are consistent and achievable? If there is not, are we merely leaving to the design stage the process of systematising the subgoals? It is very likely that many goals will be inconsistent, even deliberately contradictory. Can we say the Requirements Elicitation stage is complete while this is so? We certainly can if our methods for supporting the elicitation have no means of establishing the consistency of the goals, or even describing many of them. How precise do we need to be in specifying our goals? Is there anything left to do in the design stage, or have we gazumped it by not having a method of description that permits variability in the requirements?

Let’s use the example of a helicopter. Before the first helicopter had flown, and people had become familiar with its performance envelope, how successful would requirements elicitation have been in eliciting a consistent and achievable set of goals for an airborne vehicle which could hover, move up, down, sideways, even backwards? The models in people’s heads would have veered among a hot air balloon, a hummingbird, a hovering hawk, and a winged craft of the time, like a DC3. Until at least a rough performance envelope appeared, to shape and limit what people thought, requirements elicitation would have been unlikely to succeed if it had relied only on the users (what users?) to provide a consistent mental model. Does it turn out that Requirements Elicitation only works well when people know almost exactly what they want, and hardly works at all when there is significant design required to move from a concept with very hazy boundaries to an object within the limits of current or very near-term technology?

Let’s go back to the aircraft carrier. It’s 20 years since we built one. How do we gather experts in the design, building and use of such an object? Are the mental models of at least some of the experts mired in the past, in terms of ship propulsion, low speed abilities of fighter aircraft, weapons systems command and control, vulnerability to attack? We can mix in experts in new fields with no integration experience, but how do we get a consistent mix of old and new? One way is to build an Active Structure as we gather requirements. Some typical requirements:

Role - why a sea based platform is necessary

Capabilities - number and type of aircraft, range, speed, sea keeping.

Survivability - who is attacking it and its permissible failure space

How much

How long to service

Service life

Are there requirements, like why we are building it, that we won’t attempt to formalise, or even mention? Will requirements that no-one chooses to mention bring the project down later on? How far will we go out into the larger system of which this is a component to understand and validate the decision-making?

The area receiving the most attention for Requirements Elicitation is software systems. We do other types of complex systems badly, but their physical reality restricts the grossness of the errors we commit. Software systems can be more complex than any other system we attempt to build, and our difficulty in visualising their behaviour can lead to the grossest design errors. Even if we do them well, their nature allows a good idea to radically alter the topology of their structure, invalidating much of the analysis that was supporting them.

The end result of eliciting requirements needs to be a compromise among competing requirements. Everyone in the group may wish to have a voice, but this may leave us with a mishmash, disenfranchising those who are not present at a later point where consistency is enforced. The sooner we can impose some discipline on the requirements, the sooner people are pushed to expose what it is they really need. A way to do this is to ensure we do not allow inconsistent requirements to propagate in parallel for very long. The longer they propagate, the more elaboration, consensual agreement and frozen structure around them, the harder they are to root out. A planning system which makes no attempt to model consequences of decisions contributes to the management problem by allowing user experts to build expectations on poor foundations.

Imposing a Structure

There have been many attempts to impose some particular structure on the high level planning process. The chosen structures are often those that people happen to be familiar with, whether or not they are relevant to the problem - from decision trees to expert systems to object oriented hierarchies. While it is certainly worthwhile to systematise what we do, the imposition of a rigid structure can drive the planning process in undesirable directions. As an example, a rich and detailed hierarchy for motor vehicle design may be specifying the glove box material before the engine placement or even type is decided. The apparent rapidity and completeness of the design process masks the fact that the design was already implicit in the structure we adopted - that is, no new design above the trivial level can occur because cross connections are not permissible in the hierarchical structure. Successful designs come from a rethink of the requirements to avoid what seems necessary but is not, or from a realisation that new materials or configurations or processes are economically viable - a nonstick frypan, an east-west engine, the cellular phone. The topology of a logical model of the requirements is so fluid in the early stages of specification that any attempt to impose an alien structure is doomed to failure.

It may be that the requirements elicitation process is of its nature limited to copying an existing design or choosing among a few well known alternatives. Then a simple predetermined structure may be possible. Even here, the structure will need to adapt as competing requirements are compromised out. In all but the simplest cases, the variability in the topology defeats rigid preconceived notions of a directed decision structure, because there is no stage to which we may not be driven back in the search for a solution. There is a structure, the structure of the relationships that make up the problem. Using it may require us to think more flexibly than we might like, because nothing is certain.

The Planning Spectrum

The initial requirement starts at one, "We want a new carrier", has some outline by the time the number of requirements is ten or fifty, and the final detailed requirements might number fifty thousand and require continual maintenance through the entire planning process. What method is available to support the move from one end of the spectrum to the other, or at least to the point where all variability has gone? Do we need a tool for Requirements Elicitation, another for high level design, and another for monitoring development? All of these steps affect the requirements.

In the later stages we may not be able to handle the sheer weight of detail in a flexible way, but that does not mean we cannot handle the control aspects with the same tool that supported the RE step. We are also ignoring here the step before Requirements Elicitation, the step that added the proposal to the program in the first place. Planning support can extend back to encompass this stage as well. At each stage boundary, we are passing across, and can pass back, a web of constraints on what is proposed at the particular stage.

What Are The Requirements

Why not elicit the requirements for the Requirements Elicitation process?

We would like to build a logical/existential model as we go, with continuous Truth Maintenance so inconsistent requirements are weeded out as soon as possible.

We need a language of approximate discourse with designers and other stakeholders. We want to tell people where we would like to be on the cost/capability curve, not screw everything down to precise values that take on the status of holy writ, then turn out to be unachievable or wrong.

We should be able to quantify things, but may wish to approximate, so a range can be accepted and used for computation of alternatives. We might be specifying integers, 40-70 aircraft, or real numbers, 123.3-140.7 tons of fuel per hour.

Not everything we want to specify will be analysable - sometimes it will be preferable to use stochastic methods of specification instead of researching relationships, sometimes we will have no choice. We should be able to smoothly integrate analysis with piecewise approximation and probabilistic methods.

What structure should we use? Why not just use the structure of the requirements, since any other structure will either be restrictive or will need to change as we go along? Adding requirements changes the topology of the structure. If we choose to link aircraft numbers to hull speed, that is what we want. There may be a carrier solution with 40 planes and 50 knots, as well as the conventional solution with 75 planes and 20 knots, but we want to rule out 40 planes and 20 knots.
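
As a flat illustration only (not the Active Structure network itself, and with one possible encoding of the link using the numbers above as hypothetical bounds), such a requirement behaves like a constraint that admits either configuration while rejecting the unwanted combination:

    -- Candidate configurations, with the combined constraint applied:
    SELECT Planes, Knots
      FROM (SELECT 40 AS Planes, 50 AS Knots
            UNION ALL SELECT 75, 20
            UNION ALL SELECT 40, 20) AS Config
     WHERE NOT (Planes <= 40 AND Knots <= 20);   -- rules out the 40-plane, 20-knot case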

We may wish to hinge one requirement off another or link them together in some way (if this, then that), but still at the tentative stage, exercising logical or existential control over one or more requirements through the outcomes of other requirements, and vice versa.

The requirements may start off planar, a few scribbled on a whiteboard, but will rapidly develop a layered structure, where subgroups of experts are defining requirements for reasonably independent components. The highest level requirements that we are working on may well turn out to be a component of some even higher level requirement (someone has to sell the acquisition budget, of which our carrier is a part), so the language of requirements description should permit unlimited layering, both up and down from where we notionally started.

We should be able to link across components from any level to any level. There is a weak form of inheritance in the structure - the propulsion inherits the weight and the hull shape, but in most cases the inheritance is circular - the weight is assumed to be in the range of the proposed propulsion. It is the long range connections among largely independent entities that are the most valuable to describe, as they are likely to be least understood by experts in the particular specialty.

We would like to start with one requirement and reach tens of thousands. As the outlines firm up, we want to keep the potential variability in the components. Occasionally we will need to back out of a dead end, where what we thought possible is found not to be so within the cost/time frame allowable. That means requirements that have long since been frozen anywhere in the overall specification may need to become as malleable as they were at first, while we search for another consistent configuration to which to jump. This is typical of design - there are small islands of success in an ocean of failure.

Time and money are fundamental requirements, just as much as speed and range. Their specification should not have the rigidity of conventional project planning methods, certainly not while we are determining whether an outcome is even possible.

The requirements should encompass everything that will lead to success or failure, and allow a wide range of stakeholders to see that their requirements are influencing the outcome.

The Network Approach

A logical network (we mean a network combining logic and existence and time) of analytic operators acting as undirected agents would seem to meet all of the requirements stated. This assumes that analysis of the requirements is worthwhile or even possible. Some of the requirements will not be analysable - they may still be modellable in an analytic system, using approximation methods.

The structure of the logical network comes only from the structure of the statements used to represent the requirements. The statements link objects through analytic operators under logical control. A seemingly simple change to the requirements can drastically change the topology of the network by causing two objects in logical space that were notionally far away to be adjoining, just as it can completely recast the structure of the requirements.

The statements in the logical network can be seen as constraints - that is, after all, what requirements are. At an early stage, there is no sense of solving the constraints, but there is a sense of reasoning about them. The statements are more than just constraints; they form a web of logical control over what is being described. Statements can also add two numbers together to get a third - that is, they are extensible into areas that are not constraints. If we attempt to maintain consistency of specification with a paper-based approach, inconsistencies rapidly creep in, because someone or something else has to attend to all the connections.

Numbers in the network can be represented as singular values or ranges, either of integers or reals, and these ranges can be propagated through the structure and used for calculation or logical reasoning.
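
As a small, flat sketch of the idea (not the network itself; the flying-hours range is a hypothetical figure added for illustration), the fuel-per-hour range mentioned earlier and a range on hours combine into a range on total fuel, which can then feed further calculation or logical tests:

    DECLARE @FuelPerHourMin decimal(9,1), @FuelPerHourMax decimal(9,1);
    DECLARE @HoursMin decimal(9,1), @HoursMax decimal(9,1);
    SET @FuelPerHourMin = 123.3;  SET @FuelPerHourMax = 140.7;   -- tons of fuel per hour
    SET @HoursMin = 8.0;          SET @HoursMax = 12.0;          -- hypothetical flying hours
    -- With positive ranges, the derived range runs from min*min to max*max:
    SELECT @FuelPerHourMin * @HoursMin AS TotalFuelMin,
           @FuelPerHourMax * @HoursMax AS TotalFuelMax;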

Time and money can be represented in a flexible way, using ranges and logical control of variability. Alternatives and contingencies fit easily into the network structure, just as they should in any well thought out set of requirements. [1].

Previous papers (see Technical Discussion) have described various species of network operators - logical structure, simple and complex analytic operators. Other operators in the network can store distributions and correlations between variables that have been found by reading databases resulting from scenario analysis. These operators go through a learn-store-output cycle, and respond to changes in their probability control by changing their outputs from a range encompassing all alternatives through to singular outputs.

Layering of Knowledge

Every field of knowledge has its specialties - whether medicine, aircraft design or physics. There are areas which are largely independent, but still need a few connections to other specific areas or to general facts. This problem of structuring knowledge might be pictured as a set of nested shells.

There are conduits at interfaces, the surfaces of these shells of knowledge, with still a need to penetrate the shell for a more detailed connection, one that had not been thought of when the boundaries of the particular knowledge were established. The boundaries keep changing as the knowledge contained in the shells changes - too many intrusive connections and the boundaries need to be re-established to minimise intrusion on the specialist knowledge, but intrusions there will be. For a detailed specification of a complex entity, fifty levels of knowledge shelling can be easily reached.

The environments in a logical network provide unlimited layering, while retaining the ability to connect in an undirected way across any boundary.

Conclusion

Requirements Elicitation demands flexibility of description.

The undirected logical network of Active Structure can provide support across the stages of Requirements Elicitation, high level design, development, manufacture. It can do this because it uses the structure of the problem, not some preconceived directed structure. The lack of directionality in its connections allows anything to be the current subgoal of its analysis. The undirectedness provides Consistent Reasoning, that is, maintenance of consistency, throughout the structure. The network propagates messages to operators which can change the topology of the network and show the results of decisions and new requirements at every stage of the planning process. It appears to be an appropriate support tool for any phase of high level planning.

The Next Step

Requirements Elicitation can be extended to the reading of text - this allows people to describe, in a flexible way, exactly what it is that they require. The machine converts what it reads into a form capable of extension by other text, and it becomes capable of checking for validity among several descriptions at different levels of granularity - see The “Swing Space” – Understanding the meaning of a complex paragraph (the example is for property leases, but the principle is universal). There is a cost for preparing the machine to read in a particular domain, but where the cost of what is described runs to many millions or billions, this cost is trivial.