With reference to assignments 8 and 9, what characteristics does an analyst (you) examine when evaluating DFD quality?
An analyst must know how to define every process on a data flow diagram and how to keep breaking a data flow diagram down into more detailed ones. He or she must understand the underlying algorithm well enough to describe each process, know how to make decisions, and identify the decision variables involved. The analyst must also consider the number and locations of users, the processing and data access requirements, and the volume and timing of processing and access. He or she must be able to translate a process into a diagram, describe complex interactions as well as alternative approaches, and, like any systems-analyst author, know when the end has been reached.
But before I give details of the characteristics I examine when evaluating a data flow diagram, I would like to explain what a data flow diagram is, according to www.about-knowledge.com. A data flow diagram is a graphical tool that shows the processes, flows, stores, and external entities in a system; it shows the transformation of data within a system. A data flow diagram uses the following symbols. The process symbol carries a process number (which identifies the process), a locality (where the activity happens), and a process name. The process symbol represents the transformation of data: there must be data flowing into and out of the process, a process can have several inputs and outputs, and a process with no output is a null process. The data store symbol carries a data store number and a data store name; its function is to designate the storage of data in the diagram. Data flow symbols may appear in different shapes, and they signify the movement of data, not the movement of people or goods. A double-headed arrow, which would imply that two flows occur at the same time, is wrong, and the data flowing into a process is never identical to the data flowing out of it. The external entity symbol represents the sources and destinations of data: a source is the origin of data and a destination is its sink. There are dos and don'ts for external entities: external entities never communicate directly with each other, since such a flow would involve no process, and an external entity should not communicate directly with a data store, because external entities should not read or write the records of files and databases directly.

There are two main types of analysis: structured analysis and object-oriented analysis. The main phases of structured analysis are requirements determination, structured modeling, data modeling, and alternative solution generation. The main phases of object-oriented analysis are requirements determination, object-oriented modeling, data modeling, and alternative solution generation. The main sources from which the analyst can gather requirements are books, journals, people, current system documentation, websites, and articles. An analyst may also challenge business rules. For example, if an organization has a rule that it may only use standalone systems at all of its branches for security reasons, the analyst can convince it to use an online system with proper Internet security in place.
According to Conrad Weisert of Information Disciplines, Inc., Chicago:
The elements of a data flow diagram
A data flow diagram contains four kinds of symbol (a rough sketch of these as data structures follows the list):
1. Processes -- The only active elements. Processes cause something to happen. They have embedded descriptions, often in verb-object form. (Sometimes informally called "bubbles" because of their shape in an early version of SA.)
2. Terminators -- Represent users or other systems, i.e. entities outside the boundary of the system being described.
3. Data flows -- Composite data items (or objects) that pass either from any element to a process (input dataflow) or from a process to any element (output dataflow).
4. Data stores -- Holding places for data flows; often implemented by databases.
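To make these four elements concrete, here is a minimal Python sketch of how they might be held as simple data structures while reviewing a diagram. The class and field names, including the process number, locality, and name and the data store number and name mentioned earlier, are my own illustrative assumptions rather than part of any standard DFD notation.

from dataclasses import dataclass, field

@dataclass
class Process:
    number: str    # identifies the process, e.g. "1.2"
    name: str      # verb-object label, e.g. "Validate Order"
    locality: str  # where the activity happens

@dataclass
class Terminator:
    name: str      # user or other system outside the boundary

@dataclass
class DataStore:
    number: str    # data store number, e.g. "D1"
    name: str      # name of the data store, e.g. "Customer File"

@dataclass
class DataFlow:
    name: str      # the composite data item being moved
    source: str    # element the flow leaves
    target: str    # element the flow enters

@dataclass
class Diagram:
    processes: list = field(default_factory=list)
    terminators: list = field(default_factory=list)
    stores: list = field(default_factory=list)
    flows: list = field(default_factory=list)

Holding a diagram in a form like this makes the error checklist below easy to apply mechanically.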
Systems analysts apply this checklist to look for errors in their data flow diagrams (a sketch of an automated check appears after the list):
1. Every process must have at least one input dataflow. (Violators are called "magic" processes, since they claim to do something based on no input, not even a trigger.)
2. Every process must have at least one output dataflow. (Violators are called "black hole" processes, since their inputs are swallowed up for no reason.)
3. Every dataflow must connect two elements. One of them must be a process; the other can be a terminator, a data store or another process.
4. Each dataflow diagram should contain no more than six or seven processes and no more than six or seven data stores, and all the processes should be conceptually at the same level of detail. If a part of the system is too big or too complicated to describe in an easily grasped diagram, break it down into two or three lower-level diagrams. (We sometimes see hanging on an office wall a huge tour de force DFD that tries to describe an entire large system at a low level of detail with several dozen processes and convoluted intersecting dataflow arrows. That's not something to be proud of. It doesn't communicate to any audience.)
5. For every process, one of the following must eventually be true:
1. The description label is so simple and unambiguous that every reader will understand it in exactly the same way.
2. It is expanded or decomposed into a separate lower-level dataflow diagram that preserves exactly the same net inputs and outputs, but shows internal detail, such as data stores and internal processes.
3. It is rigorously described by a separate process specification (business rule, decision rule, function definition, algorithm, etc.).
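As a rough illustration of how rules 1 to 4 could be checked mechanically (rule 5 needs human judgment or a lower-level diagram), here is a small Python sketch. It assumes a diagram is passed in as a plain dictionary of element names and (source, target) flow pairs; the function name and the example contents are invented for illustration.

def check_dfd(diagram):
    """Return a list of rule violations for one data flow diagram."""
    errors = []
    processes = set(diagram["processes"])
    stores = set(diagram["stores"])
    flows = diagram["flows"]                      # (source, target) pairs

    sources = {src for src, _ in flows}
    targets = {dst for _, dst in flows}

    for p in processes:
        if p not in targets:                      # rule 1: no input dataflow
            errors.append('"magic" process: ' + p)
        if p not in sources:                      # rule 2: no output dataflow
            errors.append('"black hole" process: ' + p)

    for src, dst in flows:                        # rule 3: a flow must touch a process
        if src not in processes and dst not in processes:
            errors.append("dataflow %s -> %s does not involve a process" % (src, dst))

    if len(processes) > 7 or len(stores) > 7:     # rule 4: keep the diagram small
        errors.append("too many processes or stores; decompose into lower-level diagrams")

    return errors

# Illustrative diagram with deliberate mistakes: "Ship Goods" is disconnected,
# and one flow links a data store directly to a terminator.
example = {
    "processes": ["Validate Order", "Ship Goods"],
    "stores": ["Orders"],
    "terminators": ["Customer"],
    "flows": [("Customer", "Validate Order"),
              ("Validate Order", "Orders"),
              ("Orders", "Customer")],
}

for problem in check_dfd(example):
    print(problem)

A real review would of course still apply rule 5 by reading the labels and checking the process specifications by hand.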
The starting point: Context (level-0) diagram
The systems analyst begins by preparing the top-level DFD. This "context diagram" shows the entire system as a single process. Interactions with users and other external entities are shown as data flows.
The context diagram, although often almost trivially simple, serves two essential purposes:
* It clarifies to the user audience the analyst's understanding of the scope of the proposed system, the kinds of users the system will have, and the data going into and coming out of the system. A surprising number of misunderstandings are exposed at this early stage.
* It motivates and establishes a framework for the more complicated next level (below).
The system diagram (level-1 DFD)
After everyone agrees that the context diagram is correct and complete, the systems analyst examines the first-level breakdown of major functions. Most systems can be decomposed into between two and seven major areas.
The result is called the "system diagram". It gives a clear overview of the system and serves as a base for further decomposition.
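Because rule 5.2 above requires that a lower-level diagram preserve exactly the same net inputs and outputs as the process it expands, each decomposition step lends itself to a simple "balancing" check. The Python sketch below is only an illustration under that assumption; the helper names and the example flows are invented.

def net_flows(process, parent_flows):
    """Named flows into and out of one process on the parent diagram."""
    inputs = {name for name, src, dst in parent_flows if dst == process}
    outputs = {name for name, src, dst in parent_flows if src == process}
    return inputs, outputs

def boundary_flows(child_flows, child_elements):
    """Named flows on the child diagram that cross its boundary."""
    inputs = {name for name, src, dst in child_flows if src not in child_elements}
    outputs = {name for name, src, dst in child_flows if dst not in child_elements}
    return inputs, outputs

def balanced(process, parent_flows, child_flows, child_elements):
    return net_flows(process, parent_flows) == boundary_flows(child_flows, child_elements)

# Parent level: process "1 Handle Order" receives an order and returns an invoice.
parent_flows = [("order", "Customer", "1 Handle Order"),
                ("invoice", "1 Handle Order", "Customer")]

# Child level: the diagram that expands "1 Handle Order", with its own internal store.
child_elements = {"1.1 Validate Order", "1.2 Bill Customer", "Orders"}
child_flows = [("order", "Customer", "1.1 Validate Order"),
               ("valid order", "1.1 Validate Order", "Orders"),
               ("valid order", "Orders", "1.2 Bill Customer"),
               ("invoice", "1.2 Bill Customer", "Customer")]

print(balanced("1 Handle Order", parent_flows, child_flows, child_elements))  # True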
The end
The dataflow diagrams are complete when:
* Every process on every DFD complies with rule number 5 above.
* Every dataflow shown on every DFD is defined in the data dictionary.
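The second criterion can be checked mechanically: collect every dataflow name used on any level of the diagrams and compare it with the data dictionary entries. The short Python sketch below is illustrative only; the dictionary contents are made up.

# Entries that would normally live in the project's data dictionary.
data_dictionary = {
    "order":   "customer number + item list + delivery address",
    "invoice": "invoice number + order reference + amount due",
}

# Dataflow names collected from every DFD in the set.
flows_on_diagrams = {"order", "invoice", "shipping note"}

undefined = flows_on_diagrams - set(data_dictionary)
if undefined:
    print("Not complete; undefined dataflows:", ", ".join(sorted(undefined)))
else:
    print("Every dataflow is defined in the data dictionary.")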
There's more to come, but the remaining components of the system specification (or detailed user requirements documentation) have little or no effect on the functionality of the proposed system. Note that the information contained in these documents is essential not only as a foundation for building a custom application but also as a basis for evaluating and choosing a packaged application software product.
The analyst is the person most responsible for managing the new system project, and he or she should therefore possess and exhibit the following characteristics. The analyst should: question every aspect of the current and the new system; assume that everything is possible and not accept statements such as "This is the routine way to perform this type of job in our organization"; pay attention to every single answer, because each answer can generate many important questions about problems in the current system; approach each new system with a fresh mind rather than correlating it with previously developed systems; have excellent communication skills; identify and translate the problems of the current system; generate as many solutions to the problem as possible; explain the logic of his or her solutions; learn from mistakes; and not be biased toward either the customer or the software house for which he or she is working.

In developing a physical data flow diagram, an analyst must explore each process in more detail, maintain consistency between processes, follow a meaningful leveling convention, ensure that the data flow diagram clarifies what is happening in the system, always keep the data flow diagram's audience in mind, and evaluate the data flow diagram for correctness. The systems analyst must be able to communicate both in writing and orally, must get along easily with people, and must be a good listener who can react to what people say. The analyst must be knowledgeable about technology: he or she is not expected to know the intricacies of programming, but a decent general knowledge of concepts and terms is essential. The analyst must also be knowledgeable about business: he or she is not expected to be an expert in business, but a decent understanding of the client's world is required.
References:
http://www.docstoc.com/docs/16829052/ITEC-2010-Chapter-6---Data-Flow-Diagrams
http://www.about-knowledge.com/software-project-analysis-and-characteristics-of-an-analyst-and-types-of-analysis/
http://hubpages.com/hub/What-is-a-data-flow-diagram
http://www.idinews.com/life-cycle/dataflow.html