This project was made by a team of four members: Ghulam Mujtaba Khan (Admin), Muhammad Muhazzab Hussain (Admin: usefulwap.com), Muhammad Waqas, and Ghulam Abbas, so all credit goes to them. Note: this project is offered free of charge and for educational purposes only. It may not be bought or sold, so please use it only for educational purposes.
So, let's start:
First of all, we will look at the ATM database development life cycle.
The development life cycle is a necessary first step: it tells us how the system will work, how many steps are needed, what resources are required, and what information must be gathered before starting this project. So the first task is to survey people and gather information about what they actually need.
The Database System Development Lifecycle
As a database system is a fundamental component of the larger organization-wide information system, the database system development lifecycle is inherently associated with the lifecycle of the information system. The stages of the database system development lifecycle are shown in the following figure.
Ø Database planning
Ø System definition
Ø Requirements collection and analysis
Ø Database design
Ø DBMS selection (optional)
Ø Application design
Ø Prototyping (optional)
Ø Implementation
Ø Data conversion and loading
Ø Testing
Ø Operational maintenance
Ø Database Planning
The management activities that allow the stages of the database system development lifecycle to be realized as efficiently and effectively as possible.
Database planning must be integrated with the overall IS strategy of the organization.
There are three main issues involved in formulating an IS strategy, which are:
§ Identification of enterprise plans and goals with subsequent determination of information system needs;
§ Evaluation of current information systems to determine existing strengths and weaknesses;
§ Appraisal of IT opportunities that might yield competitive advantage.
Database planning should also include the development of standards that govern how data will be collected, how the format should be specified, what documentation will be needed, and how design and implementation should proceed. Standards can be very time consuming to develop and maintain, requiring resources to set them up initially and to continue maintaining them. However, a well-designed set of standards provides a basis for training staff and measuring quality control, and can ensure that work conforms to a pattern, irrespective of staff skills and experience. For example, specific rules may govern how data items can be named in the data dictionary, which in turn may prevent both redundancy and inconsistency. Any legal or enterprise requirements concerning the data should be documented, such as the stipulation that some types of data must be treated confidentially.
Ø System Definition
The system definition describes the scope and boundaries of the database application and the major user views.
Before attempting to design a database system, it is essential that we first identify the boundaries of the system that we are investigating and how it interfaces with other parts of the organization’s information system. It is important that we include within our system boundaries not only the current users and application areas, but also future users and applications. We present a diagram that represents the scope and boundaries of the ATM database system in the following figure. Included within the scope and boundary of the database system are the major user views that are to be supported by the database.
Ø Requirements Collection and Analysis
The process of collecting and analyzing information about the part of the organization that is to be supported by the database system and using this information to identify the requirements for the new system.
Requirements collection and analysis is a preliminary stage to database design. The amount of data gathered depends on the nature of the problem and the policies of the enterprise. Too much study too soon leads to paralysis by analysis. Too little thought can result in an unnecessary waste of both time and money due to working on the wrong solution to the wrong problem.
Another important activity associated with this stage is deciding how to deal with the situation where there is more than one user view for the database system. There are three main approaches to managing the requirements of a database system with multiple user views, namely:
§ The centralized approach
§ The view integration approach
§ A combination of both approaches
1) Build Conceptual Data Model
1.1) Entity Types:
1. Administrator
2. User
3. ATM Card
4. Account
5. Branch
6. Security
1.2) Relationship Types (for the Conceptual Model):
Initial-level ER diagram:
Next-level ER diagram:
Complete ER diagram:
1.3) Attributes of Entity Types:
1. Administrator:
Adminid (single value attribute)
Name (composite: fname, lname)
Phone (multi valued)
City (single valued)
EMAIL (composite attribute)
Branchno (single valued)
2. User:
Username (single valued)
Name (composite: fname, lname)
Phone (multi valued)
ATMno (single valued)
City (single valued)
EMAIL (composite attribute)
CINIC (single valued)
Accountno (single valued)
Accounttype (composite attribute)
Branchno (single valued)
Skey (composite attribute)
D_O_T/D (single valued)
3. Account:
Accountno (single valued)
Accounttype (composite attribute)
ATMno (single valued)
Branchno (single valued)
City (single valued)
Transaction (simple attribute)
Deposit (simple attribute)
Totalamount (simple attribute)
Skey (composite attribute)
4. ATM Card:
Username (single valued)
ATMno (single valued)
Branchno (single valued)
Accountno (single valued)
Accounttype (composite attribute)
Skey (composite attribute)
5. ATM Branch:
Username (single valued)
ATMno (single valued)
Branchno (single valued)
Accountno (single valued)
Accounttype (composite attribute)
Skey (single valued)
City (single valued)
6. Security:
Skey (single valued)
1.4) Attribute domains:
A. Administrator:
ID REAL
FNAME NVARCHAR(15)
LNAME NVARCHAR(15)
PASSCODE NVARCHAR(15)
PHONE NVARCHAR(30)
ATMNO REAL
BRANCHNO NVARCHAR(500)
DATEOFBIRTH DATE
EMAIL NVARCHAR
CITY NVARCHAR(500)
COUNTRY NVARCHAR(100)
REFERENCE NVARCHAR(100)
SKEY NVARCHAR(5)
B. Customer:
USERNAME NVARCHAR(200)
FNAME NVARCHAR(200)
LNAME NVARCHAR(200)
PASSCODE NVARCHAR(150)
PHONE NVARCHAR(30)
ATMNO REAL
ACCOUNTNO REAL
BRANCHNO NVARCHAR(500)
ACCOUNTTYPE NVARCHAR(500)
CCV NVARCHAR(100)
DATEOFBIRTH DATE
EMAIL NVARCHAR(500)
CITY NVARCHAR(500)
COUNTRY NVARCHAR(100)
SKEY NVARCHAR(50)
TOTALAMOUNT REAL
DEPOSITAMNT REAL
TRANSATAMNT REAL
C. Account:
TOTALAMOUNT REAL
DEPOSITAMNT REAL
TRANSATAMNT REAL
COUNTRY NVARCHAR(100)
BRANCHNO NVARCHAR(500)
ACCOUNTTYPE NVARCHAR(500)
ATMNO REAL
ACCOUNTNO REAL
D. ATM Card:
USERNAME NVARCHAR(200)
ATMNO REAL
ACCOUNTNO REAL
BRANCHNO NVARCHAR(500)
ACCOUNTTYPE NVARCHAR(500)
CITY NVARCHAR(500)
E. ATM Branch:
ACCOUNTNO REAL
BRANCHNO REAL
ACCOUNTTYPE NVARCHAR(500)
CITY NVARCHAR(500)
SKEY INT
F. Account_Ledger:
TID INT
PASSCODE NVARCHAR(200)
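As a rough sketch of how these attribute domains might map to SQL DDL (assuming a SQL Server-style dialect, since the domains above use NVARCHAR and REAL), the Administrator list could be written as follows; the EMAIL length is not specified above, so it is an assumption:

-- Sketch only: columns and types mirror the Administrator domain list above.
CREATE TABLE ADMINISTRATOR (
    ID          REAL,
    FNAME       NVARCHAR(15),
    LNAME       NVARCHAR(15),
    PASSCODE    NVARCHAR(15),
    PHONE       NVARCHAR(30),
    ATMNO       REAL,
    BRANCHNO    NVARCHAR(500),
    DATEOFBIRTH DATE,
    EMAIL       NVARCHAR(500)   -- length assumed; not specified in the list above
    -- remaining columns (CITY, COUNTRY, REFERENCE, SKEY) follow the same pattern;
    -- keys are added later, in the "Determine keys" step below
);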
----------------------------------------------------------
1.5) Determine keys:
Ø Primary keys:
· Administrator → ADMINID
· User → USERNAME
· Account → ACCOUNTNO
· ATM Card → ATMNO
· ATM Branch → BRANCHNO
· Security → none (no primary key identified)
Ø Foreign Keys
· USERNAME in ATM Card references User
· BRANCHNO in ATM Card references ATM Branch
· ACCOUNTNO in ATM Card references Account
· ATMNO in Administrator references ATM Card
· ATMNO in ATM Branch and Account references ATM Card
A rough SQL sketch of how these keys might be declared is shown below.
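As an illustration only, the keys listed above could eventually be declared as table constraints. The sketch below assumes a SQL Server-style dialect, that the tables already exist, and that the key columns have been declared NOT NULL; constraint names are invented for illustration.

-- Sketch only: declare the primary keys identified above, plus one of the
-- foreign keys (ATMNO in Administrator referencing ATM Card).
ALTER TABLE ADMINISTRATOR ADD CONSTRAINT PK_ADMINISTRATOR PRIMARY KEY (ADMINID);
ALTER TABLE USERS         ADD CONSTRAINT PK_USERS         PRIMARY KEY (USERNAME);
ALTER TABLE ACCOUNT       ADD CONSTRAINT PK_ACCOUNT       PRIMARY KEY (ACCOUNTNO);

ALTER TABLE ADMINISTRATOR
    ADD CONSTRAINT FK_ADMINISTRATOR_ATMCARD
    FOREIGN KEY (ATMNO) REFERENCES ATMCARD (ATMNO);  -- assumes ATMNO is the primary key of ATMCARD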
1.6) Entity descriptions and relationship multiplicity:
· Administrator:
An ATM management platform has more than one administrator to run the whole system.
· User:
A user can open many accounts.
· Account:
All of a user's records are stored in the user's own account.
· Branch:
Users and administrators play a role in every branch.
· ATM Card:
A user can have many ATM cards.
1.7) Check Model for Redundancy:
Objective
The objective of this step is to check for the presence of any redundancy in the model. There are three parts to this step:
Re-examine 1:1 relationships
In identifying entities, we may have identified two entities that represent the same object in the enterprise; 1:1 relationships between such entities should be re-examined to reduce redundancy.
Remove redundant relationships
A relationship is redundant if the same information can be obtained via other relationships. We are trying to develop a minimal data model and, as redundant relationships are unnecessary, they should be removed. It is relatively easy to identify whether there is more than one path between two entities. However, this does not necessarily imply that one of the relationships is redundant, as they may represent different associations between the entities.
Consider time dimension
The time dimension of relationships is important when assessing redundancy.
1.8) Validate conceptual model against user transactions
We now have a conceptual data model that represents the data requirements of the enterprise. The objective of this step is to check the model to ensure that the model supports the required transactions. Using the model, we attempt to perform the operations manually. If we can resolve all transactions in this way, we have checked that the conceptual data model supports the required transactions. However, if we are unable to perform a transaction manually there must be a problem with the data model, which must be resolved. In this case, it is likely that we have omitted an entity, a relationship, or an attribute from the data model.
Describing the transactions:
Using the first approach, we check that all the information (entities, relationships, and their attributes) required by each transaction is provided by the model, by documenting a description of each transaction’s requirements.
Using transaction pathways:
The second approach to validating the data model against the required transactions involves diagrammatically representing the pathway taken by each transaction directly on the ER diagram.
1.9) Review conceptual data model with user
The objective of this step is to review the conceptual data model with the users to ensure that they consider the model to be a ‘true’ representation of the data requirements of the enterprise.
The conceptual data model includes the ER diagram and the supporting documentation that describes the data model. If any anomalies are present in the data model, we must make the appropriate changes, which may require repeating the previous step(s). We repeat this process until the user is prepared to ‘sign off’ the model as being a ‘true’ representation of the part of the enterprise that we are modeling.
Logical Database Design
2) Build and Validate Logical Data Model
To translate the conceptual data model into a logical data model and then to validate this model to check that it is structurally correct and able to support the required transactions.
2.1) Derive relations for logical data model
To create relations for the logical data model to represent the entities, relationships, and attributes that have been identified.
There are nine structures from which relations are derived for the logical data model. These are as follows:
1) Strong entity type:
For each strong entity in the ATM Management Platform model, create a relation that includes all of the entity's simple attributes, including the primary key.
For example:
ATMCARD (ATMNO, CCV, ACCOUNTNO, TYPE, USERNAME)
PRIMARY KEY ATMNO
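As a minimal sketch (again assuming a SQL Server-style dialect, with column types taken from the attribute domains listed earlier), this relation could be created as follows:

-- Sketch only: a strong entity becomes its own relation with its primary key.
-- REAL follows the domain list above; an integer type would be more typical for a key.
CREATE TABLE ATMCARD (
    ATMNO     REAL          NOT NULL,
    CCV       NVARCHAR(100),
    ACCOUNTNO REAL,
    TYPE      NVARCHAR(500),
    USERNAME  NVARCHAR(200),
    CONSTRAINT PK_ATMCARD PRIMARY KEY (ATMNO)
);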
2) Weak entity type:
For each weak entity, create a relation that includes all of its simple attributes; its primary key is partially or fully derived from the owner entity.
For example:
ATMCARD (CCV, ACCOUNTNO, TYPE, USERNAME)
3) One-to-many (1:*) binary relationship type:
A user can have many ATM cards, so the relationship between USERS and ATMCARD is one-to-many. The primary key of the 'one' side (USERNAME) is posted into the 'many' side (ATMCARD) as a foreign key:
USERS (USERNAME, EMAIL, CITY)
PRIMARY KEY: USERNAME
ATMCARD (ATMNO, CCV, USERNAME)
PRIMARY KEY: ATMNO
ALTERNATE KEY: CCV
FOREIGN KEY: USERNAME references USERS
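A minimal sketch of the corresponding constraint, assuming both tables already exist and USERNAME is the primary key of USERS:

-- Sketch only: a 1:* relationship is represented by posting the parent's
-- primary key into the child relation as a foreign key.
ALTER TABLE ATMCARD
    ADD CONSTRAINT FK_ATMCARD_USERS
    FOREIGN KEY (USERNAME) REFERENCES USERS (USERNAME);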
4) One-to-one (1:1) binary relationship type:
The administrator assigns one username to each user, and the username is unique.
ADMINISTRATOR (ADMINID, ACCOUNTNO)
PRIMARY KEY: ADMINID
USERS (USERNAME, ADMINID)
PRIMARY KEY: USERNAME
FOREIGN KEY: ADMINID references ADMINISTRATOR
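A minimal sketch of one way to enforce the 1:1 cardinality (constraint names are assumptions):

-- Sketch only: a 1:1 relationship can be represented by a foreign key that is
-- also UNIQUE, so each administrator is linked to at most one user.
ALTER TABLE USERS
    ADD CONSTRAINT FK_USERS_ADMINISTRATOR
    FOREIGN KEY (ADMINID) REFERENCES ADMINISTRATOR (ADMINID);
ALTER TABLE USERS
    ADD CONSTRAINT UQ_USERS_ADMINID UNIQUE (ADMINID);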
5) One-to-one (1:1) recursive relationship type:
For a 1:1 recursive relationship, follow the rules for participation as described above for a 1:1 relationship. However, in this special case of a 1:1 relationship, the entity on both sides of the relationship is the same. For a 1:1 recursive relationship with mandatory participation on both sides, represent the recursive relationship as a single relation with two copies of the primary key. As before, one copy of the primary key represents a foreign key and should be renamed to indicate the relationship it represents.
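The ATM model itself does not define a recursive relationship, so the following is purely a hypothetical sketch: an invented SUPERVISOR_ADMINID column on ADMINISTRATOR (the renamed second copy of the primary key) referencing ADMINISTRATOR itself, shown here with optional participation.

-- Hypothetical sketch: SUPERVISOR_ADMINID is an invented column, used only to
-- illustrate a recursive relationship represented within a single relation.
ALTER TABLE ADMINISTRATOR ADD SUPERVISOR_ADMINID REAL NULL;

ALTER TABLE ADMINISTRATOR
    ADD CONSTRAINT FK_ADMINISTRATOR_SUPERVISOR
    FOREIGN KEY (SUPERVISOR_ADMINID) REFERENCES ADMINISTRATOR (ADMINID);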
6) Superclass/subclass relationship type:
For each superclass/subclass relationship in the conceptual data model, we identify the superclass entity as the parent entity and the subclass entity as the child entity. There are various options on how to represent such a relationship as one or more relations. The selection of the most appropriate option is dependent on a number of factors such as the disjointness and participation constraints on the superclass/subclass relationship.
For example:
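The documented ATM model does not spell out a superclass/subclass hierarchy, so the following is a hypothetical sketch only: a PERSON superclass with ADMINISTRATOR and CUSTOMER as subclasses, represented using the single-relation-with-discriminator option.

-- Hypothetical sketch: PERSON, PERSONID, and PERSON_TYPE are invented names.
-- Option shown: superclass and subclasses merged into one relation, with a
-- discriminator column recording which subclass each row belongs to.
CREATE TABLE PERSON (
    PERSONID    INT           NOT NULL PRIMARY KEY,
    FNAME       NVARCHAR(200),
    LNAME       NVARCHAR(200),
    PERSON_TYPE NVARCHAR(20)  NOT NULL
        CHECK (PERSON_TYPE IN ('ADMINISTRATOR', 'CUSTOMER')),  -- discriminator
    BRANCHNO    NVARCHAR(500),  -- administrator-specific; NULL for customers
    ACCOUNTNO   REAL            -- customer-specific; NULL for administrators
);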
7) Many-to-many (*:*) binary relationship type:
Many users can hold many accounts, so the relationship between USERS and ACCOUNT is many-to-many.
USERS (USERNAME, EMAIL, CITY)
PRIMARY KEY: USERNAME
ACCOUNT (ACCOUNTNO, CCV, USERNAME)
PRIMARY KEY: ACCOUNTNO
ALTERNATE KEY: CCV
FOREIGN KEY: USERNAME references USERS
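Strictly, a *:* relationship is usually represented by creating a new relation to hold the link, with foreign keys to both participants. The USER_ACCOUNT relation below is not part of the documented model; it is a hypothetical sketch of that approach.

-- Hypothetical sketch: USER_ACCOUNT is an invented link relation for the
-- many-to-many relationship between USERS and ACCOUNT.
CREATE TABLE USER_ACCOUNT (
    USERNAME  NVARCHAR(200) NOT NULL,
    ACCOUNTNO REAL          NOT NULL,
    CONSTRAINT PK_USER_ACCOUNT PRIMARY KEY (USERNAME, ACCOUNTNO),
    CONSTRAINT FK_UA_USERS   FOREIGN KEY (USERNAME)  REFERENCES USERS (USERNAME),
    CONSTRAINT FK_UA_ACCOUNT FOREIGN KEY (ACCOUNTNO) REFERENCES ACCOUNT (ACCOUNTNO)
);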
8) Complex relationship type:
A complex relationship is one in which a relation is related to more than one other relation. For example:
USERS (USERNAME, EMAIL, CITY)
ATMCARD (ATMNO, CCV, USERNAME)
BRANCH (BRANCHNO, ACCOUNTTYPE)
ACCOUNT (ACCOUNTNO, BRANCHNO, ATMNO, USERNAME, ACCOUNTTYPE)
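In this example the ACCOUNT relation carries the complex relationship by holding a foreign key to each of the other relations. A minimal sketch, assuming the referenced tables exist and the branch relation is named ATMBRANCH (constraint names are assumptions):

-- Sketch only: the relation representing the complex relationship takes a
-- foreign key from each participating relation.
ALTER TABLE ACCOUNT
    ADD CONSTRAINT FK_ACCOUNT_USERS   FOREIGN KEY (USERNAME) REFERENCES USERS (USERNAME);
ALTER TABLE ACCOUNT
    ADD CONSTRAINT FK_ACCOUNT_ATMCARD FOREIGN KEY (ATMNO)    REFERENCES ATMCARD (ATMNO);
ALTER TABLE ACCOUNT
    ADD CONSTRAINT FK_ACCOUNT_BRANCH  FOREIGN KEY (BRANCHNO) REFERENCES ATMBRANCH (BRANCHNO);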
9) Multivalued attributes:
An attribute that can hold more than one value is called a multivalued attribute; for example, a user may have more than one telephone number. Such an attribute is represented by creating a new relation that holds each value together with the owner entity's primary key.
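A minimal sketch of this for the PHONE attribute of USERS (the USER_PHONE relation is an invented name for illustration):

-- Hypothetical sketch: each phone number becomes a row in a separate relation
-- keyed by the owning user and the value itself.
CREATE TABLE USER_PHONE (
    USERNAME NVARCHAR(200) NOT NULL,
    PHONE    NVARCHAR(30)  NOT NULL,
    CONSTRAINT PK_USER_PHONE PRIMARY KEY (USERNAME, PHONE),
    CONSTRAINT FK_USER_PHONE_USERS FOREIGN KEY (USERNAME) REFERENCES USERS (USERNAME)
);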
2.2) Validate relations using normalization
The use of normalization requires that we first identify the functional dependencies that hold between the attributes in each relation, and then check that each relation is in at least third normal form (3NF).
2.3) Validate relations against user transactions
The objective of this step is to validate the logical data model to ensure that the model supports the required transactions, as detailed in the users’ requirements specification.
This type of check was carried out in Step 1.8 to ensure that the conceptual data model supported the required transactions. In this step, we check that the relations created in the previous step also support these transactions, and thereby ensure that no error has been introduced while creating relations.
2.4) Check integrity constraints
The objective of this step is to check that integrity constraints are represented in the logical data model.
Integrity constraints are the constraints that we wish to impose in order to protect the database from becoming incomplete, inaccurate, or inconsistent. Although DBMS controls for integrity constraints may or may not exist, this is not the question here. At this stage we are concerned only with high-level design, that is, specifying what integrity constraints are required, irrespective of how this might be achieved. A logical data model that includes all important integrity constraints is a ‘true’ representation of the data requirements for the enterprise. We consider the following types of integrity constraint (a brief SQL sketch follows the list):
· Required data;
· Attribute domain constraints;
· Multiplicity;
· Entity integrity;
· Referential integrity.
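As an illustration only, some of these constraints could eventually be expressed in SQL; the specific rules below (the NOT NULL column, the deposit CHECK, and the ON DELETE behaviour) are assumptions, not documented requirements.

-- Sketch only: example integrity constraints on the ATM relations.
ALTER TABLE ACCOUNT
    ALTER COLUMN ACCOUNTTYPE NVARCHAR(500) NOT NULL;             -- required data

ALTER TABLE ACCOUNT
    ADD CONSTRAINT CK_ACCOUNT_DEPOSIT CHECK (DEPOSITAMNT >= 0);  -- attribute domain constraint

ALTER TABLE ATMCARD
    ADD CONSTRAINT FK_ATMCARD_ACCOUNTNO                          -- referential integrity
    FOREIGN KEY (ACCOUNTNO) REFERENCES ACCOUNT (ACCOUNTNO)
    ON DELETE NO ACTION;                                         -- an account cannot be deleted while cards refer to it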
2.5) Review logical data model with user
The objective of this step is to review the logical data model with the users to ensure that they consider the model to be a true representation of the data requirements of the enterprise.
The logical data model should now be complete and fully documented. However, to confirm this is the case, users are requested to review the logical data model to ensure that they consider the model to be a true representation of the data requirements of the enterprise. If the users are dissatisfied with the model then some repetition of earlier steps in the methodology may be required.
2.6) Merge logical data models into global model (optional step)
2.7) Check for future growth
To determine whether there are any significant changes likely in the foreseeable future and to assess whether the logical data model can accommodate these changes.
It is important to develop a model that is extensible and has the ability to evolve to support new requirements with minimal effect on existing users.
Of course, this may be very difficult to achieve, as the enterprise may not know what it wants to do in the future. Even if it does, it may be prohibitively expensive both in time and money to accommodate possible future enhancements now. Therefore, it may be necessary to be selective in what is accommodated. Consequently, it is worth examining the model to check its ability to be extended with minimal impact. However, it is not necessary to incorporate any changes into the data model unless requested by the user.
At the end of Step 2, the logical data model is used as the source of information for physical database design, which is covered by Steps 3 to 8 of the methodology.