Potential sales of the department store include materials such as cement, paint, nails, wire mesh, floor tiles, gates, and door locks. Each department store transaction records the payment method, the type of commodity, the number of items purchased, and the calculated total cost.
Retail stores use the database to retain customers in several ways: by ensuring every commodity customers want is in stock, by keeping goods and services up to date, and by involving customers in the process of creating the system so that it captures their requirements. The database allows concurrent access to any commodity a customer wants, without restriction, while also providing a locking mechanism that a customer can apply to protect his goods. The database should be transparent and simple to use, that is, user friendly, and any problem that occurs should be resolved without delay.
Retail stores should make sure the prices of products in the database are in line with what customers expect. When developing the database, they should consider customer needs and financial ability so that they retain existing customers and encourage others to join (Haley, 2006). At the same time, they should gather information about customer preferences, transaction maintenance, and problem resolution. During development they should also evaluate and cross-sell their products to find the best options, ensuring that the data in the database matches the information customers actually need.
Retail stores ensure the database can hold big data, which, when evaluated in combination with core enterprise data, gives the enterprise a more accurate and intuitive understanding of its business. This can lead to enhanced productivity, a stronger competitive position, and greater innovation, all of which can have an important impact on the bottom line. The retail store's database must therefore be able both to acquire big data and to analyze it.
The retail store's database should hold three kinds of customer records. Transactions: a complete history of the purchases made by the customer, including the price paid, the purchase date, and whether the item was bought in response to a special offer or marketing campaign. Customer contacts: a record of the interactions the customer has had with the retailer, including visits to the retailer's web site, calls made to the retailer's call center, inquiries made through store kiosks, and contacts initiated by the retailer, such as catalogs and mail sent to the customer. Customer preferences: what the customer likes, such as favorite brands, colors, and apparel sizes (Vadaparty, 1992).
There are two main approaches to sales forecasting: quantitative and qualitative (or judgmental). Companies frequently use both methods at the same time. Put simply, quantitative means estimating a particular, measurable amount of something; quantitative techniques rely mainly on numbers to produce forecasts.
Sales history is one of the most important tools in sales forecasting, and it is also the basis for inventory forecasting. Analyzing it draws on knowledge of the market, products, industry, and customers. Various forecasting techniques build on sales history, such as open-model time-series techniques, which examine historical sales data for patterns to use in prediction. Another technique is exponential smoothing, which compares the previous forecast with the actual result to obtain the forecast error, then uses that error to adjust current and future predictions.
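The exponential-smoothing idea just described can be sketched in a few lines of Python; the sales figures and the smoothing factor `alpha` below are made-up illustrations, not data from the store.

```python
def exponential_smoothing(sales, alpha=0.3):
    """Each new forecast corrects the previous one by a fraction
    (alpha) of the observed forecast error."""
    forecast = sales[0]          # seed the first forecast with the first actual
    forecasts = [forecast]
    for actual in sales[:-1]:
        error = actual - forecast        # how wrong the last forecast was
        forecast = forecast + alpha * error
        forecasts.append(forecast)
    return forecasts

monthly_sales = [120, 130, 125, 140, 150]    # hypothetical unit sales
print(exponential_smoothing(monthly_sales, alpha=0.5))
```

A larger `alpha` reacts faster to recent sales; a smaller one smooths out noise more aggressively.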
A trend is also an important forecasting tool when planning and preparing the departmental database: it helps ensure that enough inventory is ordered and enough shipping staff is on hand for periods of high sales. Microsoft Excel is a spreadsheet program that enables users to organize sales history data for forecasting. Customer lists, product lists, sales personnel lists, and sales history by year can be organized in Excel to display sales information and to support recruitment and delivery planning. Excel has many features useful for sales forecasting, such as averaging tools and graphing.
The calculations involved in sales forecasting are actually very simple. The difficult part is maintaining the comprehensive, correct financial records needed to make them. The most important information for calculating a sales forecast includes the sales amount for every product broken down by month of the year, and external factors affecting sales, such as the economic forecast, increased competition, employee contract negotiations, and price changes in raw materials.
The ordinary process for calculating a sales forecast with no existing sales history is to base your forecast on the performance of comparable businesses that sell similar products. The easiest sales forecasting technique is an annual sales forecast. Assuming your sales are reasonably stable, with no major changes in your competition or your customer base from year to year, you only have to account for inflation: last year's annual sales plus last year's annual sales multiplied by the rate of inflation equals next year's sales forecast.
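As a worked version of that formula (the sales figure and inflation rate here are invented for illustration):

```python
def annual_forecast(last_year_sales, inflation_rate):
    """Next year's forecast = last year's sales plus an inflation adjustment."""
    return last_year_sales + last_year_sales * inflation_rate

# e.g. $250,000 in sales last year with 3% inflation
print(annual_forecast(250_000, 0.03))   # 257500.0
```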
For many enterprises, sales vary over the year. Because of this, the next step is to break the sales forecast down month by month. First, evaluate the past few years of sales figures to calculate what percentage of the year's total sales is made in each month. When creating sales forecasts, several other factors may need to enter the calculation: sales contracts that won't be renewed, industry analysts' predictions of growth or decline in your market segment, new sales contracts on the horizon, economic analysts' predictions of increased or diminished buying power among consumers in your market, and political changes that could influence government contracts (Michael, 1979).
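The month-by-month breakdown can be sketched as two small steps: derive each month's historical share of annual sales, then spread the annual forecast across those shares. The history figures below are hypothetical.

```python
def monthly_shares(monthly_history):
    """Fraction of the year's total sales made in each month."""
    total = sum(monthly_history)
    return [m / total for m in monthly_history]

def monthly_forecast(annual_total, shares):
    """Spread an annual forecast across months using historical shares."""
    return [annual_total * s for s in shares]

history = [200, 250, 300, 250]        # made-up sales for four periods
shares = monthly_shares(history)      # e.g. 200/1000 = 20% of the year
print(monthly_forecast(1100, shares)) # next year's forecast, per period
```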
In situations where a business has no proven sales, sales forecasts are important for attracting investors. It is essential to base your forecast on businesses that sell to the same customer demographic in the same geographic area. For retail sales, you need to know the average monthly or annual sales volume per square foot of retail space; you can then adjust for the relative size of your store.
Inventory forecasting, in my view, is a proactive strategy aimed at holding the expected stock level to meet demand at a particular point in time. Being proactive here means taking a step ahead of a known event.
Broadly, three techniques can be adopted when conducting inventory forecasting: intuitive, extrinsic, and intrinsic. The intuitive technique is based on subjective judgment and current opinion rather than historical data. The intrinsic technique is history-based and works on the principle that a prior event may recur. The extrinsic approach relies on activity in another area and assumes proportionality; for instance, the sale of exercise books is proportional to the number of students.
By and large, in using forecasting as a tool for optimizing business processes through objective decision making, it is important to allow for a sensible estimation error. Forecasting for a group of products is encouraged because it is usually more accurate than forecasting for a single product (Piasecki, 2009). The forecast horizon should be realistically short, say a week, because the further out the timeline, the larger the forecasting errors become. However, long-period forecasts can be accepted if the underlying metrics are relatively reliable.
Security is one of the most important concerns for a business of any form. Whether the enterprise is a multi-million-dollar online venture or a small shop, security should be implemented. A business's vulnerability to security flaws is always tempting to actors with malicious objectives. Proper implementation of security countermeasures is highly recommended for cloud computing: the simple fact that the application is accessed over the internet makes it susceptible to attack at any time.
Cloud computing providers should protect their users. Developers should ensure that data associated with a user cannot be changed by, or extracted by, anyone but the authorized person. There are two methods to ensure cloud computing security: certifications and restrictive user access. Restrictive access is a logical security measure, ranging from a simple username-and-password challenge to complicated login forms. However, cloud applications should not rely on these challenges alone: IP-specific application rules and user timeouts are some of the additional countermeasures that should be implemented. The major drawback of restrictive user access is the work of limiting each user's access rights: every user has to be assigned security permissions manually to restrict access to the various files.
Certifications are also significant for user assurance. Software developers have to open their applications to security specialists who offer security certifications. This is one way of guaranteeing users that the application has been fully tested against different kinds of attacks. It is frequently a dilemma for cloud computing, since external security checks might expose the organization's secrets; but this has to be sacrificed to enhance the security of users.
Apart from protecting users against different kinds of attacks, the data itself should be protected. In this respect, the software and hardware underlying cloud computing should be analyzed explicitly; once more, certification is highly preferred in this part of cloud computing.
The hardware for cloud computing, on the other hand, requires a different type of security deliberation. The location of the data center should be chosen not only for its closeness to controllers and intended users but also for its protection from external factors: the data center should be safeguarded against severe weather conditions and physical attacks that might destroy it.
With regard to the hardware in relation to the application, some manual controls have to be present for increased security, among them a manual shutdown to prevent further access to the information. Even though data can be controlled with another application, that data can still be penetrated unless the application is turned off immediately.
Cloud computing security should not focus only on prevention; ample resources should also be devoted to recovery in case the unlucky event actually strikes. Even before a catastrophe happens, plans have to be in place to make sure that everyone works in unity towards recovery. The strategy should not address software attacks alone: external disasters such as severe weather should have their own recovery strategies (Rittinghouse & Ransome, 2010).
A threat is anything that has the potential to cause loss or harm. Threats can create difficulties when managing any department store: a vulnerability in the operating system, for example, may open the department store's database to attack, which may in turn cause loss or damage of data.
A transaction is an atomic series of steps. The store database maintains the store's sales and inventory data, so several database security threats apply to it. Database security must cover access control, vulnerability management, application access, inference, and auditing mechanisms. The most important method of protecting data is restricting access to it, through authentication, authorization, and access control. These three mechanisms are distinct but usually used in combination, with a focus on access control for granularity in assigning privileges to specific objects and users. For example, most database systems use some form of authentication, such as a username and password, to restrict access to the system.
Access control is an important notion in security: it restricts actions on objects to particular users. In store database security, objects include data objects such as tables and columns as well as SQL objects such as views and stored procedures. Controlling access to database tables or columns is often required and can be achieved by granting rights only on those objects. Excessive privilege use occurs when users or applications are given database access privileges that exceed the requirements of their job function; these privileges may be abused for malicious purposes. For example, a college administrator whose job requires only the ability to modify student identification information might take advantage of excessive database update privileges to change grades.
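One way to see column-level access control in action is sqlite3's authorizer hook, which can veto individual operations at statement-preparation time. This is only a sketch of the grade-change scenario above; the table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (id INTEGER, name TEXT, grade TEXT)")
conn.execute("INSERT INTO students VALUES (1, 'Ann', 'B')")

def read_only_grades(action, arg1, arg2, dbname, source):
    # Deny any UPDATE touching the grade column; allow everything else.
    if action == sqlite3.SQLITE_UPDATE and arg2 == "grade":
        return sqlite3.SQLITE_DENY
    return sqlite3.SQLITE_OK

conn.set_authorizer(read_only_grades)

conn.execute("UPDATE students SET name = 'Anne' WHERE id = 1")   # allowed
try:
    conn.execute("UPDATE students SET grade = 'A' WHERE id = 1") # blocked
except sqlite3.DatabaseError as e:
    print("blocked:", e)
```

In a production DBMS the same effect is usually achieved with GRANT/REVOKE statements rather than application hooks.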
A database user often ends up with excessive privileges for the simple reason that database administrators do not have time to define and update fine-grained access privilege controls for each user. As a result, all users, or large groups of users, are granted generic default access privileges that far exceed their specific job requirements.
Users may also misuse legitimate database privileges for unauthorized purposes. Consider a hypothetical rogue healthcare employee with rights to view individual patient records through a custom web application. The web application normally restricts users from browsing patients' healthcare histories: many records cannot be viewed at once, and electronic copies are not permitted. Nonetheless, the rogue employee may get around these limitations by connecting to the database with another client, such as Microsoft Excel. Using Excel and his genuine login credentials, the worker may retrieve and save all patient details (Shih, 1991).
Attackers may take advantage of database platform software weaknesses to elevate their access privileges from those of a normal user to those of an administrator. Weaknesses may be found in built-in functions, stored procedures, SQL statements, and even protocol implementations. For instance, a software developer at a financial organization might exploit a vulnerable function to gain database administrative rights. With those rights, the developer could turn off auditing mechanisms, transfer funds, create fake accounts, and so on.
Vulnerabilities in the platform and in additional services installed on a database server may result in data corruption, unauthorized access, or denial of service. The Blaster worm, for instance, took advantage of a Windows vulnerability to create denial-of-service conditions. Another form of threat is the SQL injection attack, in which a perpetrator injects unauthorized database statements into a vulnerable SQL data channel. Typically, the targeted data channels include stored procedures and web application input parameters. The injected statements are passed to the database, where they are executed; using SQL injection, intruders may gain unrestricted access to the whole database.
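The injection mechanism, and the standard defense of bound parameters, can be shown with a small sqlite3 sketch; the table and payload are illustrative only.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", 0), ("bob", 1)])

user_input = "alice' OR '1'='1"   # a classic injection payload

# Vulnerable: the input is pasted straight into the SQL text, so the
# OR clause becomes part of the query and matches every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE name = '%s'" % user_input).fetchall()

# Safe: the input is passed as a bound parameter, so the whole payload
# is treated as a literal name and matches nothing.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)).fetchall()

print(vulnerable)   # every user leaks
print(safe)         # []
```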
A weak audit trail is another source of vulnerability for the department store database. Electronic recording of all sensitive or unusual database transactions should be part of the foundation of any database deployment. A weak database audit policy represents a serious organizational risk on several levels, including regulatory risk and loss of deterrence. Audit mechanisms aid in the detection of violations and in the recovery of the department store's data, and audit records can tie a violation to a particular user.
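A minimal audit trail can be built with a trigger that copies every change to sensitive data into an append-only log table. The schema below is a made-up sketch, not the store's actual design.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE inventory (item TEXT, qty INTEGER);
CREATE TABLE audit_log (item TEXT, old_qty INTEGER, new_qty INTEGER,
                        changed_at TEXT DEFAULT CURRENT_TIMESTAMP);
-- The trigger records the before/after values of every update.
CREATE TRIGGER log_inventory_update AFTER UPDATE ON inventory
BEGIN
    INSERT INTO audit_log (item, old_qty, new_qty)
    VALUES (OLD.item, OLD.qty, NEW.qty);
END;
""")
conn.execute("INSERT INTO inventory VALUES ('paint', 12)")
conn.execute("UPDATE inventory SET qty = 9 WHERE item = 'paint'")
print(conn.execute("SELECT item, old_qty, new_qty FROM audit_log").fetchall())
```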
A distributed DBMS is a database whose storage devices are not all attached to a single computer's CPU. The data may be stored on computers located at the same physical location, or dispersed over a network of interconnected computers.
Such a DBMS manages the data as if it were stored in one place, on one computer. It synchronizes the data periodically and, while ensuring that multiple users can access the data at the same time, guarantees that every update and delete is automatically reflected in the data stored elsewhere.
Administrators and users should be able to interact with the distributed system as if it were centralized in one place. This form of transparency removes special programming requirements, because the required functionality, access to any number of remote and local tables across the network, is provided by the system. There are several kinds of DBMS transparency; data distribution transparency in particular means that no user of the database needs to know how the data has been fragmented, where the data they are accessing is located, or whether multiple copies of the data exist. The DBMS also guarantees that concurrent transactions do not interfere with each other, and it ensures data recovery.
There are different ways of ensuring database optimization, and one should know when to optimize. Load speed should be the first consideration, because a very slow database hurts the user's experience; a database that is busy or not fast enough can also slow down the rest of the server. The key areas for optimization are performance profiling, close examination of query execution plans, and common optimizations such as eliminating looping queries and selecting only the columns that are needed.
The lost update problem is also referred to as the multiple update problem. In a lost update, data written by one transaction is overwritten by another transaction. The problem can also arise when two transactions accessing the same database have their operations interleaved in a way that leaves database items incorrect (Son & Kouloumbis, 1991). As an example, suppose two connections, A and B, both read a record and then update it. Both connections read the old value; connection A updates the row, and shortly afterwards connection B updates the same row again. The update made by connection A is lost because it is overwritten by B's update.
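The interleaving can be replayed deterministically in a few lines; the balance values are illustrative.

```python
# Both connections read the old balance before either writes,
# so A's update is silently overwritten by B's.
balance = 100            # shared row value

a_read = balance         # connection A reads 100
b_read = balance         # connection B reads 100 (A hasn't written yet)

balance = a_read + 50    # A writes 150
balance = b_read + 20    # B writes 120 -- A's +50 is lost

print(balance)           # 120, not the expected 170
```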
There are several solutions: disallowing shared locks on data that is being updated, making effective use of batch processing, implementing a locking system, and finally using two-phase locking.
Reading uncommitted data is also known as a dirty read. This problem occurs when data written by one transaction is read by another transaction before the first transaction commits: one transaction is allowed to see the intermediate results of another transaction before that transaction has committed.
For example, take two connections, A and B, where connection A reads a row already updated by connection B. Connection B then rolls back its transaction, so the uncommitted data read by connection A creates an uncommitted dependency. The problem is that connection A was permitted to read the intermediate results of connection B before connection B terminated.
The resolution to this problem is to prevent connection A from reading the results in any way until connection B has terminated or its rollback has been applied.
Concurrency control in a database is the coordination of concurrent accesses in a multiuser database system. It permits users to access the data in an orderly fashion while maintaining the illusion that each user is executing alone on a dedicated system. The main technical difficulty in achieving this goal is preventing updates made by one user from interfering with reads and updates made by another.
Two reasons make the problem harder in a distributed DBMS: first, users can access data stored on different computers in the distributed system; second, the concurrency control mechanism on one computer is not aware of interactions taking place on the other computers.
There are a number of ways of controlling concurrency in a database. These include the following.
Optimistic: checking of the integrity rules is delayed until the end of the transaction, taking care that no blocking occurs during read and write operations.
The second method is pessimistic: it blocks a transaction's operations as soon as they might violate the rules. Using this method reduces system performance.
Finally, there is the semi-optimistic method, in which some operations may be blocked, as in the pessimistic approach, while others are left running; that is, not everything is blocked, depending on the situation. Each of these categories therefore provides different system performance, which depends on several factors: the types of transactions, the parallelism implemented, and how the systems are connected. Parallelism here means that a parallel program runs on multiple processors, with the hope that it will run faster than it would on a single CPU.
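A common way to realize the optimistic approach is to attach a version number to each row and let a write succeed only if the version is unchanged since it was read. The record layout below is a hypothetical in-memory sketch, not a real DBMS API.

```python
record = {"value": 10, "version": 1}

def optimistic_update(record, read_version, new_value):
    """Commit the new value only if no one else wrote in between."""
    if record["version"] != read_version:
        return False                 # conflict: caller must retry or roll back
    record["value"] = new_value
    record["version"] += 1
    return True

v = record["version"]                        # two transactions read version 1
ok_first = optimistic_update(record, v, 20)  # first commit succeeds
ok_second = optimistic_update(record, v, 30) # second sees a stale version, rejected
print(ok_first, ok_second, record)
```

Note that neither transaction ever blocks; the loser simply detects the conflict at commit time.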
As for transaction types as a concurrency factor, a rollback or commit command completes the transaction. If a transaction is committed, all of its changes become permanent; if it is rolled back, all the changes it made are lost, and the database reverses to the last successfully committed transaction. During a transaction, the database engine therefore tracks the changes made since the transaction started; if conflicting changes are detected, the transaction is rolled back.
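The commit/rollback behavior described above can be demonstrated with sqlite3; the account table and the simulated failure are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES (1, 100)")
conn.commit()                # everything up to here is permanent

try:
    conn.execute("UPDATE accounts SET balance = balance - 999")
    raise RuntimeError("simulated failure mid-transaction")
except RuntimeError:
    conn.rollback()          # undo everything since the last commit

balance = conn.execute("SELECT balance FROM accounts").fetchone()[0]
print(balance)   # 100 -- the database reversed to the last committed state
```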
Restoration and backup of data are in most cases based on the physical files that make up the database. The two methods I would suggest are automatic backup storage management and a backup script. In automatic backup storage management, a backup retention policy is maintained, which gives a redundant level of protection for the database: the database automatically manages the backups and logs in the recovery area and deletes any that are obsolete.
The other option is a backup script, which performs online backups of the database. Online backups are taken while the database itself is running; offline backups are taken when the database is only mounted. In offline mode, the scripts first put the database into the proper state.
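An online backup script can be sketched with sqlite3's backup API, which copies the database page by page while the source stays live. The data and backup path here are examples only.

```python
import os
import sqlite3
import tempfile

# Source database with some sample data (stays open and usable throughout).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE sales (item TEXT, qty INTEGER)")
src.execute("INSERT INTO sales VALUES ('cement', 40)")
src.commit()

backup_path = os.path.join(tempfile.gettempdir(), "store_backup.db")
dst = sqlite3.connect(backup_path)
src.backup(dst)              # online copy of the whole database
dst.close()

# Restore: simply open the backup file as a database.
restored = sqlite3.connect(backup_path)
print(restored.execute("SELECT item, qty FROM sales").fetchall())
```

A retention policy, as in the automatic approach above, would then rotate or delete old copies of `store_backup.db`.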
There are a number of database threats. These include access to the database by unauthorized persons, brute-force and firewall attacks, inappropriate usage, unassessed vulnerabilities in the database, and weak encryption on stolen laptops. The solutions include setting strict policies on access to the database, and carrying out monitoring and hardware procedures on a regular basis, which will especially help with personal hardware.