
Saturday, April 9, 2011

Information Technology Team Ready for the Future

Is Your Information Technology Team Ready for the Future?

Leadership & Talent
A Shifting Landscape Impacts Critical Leadership Competencies
Succession planning is a fundamental activity within every corporate function. For Information Technology (IT), it is especially important as technology’s role as a business enabler continues to grow. Constantine Alexandrakis and the Information Officers Practice present the firm’s findings on the shifting landscape of IT organizations and the critical competencies for key IT roles.

The Shifting CIO Role

Much has been written recently about the evolution of the Fortune 500 chief information officer (CIO) from technology caretaker to business strategist. As the impact of technology has grown across global enterprises, IT has become critical to every corporate function, and IT effectiveness has become a competitive weapon.
From supply chain to marketing and sales, the CIO is now expected to understand the strategic levers in each function, and to provide detailed insight into and oversight of the functions’ use of data and technology in ways that bolster the corporation’s bottom line.

CHAPTER 15: YOUR FUTURE AND INFORMATION TECHNOLOGY

Competencies
  • Individual strategy
  • Technology changing competition
  • React to new technology
  • Computer competence
  • Job definitions

 Changing Times
  1. Successful individuals have a strategy
  2. Technology changes responsibilities
  3. Some jobs are made obsolete, but new ones are created
  4. Successful individuals are best at changing

 Technology and Organizations
  1. New enterprises
  2. Internet service providers
  3. Webmasters
  4. New customer and supplier relationships
Technology and People
  1. Different coping styles with technology
  2. Cynicism
  3. Computer use is overrated
  4. Naiveté
  5. Magic boxes
  6. Frustration
  7. Imposition to learn something new
  8. Proactivity
  9. Acting in anticipation
  

Information Technology, CAD/CAM and Your Future

Article by Dave Grubb
A few months ago I had an opportunity to attend a “sales seminar” for one of the more popular CAD/CAM programs in the cabinet/store fixture/millwork industries. The participants in this particular group were predominantly small-shop cabinetmakers. I have to admit to being a little surprised by the limited level of automation currently employed by some of the participants. However, the fact that they were in attendance indicates they are interested in implementing technology and automation in their operations.
 
My surprise at the lack of automation being employed by many in attendance forced me out of my normal arena to look more closely at these smaller shops. My research found a huge range of productivity and apparent automation across the industry. Annual sales per employee for kitchen cabinet manufacturers range from less than $40,000 to over $300,000 for larger companies with very high levels of automation and CAD/CAM integration. It should come as no surprise that the shops with lower productivity have very high labor content in their products; some indicate more than one-third of their costs are labor. Labor is actually the largest single component of their total production cost. On the other end of the productivity spectrum, the large companies are producing cabinets with labor content on the order of 10 percent; in those companies material is the largest single component of cost.
 
Clearly there is a huge range of productivity and production costs across the cabinet industry. I believe it is safe to say a similar range applies to virtually all areas of the wood products industry. 
 
Looking at selling prices, which the market ultimately sets, shows there is simply a practical ceiling to the price most producers can demand for their product. Obviously, as that price ceiling is reached, higher production costs ultimately limit the potential profit margin. As you move down the pricing scale, high production costs simply become more burdensome and the margins disappear more quickly. So, if you happen to be in the 33-percent-plus labor content group, you might well be an endangered species. To remain in business over the long run, production costs must shrink at the very least to the level of your competition.
 
Most manufacturers have limited potential to substantially reduce material costs. Generally, the cost component that can most readily be reduced is labor. A key element in accomplishing that is effective automation and efficient handling of information (data), beginning at the order entry stage. A commonly cited rule of thumb is that a shop using even minimal CAD/CAM technology achieves roughly twice the output per employee of a shop not using those tools. As the degree of CAD/CAM and electronic data management increases, the output per employee continues to increase.
 
Introducing automation and CAD/CAM technology is an intimidating step for most, especially the first step. There are some basic elements that should be kept in mind to leverage the greatest benefit from this investment. Prior to committing to any software or hardware purchase, develop an overall strategic plan. I cannot overstate the importance of this step. Too often the first steps into CAD/CAM and electronic data management are taken without a strategic "enterprise" plan, and the painful results are often costly and time consuming. For those of you already on this path without a long term strategic plan: please take the time to create one; it will yield great benefits going forward. If you need help in developing the plan (and most do), get it, because a poorly developed plan, or worse, none at all, is a formula for costly failure. The most common result is islands of data and automation that are unable to communicate and cooperate with each other. This is as true for a first implementation as for an expansion project. At what point(s) you begin implementing the plan is less important than having a clear understanding of how all the components cooperatively support the overall plan.
 
Base your plan on clearly defined goals and incremental milestones toward accomplishing the overall plan. Part of the good news is that so many have already traveled this road that it is no longer "pioneering"; it is far more like going to grandmother's house on the interstate, and just like going to grandmother's house, you do it one mile at a time.
 
One goal that should always be included in your plan is to minimize manual data entry and manipulation. The "golden ring" is a system that, based on the information entered at the point of order entry, requires no additional data entry. All the required functions, including inventory management, material requisitions and purchase order generation, shop orders, machine programs, shipping documents and invoicing, can be driven seamlessly from the original order information entered in the system only once.
 
Keep an eye to the future when developing your plan. At your current volume, you may not see a benefit to applying labels and tracking individual parts through your shop—and you may well be correct. However, looking out 5 years, you may be at a volume that makes part tracking highly beneficial. It will be far easier and less costly to add that feature if the original system architecture is designed to accommodate the feature.
If you are producing standard cabinets and do not have drawings in CAD format, there is software available that does not require that CAD drawings be created before you introduce CNC machines and CAM. These programs generate all the information necessary (including drawings) from a rules-based configurator. Beyond the ability to generate the detailed bill of materials and a drawing of the cabinet, the entire program is parametric, so if the order calls for a 24 ½-in.-wide base cabinet instead of a standard 24-in.-wide cabinet, all that needs to be done is for the width to be entered and all the impacted parts are adjusted for the new width. Generally, these programs are available with vast "libraries" of standard products, which greatly simplifies the implementation.
Independent of how much CAM integration you have, these software tools have tremendous value. Let's assume you currently utilize a single CNC machining center. What value is electronic data management for you at a sliding table saw? The value lies in the accuracy of the information contained in the automatically generated cut list: it is correct, so there is no figuring or remembering the dimensional changes required for a 24 ½-in.-wide cabinet. Parts aren't cut wrong (unless the saw is set up wrong), material is not wasted and labor is not lost.
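As a purely illustrative sketch (not any particular vendor's software), the parametric idea can be boiled down to a few lines of C: the cabinet width is entered once and every affected part in the cut list is recalculated from simple rules. The part names, the 3/4-in. panel thickness and the construction rules below are invented for the example.

#include <stdio.h>

/* Toy rules-based "configurator" for a base cabinet cut list.
   Dimensions are in inches; the rules are invented for illustration. */
#define PANEL_THICKNESS 0.75   /* assumed 3/4-in. panel stock */

struct Part {
    const char *name;
    double width;
    double length;
};

/* Generate the parts whose sizes depend on the cabinet dimensions. */
static void cut_list(double cab_width, double cab_depth, double cab_height)
{
    struct Part parts[] = {
        { "Side panel", cab_depth,                       cab_height },
        { "Side panel", cab_depth,                       cab_height },
        { "Bottom",     cab_width - 2 * PANEL_THICKNESS, cab_depth  },
        { "Back",       cab_width,                       cab_height },
        { "Stretcher",  cab_width - 2 * PANEL_THICKNESS, 4.0        },
    };
    size_t i;

    printf("Cut list for a %.2f-in.-wide base cabinet:\n", cab_width);
    for (i = 0; i < sizeof parts / sizeof parts[0]; i++)
        printf("  %-12s %6.2f x %6.2f\n",
               parts[i].name, parts[i].width, parts[i].length);
}

int main(void)
{
    cut_list(24.0, 24.0, 34.5);   /* standard 24-in. base cabinet */
    cut_list(24.5, 24.0, 34.5);   /* the 24 1/2-in. special: only the width changes */
    return 0;
}

Change the one entered width and every dependent part dimension follows; a real package layers libraries of standard products, joinery rules and machine programs on top of the same principle.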
 
As you evaluate the interface between your information system and your machines, don’t lose sight of the fact that your machines are both expendable — and expandable. Today you may have only one or two machines, and those machines are from the same manufacturer and have the same configuration. That presents one level of complexity for a CAD/CAM system — but is not likely to be long term. Over time, you will add machines and they most likely will not match the existing machines; that represents a higher level of complexity. It is best to maintain the CAD/CAM programs on a central server and not utilize machine specific programming software. This allows better maintenance of data. If you maintain part machining programs at the individual machines and you make a change to those programs, you have to be certain to change those programs in every location. Maintaining them on a central server requires the change only being made once, thus ensuring better data integrity. 
 
This article obviously cannot be a "How To Booklet" for information technology and CAD/CAM, but I would hope that you go away with two key thoughts:
  1. Navigating the maze of CAD/CAM tools and electronic data management should only be undertaken after developing a long term strategic plan for the needs of the entire business — and you will be well served to seek advice in doing this.
  2.  Failing to embrace these tools for reducing your labor costs will ensure long term failure for all those not in the most secure “artistic niche."
And one last thought: reducing your labor content does not mean you will have to fire your brother-in-law — most likely your increased sales, improved quality, reduced lead times and lower production costs will force you to hire your cousin!
 



Friday, April 8, 2011

CHAPTER 14: PROGRAMMING & LANGUAGES

The Difference Between Do While And Do Until




The difference between "do while" and "do until" is that a "do while" loops while the test case is true, whereas "do until" loops UNTIL the test case is true (which is equivalent to looping while the test case is false).
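C has no built-in "do until" construct, but the equivalence is easy to see in a short sketch: negating the test in a do ...while loop gives "until" behavior. This is only a minimal illustration.

#include <stdio.h>

int main(void)
{
    int n = 0;

    /* "do while": repeat while the condition (n < 3) is true */
    do {
        printf("while form: n = %d\n", n);
        n++;
    } while (n < 3);

    /* "do until": repeat until (n >= 3) is true, i.e. while it is false.
       C has no until keyword, so the test is simply negated. */
    n = 0;
    do {
        printf("until form: n = %d\n", n);
        n++;
    } while (!(n >= 3));

    return 0;
}

Both loops print the same three lines; the only difference is how the test is phrased.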

The difference between a "do ...while" loop and a "while" loop is that the while loop tests its condition before the contents of the loop are executed; the "do" loop tests its condition after its body has been executed at least once. If the test condition is false when the while loop is entered, the block of code is never executed. Since the condition is tested at the bottom of a do loop, its block of code is always executed at least once.
To make this clearer, consider the syntax and description of the two loop types:
while
The while loop is used to execute a block of code as long as some condition is true. If the condition is false from the start, the block of code is not executed at all. Because the while loop tests the condition before executing the body, the loop may never run if the condition is not met initially. Its syntax is as follows.
while (tested condition is satisfied)
{
block of code
}
In all such constructs, curly braces are required only when more than one statement is to be executed; for a single statement they are optional (the same rule applies to if...else constructs), although using them anyway can make the program more readable. The example below has two statements inside the loop, so the braces are needed there.
Here is a simple example of the use of the while loop. This program counts from 1 to 100.

#include <stdio.h>

int main(void)
{
    int count = 1;

    while (count <= 100)
    {
        printf("%d\n", count);
        count += 1;   /* without this increment the loop would never end */
    }
    return 0;
}
Note that no semicolon ( ; ) should be placed after the while (condition) line. These loops are very useful because the condition is tested before execution begins. However, I have never particularly liked them, as they are not as clear to read as do ...while loops. The while loop is the favorite amongst most programmers, but as for me, I definitely prefer the do ...while loop.
do ....while
The do loop also executes a block of code as long as a condition is satisfied.
Again, the difference is that the while loop tests its condition before the body runs, while the do loop tests it after the body has run, so a do loop's body always executes at least once.
Some people don't like these loops precisely because the body always executes at least once. When I ask them "so what?", they normally reply that the loop executes even if the data is incorrect: because the body always runs, it executes no matter what value or type of data was expected. The "do ...while" loop's syntax is as follows:
do
{
block of code
} while (condition is satisfied);

Note that a semicolon ( ; ) must be used at the end of the do ...while loop. The semicolon is needed because it tells the compiler that the while (condition) marks the end of a do ...while loop rather than the start of a new while loop. Here is an example of the use of a do loop.
#include <stdio.h>

int main(void)
{
    int value, r_digit;

    printf("Enter a number to be reversed.\n");
    scanf("%d", &value);

    do
    {
        r_digit = value % 10;     /* take the last digit */
        printf("%d", r_digit);    /* print it */
        value = value / 10;       /* drop the last digit */
    } while (value != 0);

    printf("\n");
    return 0;
}
 



Tuesday, March 29, 2011

DATABASE

     A DATABASE is a system intended to organize, store, and retrieve large amounts of data easily. It consists of an organized collection of data for one or more uses, typically in digital form. One way of classifying databases is by the type of their contents, for example: bibliographic, document-text, or statistical. Digital databases are managed using database management systems, which store the database contents and allow data creation and maintenance, searching, and other access.
      THE ADVANTAGES of a database include:
  • Reduced data redundancy
  • Reduced updating errors and increased consistency
  • Greater data integrity and independence from application programs
  • Improved data access for users through host and query languages
  • Improved data security
  • Reduced data entry, storage, and retrieval costs
  • Easier development of new application programs

     THE DISADVANTAGES of a database include:
  • Database systems are complex, difficult, and time-consuming to design
  • Substantial hardware and software start-up costs
  • Damage to the database affects virtually all application programs
  • Extensive conversion costs in moving from a file-based system to a database system
  • Initial training required for all programmers and users


DATABASE MANAGEMENT SYSTEM


Structure of a DBMS:


A DBMS (Database Management System) acts as an interface between the user and the database. The user requests the DBMS to perform various operations (insert, delete, update and retrieve) on the database. The components of the DBMS carry out these requested operations and provide the necessary data to the users. The main components of a DBMS are described below:

1. DDL Compiler - The Data Definition Language (DDL) compiler processes the schema definitions specified in the DDL and records metadata such as the names of the files, the data items, the storage details of each file, mapping information and constraints.
2. DML Compiler and Query Optimizer - DML commands such as insert, update, delete and retrieve from the application program are sent to the DML compiler for compilation into object code for database access. The object code is then passed to the query optimizer, which determines the best way to execute the query, and from there it is sent to the data manager.
3. Data Manager - The Data Manager is the central software component of the DBMS, also known as the Database Control System.
The main functions of the Data Manager are:
  • Converting the operations in users' queries, received from the application programs or from the query processor (the DML compiler and query optimizer together), from the user's logical view to the physical file system
  • Controlling access to the DBMS information stored on disk
  • Managing the buffers in main memory
  • Enforcing constraints to maintain the consistency and integrity of the data
  • Synchronizing the simultaneous operations performed by concurrent users
  • Controlling backup and recovery operations
4. Data Dictionary - The Data Dictionary is a repository of descriptions of the data in the database. It contains information about:
  • Data: names of the tables, names of the attributes of each table, lengths of attributes, and number of rows in each table
  • Relationships between database transactions and the data items they reference, which is useful for determining which transactions are affected when certain data definitions are changed
  • Constraints on data, i.e. the range of values permitted
  • Detailed information on the physical database design, such as storage structures, access paths, and file and record sizes
  • Access authorization: a description of database users, their responsibilities and their access rights
  • Usage statistics such as the frequency of queries and transactions
The data dictionary is used to control data integrity, database operation and accuracy, and can be regarded as an important part of the DBMS. (A toy sketch of what such dictionary entries might look like in code appears after this list of components.)
5. Data Files - It contains the data portion of the database.
6. Compiled DML - The DML compiler converts high-level queries into low-level file access commands known as compiled DML.
7. End Users - The people who interact with the database, either directly through query languages or indirectly through application programs.
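As a purely illustrative aside (not taken from any actual DBMS), the kind of metadata the data dictionary holds can be pictured as a couple of C structures; the field names and limits below are invented for the example.

/* Toy picture of data-dictionary entries; names and limits are invented. */
#include <stdio.h>

struct ColumnInfo {
    char name[32];     /* attribute name          */
    char type[16];     /* e.g. "INT", "CHAR(30)"  */
    int  length;       /* storage length in bytes */
};

struct TableInfo {
    char   name[32];                /* table name           */
    int    num_columns;             /* number of attributes */
    struct ColumnInfo columns[16];
    long   num_rows;                /* current row count    */
};

int main(void)
{
    struct TableInfo employees = {
        "employees", 2,
        { { "emp_no", "INT", 4 }, { "emp_name", "CHAR(30)", 30 } },
        1250
    };

    printf("table %s has %d columns and %ld rows\n",
           employees.name, employees.num_columns, employees.num_rows);
    return 0;
}

A real dictionary also records constraints, access rights and usage statistics, but the idea is the same: data about the data.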
DBMS MODEL
  • Hierarchical Model
-The hierarchical data model organizes data in a tree structure. There is a hierarchy of parent and child data segments.
- The model implies that a record can contain repeating information, generally in the child data segments. Data are stored in a series of records, each with a set of field values attached to it, and all the instances of a specific record type are collected together. These record types are the equivalent of tables in the relational model, with the individual records being the equivalent of rows.
- For example, an organization might store information about an employee, such as name, employee number, department and salary. The organization might also store information about an employee's children, such as name and date of birth.

  • Network Model
- The popularity of the network data model coincided with the popularity of the hierarchical data model.
 - The basic data modeling construct in the network model is the set construct. A set consists of an owner record type, a set name, and a member record type. A member record type can have that role in more than one set, hence the multiparent concept is supported. An owner record type can also be a member or owner in another set. 
- The data model is a simple network, and link and intersection record types (called junction records by IDMS) may exist, as well as sets between them. Thus, the complete network of relationships is represented by several pairwise sets; in each set one record type is the owner (at the tail of the relationship arrow) and one or more record types are members (at the head of the relationship arrow). Usually a set defines a 1:M relationship, although 1:1 is permitted.
- The CODASYL network model is based on mathematical set theory.

  • Relational Model
-  A database based on the relational model developed by E.F. Codd.
- allows the definition of data structures, storage and retrieval operations and integrity constraints. In such a database the data and relations between them are organised in tables. A table is a collection of records and each record in a table contains the same fields. 
- Certain fields may be designated as keys, which means that searches for specific values of that field will use indexing to speed them up. Where fields in two different tables take values from the same set, a join operation can be performed to select related records in the two tables by matching values in those fields. Often, but not always, the fields will have the same name in both tables.
- For example, an "orders" table might contain (customer-ID, product-code) pairs and a "products" table might contain (product-code, price) pairs, so to calculate a given customer's bill you would sum the prices of all products ordered by that customer by joining on the product-code fields of the two tables. This can be extended to joining multiple tables on multiple fields (a small illustrative sketch of such a join appears after this list of models).
-  based on the Relational Algebra. 

  • Object-Oriented Model

    - Object-oriented databases add database functionality to object programming languages, bringing much more than persistent storage of programming-language objects.
    - They extend the semantics of the C++, Smalltalk and Java object programming languages to provide full-featured database programming capability, while retaining native language compatibility.
    - The benefit of this approach is the unification of application and database development into a seamless data model and language environment. As a result, applications require less code, use more natural data modeling, and code bases are easier to maintain. Object developers can write complete database applications with a modest amount of additional effort.
    - According to Rao (1994), "The object-oriented database (OODB) paradigm is the combination of object-oriented programming language (OOPL) systems and persistent systems. The power of the OODB comes from the seamless treatment of both persistent data, as found in databases, and transient data, as found in executing programs."
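To make the orders/products example above concrete, here is a minimal sketch in C of the join described under the relational model. The tables, values and the nested-loop join are invented purely for illustration; a real relational DBMS would express this in SQL and use indexes.

#include <stdio.h>
#include <string.h>

/* Invented sample data for the orders/products example. */
struct Order   { const char *customer_id; const char *product_code; };
struct Product { const char *product_code; double price; };

static const struct Order orders[] = {
    { "C1", "P10" }, { "C1", "P20" }, { "C2", "P10" }
};
static const struct Product products[] = {
    { "P10", 4.50 }, { "P20", 12.00 }
};

/* Sum the prices of all products ordered by one customer by joining
   the two tables on their product-code fields (a nested-loop join). */
static double customer_bill(const char *customer_id)
{
    double total = 0.0;
    size_t i, j;

    for (i = 0; i < sizeof orders / sizeof orders[0]; i++) {
        if (strcmp(orders[i].customer_id, customer_id) != 0)
            continue;
        for (j = 0; j < sizeof products / sizeof products[0]; j++) {
            if (strcmp(orders[i].product_code, products[j].product_code) == 0)
                total += products[j].price;
        }
    }
    return total;
}

int main(void)
{
    printf("Bill for C1: %.2f\n", customer_bill("C1"));   /* 4.50 + 12.00 */
    printf("Bill for C2: %.2f\n", customer_bill("C2"));   /* 4.50        */
    return 0;
}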




Sunday, March 20, 2011

What Is an Information System?

CHAPTER 11:INFORMATION SYSTEM

Definition of: information system 


A business application of the computer. It is made up of the database, application programs and manual and machine procedures. It also encompasses the computer systems that do the processing.

Processing the Data
The database stores the subjects of the business (master files) and its activities (transaction files). The application programs provide the data entry, updating, query and report processing.

The Procedures
The manual procedures document how data are obtained for input and how the system's output is distributed. Machine procedures instruct the computer how to perform scheduled activities, in which the output of one program is automatically fed into another.

Transaction Processing
The daily work is the online, interactive processing of the business transactions and updating of customer, inventory and vendor files (master files).

Batch Processing
At the end of the day or other period, programs print reports and update files that were not updated on a daily basis. Periodically, files must be updated for routine maintenance, such as adding and deleting employees and making changes to product descriptions. See transaction processing.




Types of Information System


Categories of Information Systems and Their Characteristics

Transaction processing system
  Substitutes computer-based processing for manual procedures. Deals with well-structured processes. Includes record-keeping applications.

Management information system
  Provides input to be used in the managerial decision process. Deals with supporting well-structured decision situations. Typical information requirements can be anticipated.

Decision support system
  Provides information to managers who must make judgements about particular situations. Supports decision-makers in situations that are not well structured.

Sunday, March 13, 2011

CHAPTER 10 PRIVACY & SECURITY

PRIVACY

Privacy concerns the collection and use of data about individuals. Here are the three primary privacy issues to know:

1. Accuracy
It relates to the responsibility of those who collect data to ensure that the data is correct.

2. Property
It relates to who owns data and who has rights to software.

3. Access
It relates to the responsibility of those who have data to control who is able to use that data.

SPYWARE

Spyware is similar to a Trojan horse in that users unwittingly install the product when they install something else. A common way to become a victim of spyware is to download certain peer-to-peer file swapping products that are available today.

Aside from the questions of ethics and privacy, spyware steals from the user by using the computer's memory resources and also by eating bandwidth as it sends information back to the spyware's home base via the user's Internet connection. Because spyware is using memory and system resources, the applications running in the background can lead to system crashes or general system instability.

Because spyware exists as an independent executable program, it has the ability to monitor keystrokes, scan files on the hard drive, snoop on other applications such as chat programs or word processors, install other spyware programs, read cookies, and change the default home page on the Web browser, consistently relaying this information back to the spyware author, who will either use it for advertising/marketing purposes or sell the information to another party.

Licensing agreements that accompany software downloads sometimes warn the user that a spyware program will be installed along with the requested software, but the licensing agreements may not always be read completely because the notice of a spyware installation is often couched in obtuse, hard-to-read legal disclaimers.