Last Updated on December 13, 2021 by Admin 3

CISA : Certified Information Systems Auditor : Part 49

  1. Which of the following components of an expert system enables the expert system to collect data from nonhuman sources, such as measurement instruments in a power plant?

    • Decision tree
    • Rules
    • Semantic nets
    • Data interface

    Explanation:

    Data Interface enables the expert system to collect data from nonhuman sources, such as measurement instruments in a power plant.

    For the CISA exam, you should know the following information about artificial intelligence and expert systems.
    Artificial intelligence is the study and application of the principles by which:

    Knowledge is acquired and used
    Goals are generated and achieved
    Information is communicated
    Collaboration is achieved
    Concepts are formed
    Languages are developed

    Two main programming languages that have been developed for artificial intelligence are LISP and PROLOG.
    Expert systems are comprised of primary components called shells when they are not populated with particular data; shells are designed to host new expert systems.

    Key to the system is the knowledge base (KB), which contains specific information or fact patterns associated with a particular subject matter and the rules for interpreting these facts. The KB interfaces with a database to obtain the data needed to analyze a particular problem and derive an expert conclusion. The information in the KB can be expressed in several ways:

    Decision tree – Uses questionnaires to lead the user through a series of choices until a conclusion is reached. Flexibility is compromised because the user must answer the questions in an exact sequence.
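    To make this concrete, here is a minimal sketch in Python; the questions, thresholds, and conclusions are invented for illustration and are not from the CISA manual. It shows why flexibility is compromised: the answers must be supplied in the tree's fixed order.

```python
# Hypothetical decision-tree knowledge base: each node asks a question and
# the yes/no answer selects the next node, until a leaf (a conclusion) is
# reached. The fixed question order is what limits flexibility.
tree = {
    "question": "Is body temperature over 39 C?",
    "yes": {
        "question": "Is the pulse under 60?",
        "yes": "Possible disease X",   # leaf: conclusion reached
        "no": "Run further tests",
    },
    "no": "No fever detected",
}

def conclude(node, answers):
    """Walk the tree using a fixed sequence of yes/no answers."""
    while isinstance(node, dict):
        node = node["yes" if answers.pop(0) else "no"]
    return node

print(conclude(tree, [True, True]))  # -> Possible disease X
```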

    Rules – Express declarative knowledge through the use of if-then relationships. For example, if a patient’s body temperature is over 39 degrees Celsius and their pulse is under 60, then the patient might be suffering from a certain disease.
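    The if-then form can be sketched as follows; the rule contents mirror the temperature/pulse example above, but the function names are illustrative, not part of any standard expert-system API.

```python
# Minimal sketch of declarative if-then rules: each rule pairs a condition
# over the known facts with a conclusion that holds when the condition does.
rules = [
    (lambda f: f["temp_c"] > 39 and f["pulse"] < 60, "may have disease X"),
    (lambda f: f["temp_c"] > 39, "has a fever"),
]

def infer(facts):
    """Return the conclusion of every rule whose condition holds."""
    return [conclusion for condition, conclusion in rules if condition(facts)]

print(infer({"temp_c": 40, "pulse": 55}))  # both rules fire
```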

    Semantic nets – Consist of a graph in which the nodes represent physical or conceptual objects and the arcs describe the relationships between the nodes. Semantic nets resemble a data flow diagram and make use of an inheritance mechanism to prevent duplication of data.
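    A small sketch of such a net, with invented nodes and arc labels, shows how the inheritance mechanism avoids duplicating shared properties on every node:

```python
# Sketch of a semantic net: keys are nodes (objects/concepts), and each
# labeled entry is an arc to another node or a property value. The "is_a"
# arcs provide inheritance, so shared properties are stored only once.
net = {
    "canary": {"is_a": "bird", "color": "yellow"},
    "bird":   {"is_a": "animal", "can_fly": True},
    "animal": {"alive": True},
}

def lookup(node, attr):
    """Find an attribute on a node, inheriting along 'is_a' arcs."""
    while node is not None:
        props = net.get(node, {})
        if attr in props:
            return props[attr]
        node = props.get("is_a")   # climb the inheritance arc
    return None

print(lookup("canary", "can_fly"))  # inherited from "bird" -> True
```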

    Additionally, the inference engine is a program that uses the KB and determines the most appropriate outcome based on the information supplied by the user. In addition, an expert system includes the following components:

    Knowledge interface – Allows the expert to enter knowledge into the system without the traditional mediation of a software engineer.

    Data Interface – Enables the expert system to collect data from nonhuman sources, such as measurement instruments in a power plant.

    The following were incorrect answers:

    Decision tree – Uses questionnaires to lead the user through a series of choices until a conclusion is reached. Flexibility is compromised because the user must answer the questions in an exact sequence.

    Rules – Express declarative knowledge through the use of if-then relationships.

    Semantic nets – Consist of a graph in which the nodes represent physical or conceptual objects and the arcs describe the relationships between the nodes.

    Reference:

    CISA Review Manual 2014, page 187

  2. Which of the following components of an expert system allows the expert to enter knowledge into the system without the traditional mediation of a software engineer?

    • Decision tree
    • Rules
    • Semantic nets
    • Knowledge interface
    Explanation:

    Knowledge interface allows the expert to enter knowledge into the system without the traditional mediation of a software engineer.

    The following were incorrect answers:

    Decision tree – Uses questionnaires to lead the user through a series of choices until a conclusion is reached. Flexibility is compromised because the user must answer the questions in an exact sequence.

    Rules – Express declarative knowledge through the use of if-then relationships.

    Semantic nets – Consist of a graph in which the nodes represent physical or conceptual objects and the arcs describe the relationships between the nodes.

    Reference:

    CISA Review Manual 2014, page 187

  3. Which of the following methods of expressing the knowledge base consists of a graph in which nodes represent physical or conceptual objects and the arcs describe the relationships between nodes?

    • Decision tree
    • Rules
    • Semantic nets
    • Knowledge interface
    Explanation:

    Semantic nets consist of a graph in which the nodes represent physical or conceptual objects and the arcs describe the relationships between the nodes.

    The following were incorrect answers:

    Decision tree – Uses questionnaires to lead the user through a series of choices until a conclusion is reached.

    Rules – Express declarative knowledge through the use of if-then relationships.

    Knowledge interface – Allows the expert to enter knowledge into the system without the traditional mediation of a software engineer.

    Reference:
    CISA Review Manual 2014, page 187

  4. The information in the knowledge base can be expressed in several ways. Which of the following uses questionnaires to lead the user through a series of choices until a conclusion is reached?

    • Decision tree
    • Rules
    • Semantic nets
    • Knowledge interface
    Explanation:

    A decision tree uses questionnaires to lead the user through a series of choices until a conclusion is reached. Flexibility is compromised because the user must answer the questions in an exact sequence.

    The following were incorrect answers:

    Rules – Express declarative knowledge through the use of if-then relationships.

    Semantic nets – Consist of a graph in which the nodes represent physical or conceptual objects and the arcs describe the relationships between the nodes.

    Knowledge interface – Allows the expert to enter knowledge into the system without the traditional mediation of a software engineer.

    Reference:

    CISA Review Manual 2014, page 187

  5. An IS auditor should be aware of the various analysis models used by data architects. Which of the following analysis models depicts data entities and how they relate?

    • Context Diagrams
    • Activity Diagrams
    • Swim-lane diagrams
    • Entity relationship diagrams
    Explanation:

    Entity relationship diagram – Depicts data entities and how they relate. These data analysis methods obviously play an important part in developing an enterprise data model. However, it is also crucial that knowledgeable business operatives are involved in the process. This way, a proper understanding can be obtained of the business purpose and context of the data. This also mitigates the risk of replicating suboptimal data configurations from existing systems and databases into the DW.

    For CISA exam you should know below information about business intelligence:

    Business intelligence (BI) is a broad field of IT that encompasses the collection and analysis of information to assist decision making and assess organizational performance.
    To deliver effective BI, organizations need to design and implement a data architecture. The complete data architecture consists of two components:

    The enterprise data flow architecture (EDFA)
    A logical data architecture

    Various layers/components of this data flow architecture are as follows:

    Presentation/desktop access layer – This is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, reporting and analysis suites offered by vendors such as Cognos and Business Objects, and purpose-built applications such as balanced scorecards and digital dashboards.

    Data source layer – Enterprise information derives from a number of sources:

    Operational data – Data captured and maintained by an organization’s existing systems, and usually held in system-specific databases or flat files.
    External data – Data provided to an organization by external sources. This could include data such as customer demographics and market share information.
    Nonoperational data – Information needed by end users that is not currently maintained in a computer-accessible format.

    Core data warehouse – This is where all the data of interest to an organization are captured and organized to assist reporting and analysis. DWs are normally instituted as large relational databases. A properly constituted DW should support three basic forms of inquiry:

    Drilling up and drilling down – Using dimensions of interest to the business, it should be possible to aggregate data as well as drill down. Attributes available at the more granular levels of the warehouse can also be used to refine the analysis.

    Drill across – Use common attributes to access a cross section of information in the warehouse, such as summing sales across all product lines by customer and grouping customers according to length of association with the company.

    Historical analysis – The warehouse should support this by holding historical, time-variant data. An example of historical analysis would be to report monthly store sales, then repeat the analysis using only customers who existed at the start of the year, in order to separate the effect of new customers from the ability to generate repeat business with existing customers.
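    Drilling up and down can be sketched in plain Python; the sales records and dimension names below are hypothetical, chosen only to show aggregation at coarser and finer grains:

```python
from collections import defaultdict

# Hypothetical fact records for a sales DW: (region, store, product, amount).
sales = [
    ("East", "S1", "widgets", 100),
    ("East", "S2", "widgets", 50),
    ("West", "S3", "gadgets", 75),
]

def roll_up(records, key):
    """Aggregate (drill up) the sales measure along one dimension of interest."""
    totals = defaultdict(int)
    for region, store, product, amount in records:
        dims = {"region": region, "store": store, "product": product}
        totals[dims[key]] += amount
    return dict(totals)

print(roll_up(sales, "region"))  # coarse grain: {'East': 150, 'West': 75}
print(roll_up(sales, "store"))   # drill down to the store level
```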

    Data mart layer – A data mart represents a subset of information from the core DW, selected and organized to meet the needs of a particular business unit or business line. Data marts can be relational databases or some form of online analytical processing (OLAP) data structure.

    Data staging and quality layer – This layer is responsible for data copying, transformation into DW format, and quality control. It is particularly important that only reliable data enter the core DW. This layer needs to be able to deal with problems periodically thrown up by operational systems, such as changes to account number formats and reuse of old account and customer numbers.
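    As a hedged sketch of such a staging step (the field names and the six-digit account rule are invented for illustration), rows are copied into DW format and unreliable records are rejected for follow-up:

```python
# Illustrative staging/quality step: transform operational rows into DW
# format and quarantine records that fail a format-based quality rule.
def stage(rows):
    staged, rejected = [], []
    for row in rows:
        acct = str(row.get("acct", "")).strip()
        if acct.isdigit() and len(acct) == 6:   # e.g. current account format
            staged.append({"account": acct.zfill(8), "amt": row["amt"]})
        else:
            rejected.append(row)                # quality control: do not load
    return staged, rejected

staged, rejected = stage([{"acct": "123456", "amt": 10},
                          {"acct": "OLD-1", "amt": 5}])
```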

    Data access layer – This layer operates to connect the data storage and quality layer with data stores in the data source layer and, in the process, avoids the need to know exactly how these data stores are organized. Technology now permits SQL access to data even if it is not stored in a relational database.

    Data preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to precalculate the values that are loaded into OLAP data repositories to increase access speed. Data mining is concerned with exploring large volumes of data to determine patterns and trends of information. Data mining often identifies patterns that are counterintuitive due to the number and complexity of data relationships. Data quality needs to be very high to avoid corrupting the results.

    Metadata repository layer – Metadata are data about data. The information held in the metadata layer needs to extend beyond data structure names and formats to provide detail on business purpose and context. The metadata layer should be comprehensive in scope, covering data as they flow between the various layers, including documenting transformation and validation rules.

    Warehouse management layer – The function of this layer is the scheduling of the tasks necessary to build and maintain the DW and populate data marts. This layer is also involved in the administration of security.

    Application messaging layer – This layer is concerned with transporting information between the various layers. In addition to business data, this layer encompasses the generation, storage, and targeted communication of control messages.

    Internet/intranet layer – This layer is concerned with basic data communication. Included here are browser-based user interfaces and TCP/IP networking.

    The various analysis models used by data architects/analysts are as follows:

    Context diagram – Outlines the major processes of an organization and the external parties with which the business interacts.

    Activity or swim-lane diagram – Deconstructs business processes.

    Entity relationship diagram – Depicts data entities and how they relate. These data analysis methods obviously play an important part in developing an enterprise data model. However, it is also crucial that knowledgeable business operatives are involved in the process. This way, a proper understanding can be obtained of the business purpose and context of the data. This also mitigates the risk of replicating suboptimal data configurations from existing systems and databases into the DW.

    The following were incorrect answers:

    Context diagram – Outlines the major processes of an organization and the external parties with which the business interacts.

    Activity or swim-lane diagram – Deconstructs business processes.

    Reference:
    CISA Review Manual 2014, page 188

  6. An IS auditor should be aware of the various analysis models used by data architects. Which of the following analysis models outlines the major processes of an organization and the external parties with which the business interacts?

    • Context Diagrams
    • Activity Diagrams
    • Swim-lane diagrams
    • Entity relationship diagrams
    Explanation:

    Context diagram – Outlines the major processes of an organization and the external parties with which the business interacts.

    The following were incorrect answers:

    Activity or swim-lane diagram – Deconstructs business processes.

    Entity relationship diagram – Depicts data entities and how they relate.

    Reference:

    CISA Review Manual 2014, page 188

  7. Which of the following layers of an enterprise data flow architecture is concerned with basic data communication?

    • Data preparation layer
    • Desktop Access Layer
    • Internet/Intranet layer
    • Data access layer
    Explanation:

    Internet/intranet layer – This layer is concerned with basic data communication. Included here are browser-based user interfaces and TCP/IP networking.
    For CISA exam you should know below information about business intelligence:

    Business intelligence(BI) is a broad field of IT encompasses the collection and analysis of information to assist decision making and assess organizational performance.
    To deliver effective BI, organizations need to design and implement a data architecture. The complete data architecture consists of two components

    The enterprise data flow architecture (EDFA)
    A logical data architecture

    Various layers/components of this data flow architecture are as follows:

    Presentation/desktop access layer – This is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, reporting and analysis suits offered by vendors such as Congas and business objects, and purpose built application such as balanced source cards and digital dashboards.

    Data Source Layer – Enterprise information derives from number of sources:

    Operational data – Data captured and maintained by an organization’s existing systems, and usually held in system-specific database or flat files.
    External Data – Data provided to an organization by external sources. This could include data such as customer demographic and market share information.
    Nonoperational data – Information needed by end user that is not currently maintained in a computer accessible format.

    Core data warehouse – This is where all the data of interest to an organization is captured and organized to assist reporting and analysis. DWs are normally instituted as large relational databases. A property constituted DW should support three basic form of an inquiry.

    Drilling up and drilling down – Using dimension of interest to the business, it should be possible to aggregate data as well as drill down. Attributes available at the more granular levels of the warehouse can also be used to refine the analysis.

    Drill across – Use common attributes to access a cross section of information in the warehouse such as sum sales across all product lines by customer and group of customers according to length of association with the company.
    Historical Analysis – The warehouse should support this by holding historical, time variant data. An example of historical analysis would be to report monthly store sales and then repeat the analysis using only customer who were preexisting at the start of the year in order to separate the effective new customer from the ability to generate repeat business with existing customers.

    Data Mart Layer – Data mart represents subset of information from the core DW selected and organized to meet the needs of a particular business unit or business line. Data mart can be relational databases or some form on-line analytical processing (OLAP) data structure.

    Data Staging and quality layer – This layer is responsible for data copying, transformation into DW format and quality control. It is particularly important that only reliable data into core DW. This layer needs to be able to deal with problems periodically thrown by operational systems such as change to account number format and reuse of old accounts and customer numbers.

    Data Access Layer – This layer operates to connect the data storage and quality layer with data stores in the data source layer and, in the process, avoiding the need to know to know exactly how these data stores are organized. Technology now permits SQL access to data even if it is not stored in a relational database.

    Data Preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to per-calculate the values that are loaded into OLAP data repositories to increase access speed. Data mining is concern with exploring large volume of data to determine patterns and trends of information. Data mining often identifies patterns that are counterintuitive due to number and complexity of data relationships. Data quality needs to be very high to not corrupt the result.

    Metadata repository layer – Metadata are data about data. The information held in metadata layer needs to extend beyond data structure names and formats to provide detail on business purpose and context. The metadata layer should be comprehensive in scope, covering data as they flow between the various layers, including documenting transformation and validation rules.

    Warehouse Management Layer – The function of this layer is the scheduling of the tasks necessary to build and maintain the DW and populate data marts. This layer is also involved in administration of security.

    Application messaging layer – This layer is concerned with transporting information between the various layers. In addition to business data, this layer encompasses generation, storage and targeted communication of control messages.

    Internet/Intranet layer – This layer is concerned with basic data communication. Included here are browser based user interface and TCP/IP networking.

    Various analysis models used by data architects/ analysis follows:

    Activity or swim-lane diagram – Deconstructs business processes.

    Entity relationship diagram – Depicts data entities and how they relate. These data analysis methods obviously play an important part in developing an enterprise data model. However, it is also crucial that knowledgeable business operatives be involved in the process. This way a proper understanding can be obtained of the business purpose and context of the data. This also mitigates the risk of replicating suboptimal data configurations from existing systems and databases into the DW.

    The following were incorrect answers:

    Desktop access layer or presentation layer is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, and reporting and analysis suites offered by vendors such as Cognos and Business Objects, as well as purpose-built applications such as balanced scorecards and digital dashboards.

    Data preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to pre-calculate the values that are loaded into OLAP data repositories to increase access speed.

    Data access layer – This layer connects the data storage and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how these data stores are organized. Technology now permits SQL access to data even if they are not stored in a relational database.

    Reference:
    CISA Review Manual 2014, page 188

  8. Which of the following layers of an enterprise data flow architecture is concerned with transporting information between the various layers?

    • Data preparation layer
    • Desktop Access Layer
    • Application messaging layer
    • Data access layer
    Explanation:

    Application messaging layer – This layer is concerned with transporting information between the various layers. In addition to business data, this layer encompasses the generation, storage and targeted communication of control messages.
    For the CISA exam, you should know the following information about business intelligence:

    Business intelligence (BI) is a broad field of IT that encompasses the collection and analysis of information to assist decision making and assess organizational performance.
    To deliver effective BI, organizations need to design and implement a data architecture. The complete data architecture consists of two components:

    The enterprise data flow architecture (EDFA)
    A logical data architecture

    Various layers/components of this data flow architecture are as follows:

    Presentation/desktop access layer – This is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, and reporting and analysis suites offered by vendors such as Cognos and Business Objects, as well as purpose-built applications such as balanced scorecards and digital dashboards.

    Data source layer – Enterprise information derives from a number of sources:

    Operational data – Data captured and maintained by an organization’s existing systems, and usually held in system-specific databases or flat files.
    External data – Data provided to an organization by external sources. This could include data such as customer demographics and market share information.
    Nonoperational data – Information needed by end users that is not currently maintained in a computer-accessible format.

    Core data warehouse – This is where all the data of interest to an organization are captured and organized to assist reporting and analysis. DWs are normally instituted as large relational databases. A properly constituted DW should support three basic forms of inquiry.

    Drilling up and drilling down – Using dimensions of interest to the business, it should be possible to aggregate data as well as drill down. Attributes available at the more granular levels of the warehouse can also be used to refine the analysis.

    Drill across – Use common attributes to access a cross section of information in the warehouse, such as summing sales across all product lines by customer, and group customers according to length of association with the company.
    Historical analysis – The warehouse should support this by holding historical, time-variant data. An example of historical analysis would be to report monthly store sales and then repeat the analysis using only customers who already existed at the start of the year, in order to separate the effect of new customers from the ability to generate repeat business with existing customers.
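    Drilling up along a time dimension can be illustrated with a small sketch; the monthly figures are invented:

```python
# Hypothetical month-level sales; drilling up rolls them into quarters,
# drilling down returns to the monthly detail kept at the granular level.
monthly = {"2021-01": 10, "2021-02": 12, "2021-03": 8, "2021-04": 15}

def drill_up(month_sales):
    """Aggregate month-level figures to quarter level."""
    quarters = {}
    for month, value in month_sales.items():
        year, m = month.split("-")
        quarter = f"{year}-Q{(int(m) - 1) // 3 + 1}"
        quarters[quarter] = quarters.get(quarter, 0) + value
    return quarters

by_quarter = drill_up(monthly)
```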

    Data mart layer – A data mart represents a subset of information from the core DW, selected and organized to meet the needs of a particular business unit or business line. A data mart can be a relational database or some form of on-line analytical processing (OLAP) data structure.
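    As an illustration (with made-up warehouse rows), building a mart for one business line is a selection and reorganization of core DW data:

```python
# Invented core-DW rows; a data mart keeps only one business line's slice.
warehouse = [
    {"line": "retail", "product": "A", "sales": 100},
    {"line": "wholesale", "product": "A", "sales": 400},
    {"line": "retail", "product": "B", "sales": 150},
]

def build_mart(rows, business_line):
    """Select and organize core DW rows for a particular business line."""
    return [{"product": r["product"], "sales": r["sales"]}
            for r in rows if r["line"] == business_line]

retail_mart = build_mart(warehouse, "retail")
```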

    Data staging and quality layer – This layer is responsible for data copying, transformation into the DW format and quality control. It is particularly important that only reliable data enter the core DW. This layer needs to be able to deal with problems periodically thrown up by operational systems, such as changes to account number formats and the reuse of old account and customer numbers.

    Data access layer – This layer connects the data storage and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how those data stores are organized. Technology now permits SQL access to data even if they are not stored in a relational database.

    Data preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to pre-calculate the values that are loaded into OLAP data repositories to increase access speed. Data mining is concerned with exploring large volumes of data to determine patterns and trends of information. Data mining often identifies patterns that are counterintuitive due to the number and complexity of data relationships. Data quality needs to be very high so that the results are not corrupted.

    Metadata repository layer – Metadata are data about data. The information held in the metadata layer needs to extend beyond data structure names and formats to provide detail on business purpose and context. The metadata layer should be comprehensive in scope, covering data as they flow between the various layers, including documenting transformation and validation rules.

    Warehouse management layer – The function of this layer is the scheduling of the tasks necessary to build and maintain the DW and to populate the data marts. This layer is also involved in the administration of security.

    Application messaging layer – This layer is concerned with transporting information between the various layers. In addition to business data, this layer encompasses the generation, storage and targeted communication of control messages.

    Internet/intranet layer – This layer is concerned with basic data communication. Included here are browser-based user interfaces and TCP/IP networking.

    The various analysis models used by data architects/analysts are as follows:

    Activity or swim-lane diagram – Deconstructs business processes.

    Entity relationship diagram – Depicts data entities and how they relate. These data analysis methods obviously play an important part in developing an enterprise data model. However, it is also crucial that knowledgeable business operatives be involved in the process. This way a proper understanding can be obtained of the business purpose and context of the data. This also mitigates the risk of replicating suboptimal data configurations from existing systems and databases into the DW.

    The following were incorrect answers:

    Desktop access layer or presentation layer is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, and reporting and analysis suites offered by vendors such as Cognos and Business Objects, as well as purpose-built applications such as balanced scorecards and digital dashboards.

    Data preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to pre-calculate the values that are loaded into OLAP data repositories to increase access speed.

    Data access layer – This layer connects the data storage and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how these data stores are organized. Technology now permits SQL access to data even if they are not stored in a relational database.

    Reference:

    CISA Review Manual 2014, page 188

  9. Which of the following layers of an enterprise data flow architecture schedules the tasks necessary to build and maintain the Data Warehouse (DW) and populates the data marts?

    • Data preparation layer
    • Desktop Access Layer
    • Warehouse management layer
    • Data access layer
    Explanation:

    Warehouse management layer – The function of this layer is the scheduling of the tasks necessary to build and maintain the DW and to populate the data marts. This layer is also involved in the administration of security.

    The following were incorrect answers:

    Desktop access layer or presentation layer is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, and reporting and analysis suites offered by vendors such as Cognos and Business Objects, as well as purpose-built applications such as balanced scorecards and digital dashboards.

    Data preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to pre-calculate the values that are loaded into OLAP data repositories to increase access speed.

    Data access layer – This layer connects the data storage and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how these data stores are organized. Technology now permits SQL access to data even if they are not stored in a relational database.

    Reference:

    CISA Review Manual 2014, page 188

  10. Which of the following layers of an enterprise data flow architecture represents a subset of information from the core Data Warehouse, selected and organized to meet the needs of a particular business unit or business line?

    • Data preparation layer
    • Desktop Access Layer
    • Data Mart layer
    • Data access layer
    Explanation:

    Data mart layer – A data mart represents a subset of information from the core Data Warehouse, selected and organized to meet the needs of a particular business unit or business line. A data mart can be a relational database or some form of on-line analytical processing (OLAP) data structure.

    The following were incorrect answers:

    Desktop access layer or presentation layer is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, and reporting and analysis suites offered by vendors such as Cognos and Business Objects, as well as purpose-built applications such as balanced scorecards and digital dashboards.

    Data preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to pre-calculate the values that are loaded into OLAP data repositories to increase access speed.

    Data access layer – This layer connects the data storage and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how these data stores are organized. Technology now permits SQL access to data even if they are not stored in a relational database.

    Reference:

    CISA Review Manual 2014, page 188

  11. Which of the following layers of an enterprise data flow architecture is concerned with the assembly and preparation of data for loading into data marts?

    • Data preparation layer
    • Desktop Access Layer
    • Data Mart layer
    • Data access layer
    Explanation:

    Data preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to pre-calculate the values that are loaded into OLAP data repositories to increase access speed.

    The following were incorrect answers:

    Desktop access layer or presentation layer is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, and reporting and analysis suites offered by vendors such as Cognos and Business Objects, as well as purpose-built applications such as balanced scorecards and digital dashboards.

    Data mart layer – A data mart represents a subset of information from the core DW, selected and organized to meet the needs of a particular business unit or business line. A data mart can be a relational database or some form of on-line analytical processing (OLAP) data structure.

    Data access layer – This layer connects the data storage and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how these data stores are organized. Technology now permits SQL access to data even if they are not stored in a relational database.

    Reference:

    CISA Review Manual 2014, page 188

  12. Which of the following layers of an enterprise data flow architecture is responsible for data copying, transformation into the Data Warehouse (DW) format and quality control?

    • Data Staging and quality layer
    • Desktop Access Layer
    • Data Mart layer
    • Data access layer
    Explanation:

    Data staging and quality layer – This layer is responsible for data copying, transformation into the DW format and quality control. It is particularly important that only reliable data enter the core DW. This layer needs to be able to deal with problems periodically thrown up by operational systems, such as changes to account number formats and the reuse of old account and customer numbers.
    For the CISA exam, you should know the following information about business intelligence:

    Business intelligence (BI) is a broad field of IT that encompasses the collection and analysis of information to assist decision making and assess organizational performance. To deliver effective BI, organizations need to design and implement a data architecture. The complete data architecture consists of two components:

    The enterprise data flow architecture (EDFA)
    A logical data architecture

    Various layers/components of this data flow architecture are as follows:

    Presentation/desktop access layer – This is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, reporting and analysis suites offered by vendors such as Cognos and Business Objects, and purpose-built applications such as balanced scorecards and digital dashboards.

    Data Source Layer – Enterprise information derives from a number of sources:

    Operational data – Data captured and maintained by an organization’s existing systems, usually held in system-specific databases or flat files.
    External data – Data provided to an organization by external sources. This could include data such as customer demographics and market share information.
    Nonoperational data – Information needed by end users that is not currently maintained in a computer-accessible format.

    Core data warehouse – This is where all the data of interest to an organization is captured and organized to assist reporting and analysis. DWs are normally implemented as large relational databases. A properly constituted DW should support three basic forms of inquiry:

    Drilling up and drilling down – Using dimensions of interest to the business, it should be possible to aggregate data as well as drill down. Attributes available at the more granular levels of the warehouse can also be used to refine the analysis.

    Drill across – Use common attributes to access a cross-section of information in the warehouse, such as summing sales across all product lines by customer, and grouping customers according to length of association with the company.
    Historical analysis – The warehouse should support this by holding historical, time-variant data. An example of historical analysis would be to report monthly store sales, and then repeat the analysis using only customers who existed at the start of the year, in order to separate the effect of new customers from the ability to generate repeat business with existing customers.
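    The drill-up/drill-down inquiry above can be sketched with an in-memory SQL table; the schema and figures are illustrative assumptions:

    ```python
    # Sketch of drilling up and drilling down over a sales fact table
    # using SQL aggregation at two granularities. Data are illustrative.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sales (region TEXT, store TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
        ("East", "E1", 100.0), ("East", "E2", 50.0), ("West", "W1", 75.0),
    ])

    # Drill up: aggregate along the region dimension
    by_region = dict(con.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"))

    # Drill down: refine the analysis to the more granular store level
    by_store = dict(con.execute(
        "SELECT store, SUM(amount) FROM sales GROUP BY store"))
    ```

    The same fact rows serve both queries; only the grouping attribute changes, which is what makes dimensional aggregation and refinement cheap in a properly constituted DW.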

    Data Mart Layer – A data mart represents a subset of information from the core DW, selected and organized to meet the needs of a particular business unit or business line. A data mart can be a relational database or some form of online analytical processing (OLAP) data structure.

    Data Staging and quality layer – This layer is responsible for data copying, transformation into DW format, and quality control. It is particularly important that only reliable data enter the core DW. This layer needs to be able to deal with problems periodically thrown up by operational systems, such as changes to account number formats and the reuse of old account and customer numbers.

    Data Access Layer – This layer connects the data staging and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how those data stores are organized. Technology now permits SQL access to data even if it is not stored in a relational database.

    Data Preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to pre-calculate the values that are loaded into OLAP data repositories to increase access speed. Data mining is concerned with exploring large volumes of data to determine patterns and trends of information. Data mining often identifies patterns that are counterintuitive due to the number and complexity of data relationships. Data quality needs to be very high so as not to corrupt the results.
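    The pre-calculation practice can be shown in a few lines: aggregate once at load time so that access-time queries read precomputed totals instead of re-scanning detail rows. The data are illustrative assumptions:

    ```python
    # Sketch of pre-calculating values at preparation/load time so an OLAP
    # repository answers queries from precomputed totals. Illustrative data.
    from collections import defaultdict

    detail_rows = [("2014-01", 10.0), ("2014-01", 5.0), ("2014-02", 7.5)]

    # Pre-calculate monthly totals once, when loading the data mart ...
    monthly_totals = defaultdict(float)
    for month, amount in detail_rows:
        monthly_totals[month] += amount

    # ... so each access-time lookup is a single read, not a scan.
    def sales_for(month):
        return monthly_totals[month]
    ```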

    Metadata repository layer – Metadata are data about data. The information held in the metadata layer needs to extend beyond data structure names and formats to provide detail on business purpose and context. The metadata layer should be comprehensive in scope, covering data as they flow between the various layers, including documentation of transformation and validation rules.

    Warehouse Management Layer – The function of this layer is the scheduling of the tasks necessary to build and maintain the DW and populate data marts. This layer is also involved in the administration of security.

    Application messaging layer – This layer is concerned with transporting information between the various layers. In addition to business data, this layer encompasses the generation, storage and targeted communication of control messages.

    Internet/Intranet layer – This layer is concerned with basic data communication. Included here are browser-based user interfaces and TCP/IP networking.

    Various analysis models used by data architects/analysts are as follows:

    Activity or swim-lane diagram – Deconstructs business processes.
    Entity relationship diagram – Depicts data entities and how they relate. These data analysis methods obviously play an important part in developing an enterprise data model. However, it is also crucial that a knowledgeable business operative be involved in the process, so that a proper understanding can be obtained of the business purpose and context of the data. This also mitigates the risk of replicating suboptimal data configurations from existing systems and databases into the DW.

    The following were incorrect answers:

    Desktop access layer or presentation layer is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, reporting and analysis suites offered by vendors such as Cognos and Business Objects, and purpose-built applications such as balanced scorecards and digital dashboards.

    Data Mart layer – A data mart represents a subset of information from the core DW, selected and organized to meet the needs of a particular business unit or business line. A data mart can be a relational database or some form of online analytical processing (OLAP) data structure.

    Data access layer – This layer connects the data staging and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how those data stores are organized. Technology now permits SQL access to data even if it is not stored in a relational database.

    Reference:

    CISA review manual 2014 Page number 188

  13. Which of the following layers of an enterprise data flow architecture represents subsets of information from the core data warehouse?

    • Presentation layer
    • Desktop Access Layer
    • Data Mart layer
    • Data access layer
    Explanation:

    Data Mart layer – A data mart represents a subset of information from the core DW, selected and organized to meet the needs of a particular business unit or business line. A data mart can be a relational database or some form of online analytical processing (OLAP) data structure.
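    One common way to carve a mart out of the core DW is a view restricted to a single business unit. The schema, the `retail` unit, and the figures below are illustrative assumptions:

    ```python
    # Sketch of a data mart as a subset of the core DW, selected and
    # organized for one business unit. Schema and data are illustrative.
    import sqlite3

    dw = sqlite3.connect(":memory:")
    dw.execute(
        "CREATE TABLE core_dw (business_unit TEXT, product TEXT, revenue REAL)")
    dw.executemany("INSERT INTO core_dw VALUES (?, ?, ?)", [
        ("retail", "laptop", 1200.0), ("retail", "phone", 800.0),
        ("wholesale", "laptop", 9000.0),
    ])

    # The "retail" mart sees only its own slice of the warehouse
    dw.execute("""CREATE VIEW retail_mart AS
                  SELECT product, revenue FROM core_dw
                  WHERE business_unit = 'retail'""")
    mart_rows = dw.execute("SELECT product, revenue FROM retail_mart").fetchall()
    ```

    In practice a mart is often a physically separate relational database or OLAP cube rather than a view, but the selection-and-organization idea is the same.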
    The following were incorrect answers:

    Desktop access layer or presentation layer is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, reporting and analysis suites offered by vendors such as Cognos and Business Objects, and purpose-built applications such as balanced scorecards and digital dashboards.

    Data access layer – This layer connects the data staging and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how those data stores are organized. Technology now permits SQL access to data even if it is not stored in a relational database.

    Reference:

    CISA review manual 2014 Page number 188

  14. Which of the following layers of an enterprise data flow architecture captures all data of interest to an organization and organizes it to assist in reporting and analysis?

    • Desktop access layer
    • Data preparation layer
    • Core data warehouse
    • Data access layer
    Explanation:

    Core data warehouse – This is where all the data of interest to an organization is captured and organized to assist reporting and analysis. DWs are normally implemented as large relational databases. A properly constituted DW should support three basic forms of inquiry.
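    One of those forms of inquiry, historical analysis, relies on the core DW holding time-variant data. A minimal sketch of the "preexisting customers only" example, with made-up customers and figures:

    ```python
    # Sketch of historical analysis over time-variant warehouse data:
    # report monthly sales, then repeat using only customers who already
    # existed at the start of the year. Data are illustrative assumptions.

    sales = [  # (customer, first_seen_year, month, amount)
        ("alice", 2013, "2014-01", 30.0),
        ("bob",   2014, "2014-01", 20.0),  # new customer this year
        ("alice", 2013, "2014-02", 10.0),
    ]

    def monthly_sales(rows):
        totals = {}
        for _cust, _first, month, amount in rows:
            totals[month] = totals.get(month, 0.0) + amount
        return totals

    all_customers = monthly_sales(sales)
    # Repeat with only customers preexisting at the start of 2014, to
    # separate repeat business from the effect of new customers.
    preexisting = monthly_sales([r for r in sales if r[1] < 2014])
    ```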
    The following were incorrect answers:

    Desktop access layer or presentation layer is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, reporting and analysis suites offered by vendors such as Cognos and Business Objects, and purpose-built applications such as balanced scorecards and digital dashboards.

    Data access layer – This layer connects the data staging and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how those data stores are organized. Technology now permits SQL access to data even if it is not stored in a relational database.
    Data preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to pre-calculate the values that are loaded into OLAP data repositories to increase access speed.

    Reference:

    CISA review manual 2014 Page number 188

  15. Which of the following layers in an enterprise data flow architecture derives enterprise information from operational data, external data and nonoperational data?

    • Data preparation layer
    • Data source layer
    • Data mart layer
    • Data access layer
    Explanation:

    Enterprise information derives from a number of sources:

    Operational data – Data captured and maintained by an organization’s existing systems, usually held in system-specific databases or flat files.

    External data – Data provided to an organization by external sources. This could include data such as customer demographics and market share information.

    Nonoperational data – Information needed by end users that is not currently maintained in a computer-accessible format.
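    The three feed types above can be pictured as tagged inputs handed to the staging layer. The record contents and the `source` tag are illustrative assumptions:

    ```python
    # Sketch of the data source layer's three feed types being gathered
    # for staging: operational (system databases/flat files), external
    # (e.g. purchased demographics), and nonoperational (not yet in a
    # computer-accessible format, so keyed in). Records are illustrative.

    operational = [{"cust": "C1", "balance": 100.0}]      # existing systems
    external = [{"cust": "C1", "demographic": "urban"}]   # data vendor
    nonoperational = [{"cust": "C1", "note": "keyed from paper file"}]

    # Tag each record with its provenance before handing it to staging
    feed = (
        [dict(r, source="operational") for r in operational]
        + [dict(r, source="external") for r in external]
        + [dict(r, source="nonoperational") for r in nonoperational]
    )
    ```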

    The following were incorrect answers:

    Data mart layer – A data mart represents a subset of information from the core DW, selected and organized to meet the needs of a particular business unit or business line. A data mart can be a relational database or some form of online analytical processing (OLAP) data structure.

    Data access layer – This layer connects the data staging and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how those data stores are organized. Technology now permits SQL access to data even if it is not stored in a relational database.
    Data preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to pre-calculate the values that are loaded into OLAP data repositories to increase access speed.

    Reference:

    CISA review manual 2014 Page number 188

  16. Which of the following layers in an enterprise data flow architecture do end users directly deal with for information?

    • Desktop access layer
    • Data preparation layer
    • Data mart layer
    • Data access layer
    Explanation:

    Presentation/desktop access layer is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, reporting and analysis suites offered by vendors such as Cognos and Business Objects, and purpose-built applications such as balanced scorecards and digital dashboards.

    For CISA exam you should know below information about business intelligence:

    Business intelligence(BI) is a broad field of IT encompasses the collection and analysis of information to assist decision making and assess organizational performance.
    To deliver effective BI, organizations need to design and implement a data architecture. The complete data architecture consists of two components

    The enterprise data flow architecture (EDFA)
    A logical data architecture

    Various layers/components of this data flow architecture are as follows:

    Presentation/desktop access layer – This is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, reporting and analysis suits offered by vendors such as Congas and business objects, and purpose built application such as balanced source cards and digital dashboards.

    Data Source Layer – Enterprise information derives from number of sources:

    Operational data – Data captured and maintained by an organization’s existing systems, and usually held in system-specific database or flat files.
    External Data – Data provided to an organization by external sources. This could include data such as customer demographic and market share information.
    Nonoperational data – Information needed by end user that is not currently maintained in a computer accessible format.

    Core data warehouse -This is where all the data of interest to an organization is captured and organized to assist reporting and analysis. DWs are normally instituted as large relational databases. A property constituted DW should support three basic form of an inquiry.

    Drilling up and drilling down – Using dimension of interest to the business, it should be possible to aggregate data as well as drill down. Attributes available at the more granular levels of the warehouse can also be used to refine the analysis.

    Drill across – Use common attributes to access a cross section of information in the warehouse such as sum sales across all product lines by customer and group of customers according to length of association with the company.
    Historical Analysis – The warehouse should support this by holding historical, time variant data. An example of historical analysis would be to report monthly store sales and then repeat the analysis using only customer who were preexisting at the start of the year in order to separate the effective new customer from the ability to generate repeat business with existing customers.

    Data Mart Layer- Data mart represents subset of information from the core DW selected and organized to meet the needs of a particular business unit or business line. Data mart can be relational databases or some form on-line analytical processing (OLAP) data structure.

    Data Staging and quality layer -This layer is responsible for data copying, transformation into DW format and quality control. It is particularly important that only reliable data into core DW. This layer needs to be able to deal with problems periodically thrown by operational systems such as change to account number format and reuse of old accounts and customer numbers.

    Data Access Layer -This layer operates to connect the data storage and quality layer with data stores in the data source layer and, in the process, avoiding the need to know to know exactly how these data stores are organized. Technology now permits SQL access to data even if it is not stored in a relational database.

    Data preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to precalculate the values that are loaded into OLAP data repositories to increase access speed. Data mining is concerned with exploring large volumes of data to determine patterns and trends in information. Data mining often identifies patterns that are counterintuitive due to the number and complexity of data relationships. Data quality needs to be very high so as not to corrupt the results.
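    The precalculation idea can be sketched as a tiny cube of aggregates, so that a data mart query becomes a dictionary lookup rather than a scan of the fact table. The fact rows and dimensions are hypothetical:

```python
# Hypothetical fact rows: (product_line, month, amount).
facts = [
    ("grocery",  "2014-01", 120.0),
    ("grocery",  "2014-02",  60.0),
    ("clothing", "2014-01",  80.0),
]

# Precalculate every aggregate the mart will serve ("*" stands for all
# values of that dimension), so answering a query is a single lookup.
cube = {}
for line, month, amount in facts:
    for key in [(line, month), (line, "*"), ("*", month), ("*", "*")]:
        cube[key] = cube.get(key, 0.0) + amount

total_grocery = cube[("grocery", "*")]   # all months for grocery
total_january = cube[("*", "2014-01")]   # all product lines for January
```

    The trade-off is the classic OLAP one: storage and load time are spent up front on aggregates so that end-user queries are fast.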

    Metadata repository layer – Metadata are data about data. The information held in the metadata layer needs to extend beyond data structure names and formats to provide detail on the business purpose and context of the data. The metadata layer should be comprehensive in scope, covering data as they flow between the various layers, including documentation of transformation and validation rules.

    Warehouse management layer – The function of this layer is the scheduling of the tasks necessary to build and maintain the DW and to populate data marts. This layer is also involved in the administration of security.

    Application messaging layer – This layer is concerned with transporting information between the various layers. In addition to business data, this layer encompasses the generation, storage and targeted communication of control messages.

    Internet/intranet layer – This layer is concerned with basic data communication. Included here are browser-based user interfaces and TCP/IP networking.

    Various analysis models used by data architects/analysts follow:

    Activity or swim-lane diagram – Deconstructs business processes.

    Entity relationship diagram – Depicts data entities and how they relate. These data analysis methods obviously play an important part in developing an enterprise data model. However, it is also crucial that knowledgeable business operatives be involved in the process. This way, a proper understanding can be obtained of the business purpose and context of the data. It also mitigates the risk of replicating suboptimal data configurations from existing systems and databases into the DW.

    The following were incorrect answers:

    Data mart layer – A data mart represents a subset of information from the core DW, selected and organized to meet the needs of a particular business unit or business line. Data marts can be relational databases or some form of online analytical processing (OLAP) data structure.

    Data access layer – This layer connects the data staging and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how those data stores are organized. Technology now permits SQL access to data even if they are not stored in a relational database.
    Data preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to precalculate the values that are loaded into OLAP data repositories to increase access speed.

    Reference:

    CISA review manual 2014 Page number 188

  17. Which of the following properties of the core data warehouse layer of an enterprise data flow architecture uses common attributes to access a cross section of information in the warehouse?

    • Drill up
    • Drill down
    • Drill across
    • Historical Analysis
    Explanation:

    Drill across – Uses common attributes to access a cross section of information in the warehouse, such as summing sales across all product lines by customer and grouping customers according to length of association with the company.

    For the CISA exam you should know the following information about business intelligence:
    Business intelligence (BI) is a broad field of IT that encompasses the collection and analysis of information to assist decision making and assess organizational performance.

    To deliver effective BI, organizations need to design and implement a data architecture. The complete data architecture consists of two components:

    The enterprise data flow architecture (EDFA)
    A logical data architecture

    Various layers/components of this data flow architecture are as follows:

    Presentation/desktop access layer – This is where end users directly deal with information. This layer includes familiar desktop tools such as spreadsheets, direct querying tools, reporting and analysis suites offered by vendors such as Cognos and Business Objects, and purpose-built applications such as balanced scorecards and digital dashboards.

    Data source layer – Enterprise information derives from a number of sources:

    Operational data – Data captured and maintained by an organization’s existing systems, and usually held in system-specific databases or flat files.
    External data – Data provided to an organization by external sources. This could include data such as customer demographics and market share information.
    Nonoperational data – Information needed by end users that is not currently maintained in a computer-accessible format.

    Core data warehouse – This is where all the data of interest to an organization are captured and organized to assist reporting and analysis. DWs are normally implemented as large relational databases. A properly constituted DW should support three basic forms of inquiry.

    Drilling up and drilling down – Using dimensions of interest to the business, it should be possible to aggregate data (drill up) as well as break it down into finer detail (drill down). Attributes available at the more granular levels of the warehouse can also be used to refine the analysis.

    Drill across – Uses common attributes to access a cross section of information in the warehouse, such as summing sales across all product lines by customer and grouping customers according to length of association with the company.
    Historical analysis – The warehouse should support this by holding historical, time-variant data. An example of historical analysis would be to report monthly store sales, then repeat the analysis using only customers who already existed at the start of the year, in order to separate the effect of new customers from the ability to generate repeat business with existing customers.

    Data mart layer – A data mart represents a subset of information from the core DW, selected and organized to meet the needs of a particular business unit or business line. Data marts can be relational databases or some form of online analytical processing (OLAP) data structure.

    Data staging and quality layer – This layer is responsible for data copying, transformation into DW format and quality control. It is particularly important that only reliable data enter the core DW. This layer needs to be able to deal with problems periodically thrown up by operational systems, such as changes to account number formats and the reuse of old account and customer numbers.

    Data access layer – This layer connects the data staging and quality layer with the data stores in the data source layer and, in the process, avoids the need to know exactly how those data stores are organized. Technology now permits SQL access to data even if they are not stored in a relational database.

    Data preparation layer – This layer is concerned with the assembly and preparation of data for loading into data marts. The usual practice is to precalculate the values that are loaded into OLAP data repositories to increase access speed. Data mining is concerned with exploring large volumes of data to determine patterns and trends in information. Data mining often identifies patterns that are counterintuitive due to the number and complexity of data relationships. Data quality needs to be very high so as not to corrupt the results.

    Metadata repository layer – Metadata are data about data. The information held in the metadata layer needs to extend beyond data structure names and formats to provide detail on the business purpose and context of the data. The metadata layer should be comprehensive in scope, covering data as they flow between the various layers, including documentation of transformation and validation rules.

    Warehouse management layer – The function of this layer is the scheduling of the tasks necessary to build and maintain the DW and to populate data marts. This layer is also involved in the administration of security.

    Application messaging layer – This layer is concerned with transporting information between the various layers. In addition to business data, this layer encompasses the generation, storage and targeted communication of control messages.

    Internet/intranet layer – This layer is concerned with basic data communication. Included here are browser-based user interfaces and TCP/IP networking.

    Various analysis models used by data architects/analysts follow:

    Activity or swim-lane diagram – Deconstructs business processes.

    Entity relationship diagram – Depicts data entities and how they relate. These data analysis methods obviously play an important part in developing an enterprise data model. However, it is also crucial that knowledgeable business operatives be involved in the process. This way, a proper understanding can be obtained of the business purpose and context of the data. It also mitigates the risk of replicating suboptimal data configurations from existing systems and databases into the DW.

    The following were incorrect answers:

    Drilling up and drilling down – Using dimensions of interest to the business, it should be possible to aggregate data (drill up) as well as break it down into finer detail (drill down). Attributes available at the more granular levels of the warehouse can also be used to refine the analysis.

    Historical analysis – The warehouse should support this by holding historical, time-variant data. An example of historical analysis would be to report monthly store sales, then repeat the analysis using only customers who already existed at the start of the year, in order to separate the effect of new customers from the ability to generate repeat business with existing customers.

    Reference:

    CISA review manual 2014 Page number 188

  18. Which of the following levels in the CMMI model focuses on process innovation and continuous optimization?

    • Level 4
    • Level 5
    • Level 3
    • Level 2
    Explanation:

    Level 5 is the optimizing level and focuses on process innovation and continuous optimization.

    For the CISA exam you should know the following information about the Capability Maturity Model Integration (CMMI) model:

    Maturity model
    A maturity model can be viewed as a set of structured levels that describe how well the behaviors, practices and processes of an organization can reliably and sustainably produce required outcomes.

    CMMI Levels

    CISA Certified Information Systems Auditor Part 49 Q18 034

    A maturity model can be used as a benchmark for comparison and as an aid to understanding – for example, for comparative assessment of different organizations where there is something in common that can be used as a basis for comparison. In the case of the CMM, for example, the basis for comparison would be the organizations’ software development processes.
    Structure

    The model involves five aspects:

    Maturity Levels: a 5-level process maturity continuum – where the uppermost (5th) level is a notional ideal state where processes would be systematically managed by a combination of process optimization and continuous process improvement.
    Key Process Areas: a Key Process Area identifies a cluster of related activities that, when performed together, achieve a set of goals considered important.
    Goals: the goals of a key process area summarize the states that must exist for that key process area to have been implemented in an effective and lasting way. The extent to which the goals have been accomplished is an indicator of how much capability the organization has established at that maturity level. The goals signify the scope, boundaries, and intent of each key process area.
    Common Features: common features include practices that implement and institutionalize a key process area. There are five types of common features: commitment to perform, ability to perform, activities performed, measurement and analysis, and verifying implementation.
    Key Practices: The key practices describe the elements of infrastructure and practice that contribute most effectively to the implementation and institutionalization of the area.

    Levels
    There are five levels defined along the continuum of the model and, according to the SEI: “Predictability, effectiveness, and control of an organization’s software processes are believed to improve as the organization moves up these five levels. While not rigorous, the empirical evidence to date supports this belief.”

    Initial (chaotic, ad hoc, individual heroics) – the starting point for use of a new or undocumented repeat process.
    Repeatable – the process is at least documented sufficiently such that repeating the same steps may be attempted.
    Defined – the process is defined/confirmed as a standard business process, and decomposed to levels 0, 1 and 2 (the last being Work Instructions).
    Managed – the process is quantitatively managed in accordance with agreed-upon metrics.
    Optimizing – process management includes deliberate process optimization/improvement.

    Within each of these maturity levels are Key Process Areas which characterize that level, and for each such area there are five factors: goals, commitment, ability, measurement, and verification. These are not necessarily unique to CMM, representing — as they do — the stages that organizations must go through on the way to becoming mature.

    The model provides a theoretical continuum along which process maturity can be developed incrementally from one level to the next. Skipping levels is not allowed/feasible.

    Level 1 – Initial (Chaotic)
    It is characteristic of processes at this level that they are (typically) undocumented and in a state of dynamic change, tending to be driven in an ad hoc, uncontrolled and reactive manner by users or events. This provides a chaotic or unstable environment for the processes.

    Level 2 – Repeatable
    It is characteristic of processes at this level that some processes are repeatable, possibly with consistent results. Process discipline is unlikely to be rigorous, but where it exists it may help to ensure that existing processes are maintained during times of stress.

    Level 3 – Defined
    It is characteristic of processes at this level that there are sets of defined and documented standard processes established and subject to some degree of improvement over time. These standard processes are in place (i.e., they are the AS-IS processes) and used to establish consistency of process performance across the organization.

    Level 4 – Managed
    It is characteristic of processes at this level that, using process metrics, management can effectively control the AS-IS process (e.g., for software development). In particular, management can identify ways to adjust and adapt the process to particular projects without measurable losses of quality or deviations from specifications. Process Capability is established from this level.

    Level 5 – Optimizing
    It is a characteristic of processes at this level that the focus is on continually improving process performance through both incremental and innovative technological changes/improvements.

    At maturity level 5, processes are concerned with addressing statistical common causes of process variation and changing the process (for example, to shift the mean of the process performance) to improve process performance. This would be done at the same time as maintaining the likelihood of achieving the established quantitative process-improvement objectives.
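    Quantitative management at levels 4 and 5 can be pictured with a simple control-chart check. This is a sketch using hypothetical build-time metrics, an illustration of statistical process control rather than a technique prescribed by CMMI:

```python
from statistics import mean, stdev

# Hypothetical process metric: build times in hours for recent projects.
build_times = [4.1, 3.9, 4.3, 4.0, 4.2, 4.1, 3.8, 4.0]

# Level 4 view: control limits derived from common-cause variation.
m = mean(build_times)
s = stdev(build_times)
lower, upper = m - 3 * s, m + 3 * s

def in_control(observation):
    """True if the observation shows only common-cause variation."""
    return lower <= observation <= upper

# Level 5 view: a deliberate process change tries to shift the mean itself
# while still meeting the agreed quantitative improvement objective.
objective = 4.5  # hypothetical target: mean build time below 4.5 hours
meets_objective = m < objective
```

    An observation outside the limits signals special-cause variation for management to investigate (the level 4 concern), while shifting the mean of the whole distribution is the level 5 concern described above.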

    The following were incorrect answers:

    Level 4 – Focus on process management and process control
    Level 3 – Process definition and process deployment.
    Level 2 – Performance management and work product management.

    Reference:

    CISA review manual 2014 Page number 188

  19. Which of the following levels in the CMMI model focuses on process definition and process deployment?

    • Level 4
    • Level 5
    • Level 3
    • Level 2
    Explanation:

    Level 3 is the defined level and focuses on process definition and process deployment.

    For the CISA exam you should know the following information about the Capability Maturity Model Integration (CMMI) model:

    Maturity model
    A maturity model can be viewed as a set of structured levels that describe how well the behaviors, practices and processes of an organization can reliably and sustainably produce required outcomes.

    CMMI Levels

    CISA Certified Information Systems Auditor Part 49 Q19 035

    A maturity model can be used as a benchmark for comparison and as an aid to understanding – for example, for comparative assessment of different organizations where there is something in common that can be used as a basis for comparison. In the case of the CMM, for example, the basis for comparison would be the organizations’ software development processes.
    Structure

    The model involves five aspects:

    Maturity Levels: a 5-level process maturity continuum – where the uppermost (5th) level is a notional ideal state where processes would be systematically managed by a combination of process optimization and continuous process improvement.
    Key Process Areas: a Key Process Area identifies a cluster of related activities that, when performed together, achieve a set of goals considered important.
    Goals: the goals of a key process area summarize the states that must exist for that key process area to have been implemented in an effective and lasting way. The extent to which the goals have been accomplished is an indicator of how much capability the organization has established at that maturity level. The goals signify the scope, boundaries, and intent of each key process area.
    Common Features: common features include practices that implement and institutionalize a key process area. There are five types of common features: commitment to perform, ability to perform, activities performed, measurement and analysis, and verifying implementation.
    Key Practices: The key practices describe the elements of infrastructure and practice that contribute most effectively to the implementation and institutionalization of the area.

    Levels
    There are five levels defined along the continuum of the model and, according to the SEI: “Predictability, effectiveness, and control of an organization’s software processes are believed to improve as the organization moves up these five levels. While not rigorous, the empirical evidence to date supports this belief.”

    Initial (chaotic, ad hoc, individual heroics) – the starting point for use of a new or undocumented repeat process.
    Repeatable – the process is at least documented sufficiently such that repeating the same steps may be attempted.
    Defined – the process is defined/confirmed as a standard business process, and decomposed to levels 0, 1 and 2 (the last being Work Instructions).
    Managed – the process is quantitatively managed in accordance with agreed-upon metrics.
    Optimizing – process management includes deliberate process optimization/improvement.

    Within each of these maturity levels are Key Process Areas which characterize that level, and for each such area there are five factors: goals, commitment, ability, measurement, and verification. These are not necessarily unique to CMM, representing — as they do — the stages that organizations must go through on the way to becoming mature.

    The model provides a theoretical continuum along which process maturity can be developed incrementally from one level to the next. Skipping levels is not allowed/feasible.

    Level 1 – Initial (Chaotic)
    It is characteristic of processes at this level that they are (typically) undocumented and in a state of dynamic change, tending to be driven in an ad hoc, uncontrolled and reactive manner by users or events. This provides a chaotic or unstable environment for the processes.

    Level 2 – Repeatable
    It is characteristic of processes at this level that some processes are repeatable, possibly with consistent results. Process discipline is unlikely to be rigorous, but where it exists it may help to ensure that existing processes are maintained during times of stress.

    Level 3 – Defined
    It is characteristic of processes at this level that there are sets of defined and documented standard processes established and subject to some degree of improvement over time. These standard processes are in place (i.e., they are the AS-IS processes) and used to establish consistency of process performance across the organization.

    Level 4 – Managed
    It is characteristic of processes at this level that, using process metrics, management can effectively control the AS-IS process (e.g., for software development). In particular, management can identify ways to adjust and adapt the process to particular projects without measurable losses of quality or deviations from specifications. Process Capability is established from this level.

    Level 5 – Optimizing
    It is a characteristic of processes at this level that the focus is on continually improving process performance through both incremental and innovative technological changes/improvements.

    At maturity level 5, processes are concerned with addressing statistical common causes of process variation and changing the process (for example, to shift the mean of the process performance) to improve process performance. This would be done at the same time as maintaining the likelihood of achieving the established quantitative process-improvement objectives.

    The following were incorrect answers:

    Level 4 – Focus on process management and process control
    Level 5 – Process innovation and continuous optimization.
    Level 2 – Performance management and work product management.

    Reference:

    CISA review manual 2014 Page number 188

  20. ISO 9126 is a standard to assist in evaluating the quality of a product. Which of the following is defined as a set of attributes that bear on the existence of a set of functions and their specified properties?

    • Reliability
    • Usability
    • Functionality
    • Maintainability
    Explanation:

    Functionality – A set of attributes that bear on the existence of a set of functions and their specified properties.

    The functions are those that satisfy stated or implied needs. The sub-characteristics of functionality are:
    Suitability
    Accuracy
    Interoperability
    Security
    Functionality Compliance

    For the CISA exam you should know the following information about the ISO 9126 model:

    ISO/IEC 9126 Software engineering — Product quality was an international standard for the evaluation of software quality. It has been replaced by ISO/IEC 25010:2011.[1] The fundamental objective of the ISO/IEC 9126 standard is to address some of the well-known human biases that can adversely affect the delivery and perception of a software development project. These biases include changing priorities after the start of a project or not having any clear definitions of “success.” By clarifying, then agreeing on the project priorities and subsequently converting abstract priorities (compliance) to measurable values (output data can be validated against schema X with zero intervention), ISO/IEC 9126 tries to develop a common understanding of the project’s objectives and goals.

    ISO 9126

    CISA Certified Information Systems Auditor Part 49 Q20 036

    The standard is divided into four parts:

    Quality model
    External metrics
    Internal metrics
    Quality in use metrics.

    Quality Model
    The quality model presented in the first part of the standard, ISO/IEC 9126-1,[2] classifies software quality in a structured set of characteristics and sub-characteristics as follows:

    Functionality – A set of attributes that bear on the existence of a set of functions and their specified properties. The functions are those that satisfy stated or implied needs.
    Suitability
    Accuracy
    Interoperability
    Security
    Functionality Compliance

    Reliability – A set of attributes that bear on the capability of software to maintain its level of performance under stated conditions for a stated period of time.
    Maturity
    Fault Tolerance
    Recoverability
    Reliability Compliance

    Usability – A set of attributes that bear on the effort needed for use, and on the individual assessment of such use, by a stated or implied set of users.
    Understandability
    Learnability
    Operability
    Attractiveness
    Usability Compliance

    Efficiency – A set of attributes that bear on the relationship between the level of performance of the software and the amount of resources used, under stated conditions.
    Time Behavior
    Resource Utilization
    Efficiency Compliance

    Maintainability – A set of attributes that bear on the effort needed to make specified modifications.
    Analyzability
    Changeability
    Stability
    Testability
    Maintainability Compliance

    Portability – A set of attributes that bear on the ability of software to be transferred from one environment to another.
    Adaptability
    Installability
    Co-existence
    Replaceability
    Portability Compliance

    Each quality sub-characteristic (e.g. adaptability) is further divided into attributes. An attribute is an entity which can be verified or measured in the software product. Attributes are not defined in the standard, as they vary between different software products.

    The software product is defined in a broad sense: it encompasses executables, source code, architecture descriptions, and so on. As a result, the notion of user extends to operators as well as to programmers, who are users of components such as software libraries.

    The standard provides a framework for organizations to define a quality model for a software product. In doing so, however, it leaves up to each organization the task of specifying precisely its own model. This may be done, for example, by specifying target values for quality metrics that evaluate the degree of presence of quality attributes.
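    A sketch of what such an organization-specific model might look like in practice; the metric names, measured values and target thresholds are hypothetical, chosen only to illustrate the idea of target values per quality characteristic, and are not taken from the standard itself:

```python
# Hypothetical organization-specific quality model: target values for
# quality metrics, plus a check of whether each attribute meets its target.
targets = {
    "recoverability_mttr_minutes": ("max", 30),   # reliability
    "task_completion_rate":        ("min", 0.90), # usability
    "mean_response_ms":            ("max", 200),  # efficiency
}

measured = {
    "recoverability_mttr_minutes": 12,
    "task_completion_rate": 0.94,
    "mean_response_ms": 250,
}

def evaluate(targets, measured):
    """Return, per metric, whether the measured value meets its target."""
    results = {}
    for metric, (direction, target) in targets.items():
        value = measured[metric]
        results[metric] = value <= target if direction == "max" else value >= target
    return results

results = evaluate(targets, measured)
```

    Each target is tied to one of the standard's characteristics (reliability, usability, efficiency), which is exactly the tailoring step the standard leaves to the organization.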

    Internal Metrics
    Internal metrics are those that do not rely on software execution (static measures).

    External Metrics
    External metrics are applicable to running software.

    Quality in Use Metrics
    Quality in use metrics are only available when the final product is used in real conditions.
    Ideally, the internal quality determines the external quality and external quality determines quality in use.

    This standard stems from the GE model for describing software quality, presented in 1977 by McCall et al., which is organized around three types of Quality Characteristics:

    Factors (To specify): They describe the external view of the software, as viewed by the users.
    Criteria (To build): They describe the internal view of the software, as seen by the developer.
    Metrics (To control): They are defined and used to provide a scale and method for measurement.

    ISO/IEC 9126 distinguishes between a defect and a nonconformity, a defect being “the nonfulfillment of intended usage requirements,” whereas a nonconformity is “the nonfulfillment of specified requirements.” A similar distinction is made between validation and verification, known as V&V in the testing trade.

    The following were incorrect answers:

    Reliability – A set of attributes that bear on the capability of software to maintain its level of performance under stated conditions for a stated period of time.
    Usability – A set of attributes that bear on the effort needed for use, and on the individual assessment of such use, by a stated or implied set of users.
    Maintainability – A set of attributes that bear on the effort needed to make specified modifications.

    Reference:

    CISA review manual 2014 Page number 188