An InfoCube is the central object in BI for the storage of data. It is the most used object for storing data and generating queries. A DataStore object serves as a storage location for consolidated and cleansed transaction data.
Copyright:
Attribution Non-Commercial (BY-NC)
The InfoProvider types covered here are:

- InfoCubes: Standard and Real Time
- MultiProviders
- InfoObjects: Characteristic
- InfoSets
- DataStores: Standard, Direct Update, and Write-Optimized
- Virtual InfoCubes: Data Transfer Process, BAPI, and function module

Data Targets: An InfoCube is the central object in BI for the storage of data. It is the most used object for storing data and generating queries, due to the relational structure of its tables. The structure of the InfoCube was designed specifically so that query processes can be executed as quickly as possible. An InfoCube can answer questions about events that have happened, but not about events that have not happened. For example, suppose you are interested in knowing which production orders ran during the month without any variance between standard costs and actual costs. The InfoCube doesn't hold information about any event that has not happened, and a production order with no variance is exactly such a non-event.

A DataStore object serves as a storage location for consolidated and cleansed transaction data at a document (operational) data level, or it can be used to store master data for analysis. With DataStore objects it is also possible to overwrite data fields, which is particularly important with document-related structures. If documents (for example, sales order transactions) are changed in the source system, these changes, both to numeric fields such as the order quantity and to nonnumeric fields such as the ship-to party, status, and delivery date, can be updated into the DataStore object, and the change statuses can be tracked in the DataStore object's change log. A DataStore object can also be used to update the current data into another object. This uses the delta update functionality and can update data into InfoCubes and/or other DataStore objects or master data tables (attributes or texts), in the same system or across different systems.
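The overwrite-plus-change-log behavior described above can be illustrated with a minimal sketch. This is not SAP code; the class, keys, and field names are hypothetical stand-ins used to show the idea: a new record for an existing document key overwrites the active record, and the before/after images captured in the change log are what make delta updates to downstream targets possible.

```python
# Hypothetical sketch of DataStore overwrite semantics with a change log.
# Names (DataStore, activate, "SO-1001") are illustrative, not SAP APIs.

class DataStore:
    def __init__(self):
        self.active = {}      # document key -> current record (active data)
        self.change_log = []  # before/after images, the basis for delta updates

    def activate(self, key, record):
        before = self.active.get(key)
        if before is not None:
            # record the old values so a delta can reverse them downstream
            self.change_log.append(("before", key, before))
        self.active[key] = record  # overwrite both numeric and nonnumeric fields
        self.change_log.append(("after", key, record))

ds = DataStore()
ds.activate("SO-1001", {"qty": 10, "ship_to": "ACME"})
# the sales order is changed in the source system and loaded again:
ds.activate("SO-1001", {"qty": 12, "ship_to": "ACME Corp"})

print(ds.active["SO-1001"])  # latest status only; old values live in the change log
print(len(ds.change_log))
```

The active table holds only the latest status of each document, while the change log retains the history needed to propagate deltas to InfoCubes or other DataStore objects.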
You can designate an InfoObject of type Characteristic as an InfoProvider if it has attributes and/or texts. The data is then loaded into the master data tables using the transformation rules. It is not possible to use transformation rules to load hierarchies. You also need to fill the field for the connection to the InfoArea so that the InfoObject can be identified as an InfoProvider, which then allows you to execute a query against this object.

Non-Data Targets: As mentioned before, the Data Target InfoProviders are only able to show you events that have happened, but with Non-Data Targets you have the ability to create combinations of data so that additional reporting requests can be satisfied.

A query based on a MultiProvider is divided internally into subqueries, one for each InfoProvider included in the MultiProvider. These subqueries are usually processed in parallel. Technically there are no restrictions on the number of InfoProviders that can be included in a MultiProvider. However, I recommend that you include no more than ten InfoProviders in a single MultiProvider, because splitting the MultiProvider query and reconstructing the results from the individual InfoProviders takes a substantial amount of time and is generally counterproductive.

InfoSets are a specific kind of InfoProvider that describe DataSources defined as joins of DataStore objects, InfoCubes, and InfoObjects (but not other MultiProviders). Because the InfoSet is a join of data, you can control the view of the information coming from an InfoSet. By this I mean that you can set up the InfoSet so that the query can answer questions about an event that has not happened. For example, if you use a master-data-bearing InfoObject during the creation of the InfoSet and identify it as the outer join, the master data will be a required value in the column and will therefore show situations where there has been no activity.
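The MultiProvider behavior described above, splitting one query into per-InfoProvider subqueries, running them in parallel, and unioning the results, can be sketched as follows. This is a conceptual model in Python, not the OLAP engine's actual implementation; the provider names and row shapes are invented for illustration:

```python
# Hypothetical sketch: one subquery per InfoProvider in the MultiProvider,
# executed in parallel, results unioned into a single result set.

from concurrent.futures import ThreadPoolExecutor

def subquery(provider):
    # stand-in for reading one InfoProvider; in BI this goes through the OLAP engine
    return provider["rows"]

multiprovider = [
    {"name": "SALES_CUBE", "rows": [("P1", 100)]},
    {"name": "PLAN_DSO",   "rows": [("P1", 90), ("P2", 80)]},
]

with ThreadPoolExecutor() as pool:
    partial_results = list(pool.map(subquery, multiprovider))

# the final result is the union of all subquery results
result = [row for part in partial_results for row in part]
print(result)
```

This also makes the ten-provider guideline intuitive: each additional InfoProvider adds another subquery to split off and another partial result to merge back in.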
As a real-world example, I worked with a company that wanted to see what each of its different divisions sold during each month: not only the products they did sell, but also the products they didn't sell. By setting up an InfoSet and using the master data as the left outer join, I was able to generate a query that listed all the divisions and products in the rows and the periods in the columns. All products were displayed, even if there were no sales for a given period.

Note: A join condition determines the combination of individual object records that are included in the results set. InfoSets allow you to analyze the data in several InfoProviders by using combinations of master-data-bearing characteristics, InfoCubes, and DataStore objects. The system collects information from the tables of the relevant InfoProviders. When an InfoSet is made up of several characteristics, you can map transitive attributes and analyze this master data.

Note: Transitive attributes are attributes at the secondary level. Suppose, for example, you have an InfoObject called Customer that has an attribute of Region, and that attribute, Region, has an attribute of Country. You can set up a process so that you can report on Country via Customer.

Virtual InfoCubes are InfoProviders whose transaction data is not stored in the objects themselves but is read directly for analysis and reporting purposes. The relevant data can come from the BI system or from other SAP or non-SAP systems. Virtual Providers only allow read access to data. The different Virtual Providers are based on the data transfer process, on a BAPI (Business Application Programming Interface), or on a function module.

A Virtual Provider based on the Data Transfer Process is used if you require up-to-date information from an SAP source system, but it is suggested that you access only a small set of data and use this process only for a specific set of users.
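The divisions-and-products scenario above can be sketched as a left outer join in miniature. The data is invented for illustration, but the mechanics match the InfoSet setup described: master data (every product in every division) forms the left side, transaction data the right side, so products with no sales still appear in the result, which is exactly the "event that has not happened":

```python
# Sketch of the InfoSet left outer join: master data on the left,
# sales transactions on the right. Divisions/products are hypothetical.

products = [("DIV-A", "P1"), ("DIV-A", "P2"), ("DIV-B", "P3")]  # master data
sales = {("DIV-A", "P1"): 500}                                   # transaction data

# left outer join: every master data row survives; missing sales default to 0
report = [(div, prod, sales.get((div, prod), 0)) for div, prod in products]
for row in report:
    print(row)
```

An inner join, by contrast, would return only the single ("DIV-A", "P1") row, hiding the products that did not sell, which is the limitation of Data Target InfoProviders that the InfoSet works around.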
There are specific situations where this type of Virtual Provider is important. For example, if you want to validate information that was uploaded into the BI system, you can use this Virtual Provider to access the same table you used for the upload. You could create a query against the Virtual Provider to review the information directly from the source system and then compare it to the uploaded data. More than likely you would compare the data from the Virtual Provider query with the data in the PSA (Persistent Staging Area), since the PSA is the initial table used as the upload/staging area and its data has not been altered by any programs in BI.

A Virtual Provider based on a BAPI differs from the DTP Virtual Provider in that the transaction data is read for analysis and reporting from an external system using a BAPI (Business Application Programming Interface), not specifically from an SAP source.

Virtual Providers based on function modules are used if you want to display data from non-BI DataSources in BI without having to copy the dataset into the BI structures. The data can be local or remote. You can also use your own calculations to change the data before it is passed to the OLAP processor. In comparison to the other Virtual Providers, this one is more generic. It offers greater flexibility, but also requires a higher implementation effort.
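The function-module variant can be sketched conceptually: the provider persists nothing, and every query calls a user-supplied read function that fetches the data on demand and may recalculate it before handing it to the query processor. All names here (VirtualProvider, remote_read, the plant/qty records) are hypothetical illustrations, not SAP function-module interfaces:

```python
# Hypothetical sketch of a function-module-style Virtual Provider:
# no stored data; a read function is called at query time, and an
# optional transform recalculates rows before they reach the OLAP layer.

def remote_read(selection):
    # stand-in for a remote call to a non-BI DataSource
    source = [{"plant": "1000", "qty": 5}, {"plant": "2000", "qty": 7}]
    return [r for r in source if r["plant"] in selection["plant"]]

class VirtualProvider:
    def __init__(self, read_fn, transform=None):
        self.read_fn = read_fn                       # where the data comes from
        self.transform = transform or (lambda rows: rows)

    def query(self, selection):
        # fetched fresh on every query; nothing is persisted in BI
        return self.transform(self.read_fn(selection))

vp = VirtualProvider(remote_read,
                     transform=lambda rows: [dict(r, qty=r["qty"] * 2) for r in rows])
print(vp.query({"plant": {"1000"}}))
```

The extra flexibility, and the extra implementation effort, both come from the same place: you own the read function and the transformation logic entirely.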