SQL Quiz-Y


Topic

SQL
SQL
SQL
SQL
SQL

Question
Which of the following is a SQL statement that allows you to add a DEFAULT value for a column in a particular table?
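For context, a hedged example of the ALTER TABLE form this question points at (the table and column names are hypothetical):

ALTER TABLE employees MODIFY (status DEFAULT 'ACTIVE');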

SQL

CREATE table Test (


col1 NUMBER PRIMARY KEY
, col2 NUMBER UNIQUE
, col3 char(4) DEFAULT 1234
, col4 DATE DEFAULT 01-01-2011);
Why does the CREATE statement fail?
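A minimal sketch (assuming an Oracle database) of how the statement could be corrected; this hypothetical fix quotes the CHAR default and supplies the DATE default as an explicit date expression:

CREATE TABLE test (
  col1 NUMBER PRIMARY KEY,
  col2 NUMBER UNIQUE,
  col3 CHAR(4) DEFAULT '1234',
  col4 DATE DEFAULT TO_DATE('01-01-2011', 'DD-MM-YYYY'));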

SQL

Which statement is true about sequences created in a single instance


database?
Evaluate the CREATE TABLE statement:

SQL

CREATE TABLE products


(product_id NUMBER(6) CONSTRAINT prod_id_pk PRIMARY KEY,
product_name VARCHAR2(15));
Which statement is true regarding the PROD_ID_PK constraint?

SQL

Which statement is true?

SQL
SQL
SQL

Evaluate the following CREATE SEQUENCE statement:


CREATE SEQUENCE seq1
START WITH 100
INCREMENT BY 10
MAXVALUE 200
CYCLE
NOCACHE;
The sequence seq1 has generated numbers up to the maximum limit of
200. You issue the following SQL statement:
SELECT seq1.nextval FROM dual;
What is displayed by the SELECT statement?
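For reference, a hedged illustration of the behaviour: with CYCLE and no explicit MINVALUE, an ascending Oracle sequence restarts from its default MINVALUE of 1 after reaching MAXVALUE, so the next fetch would be expected to return 1:

SELECT seq1.nextval FROM dual;   -- after 200 is reached, CYCLE restarts the sequence at MINVALUE (default 1)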
A UNIQUE database constraint accepts
Which of the following is used to delete duplicate rows from a table?
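As an illustration, a common ROWID-based pattern for removing duplicates (a sketch using a hypothetical table t and key column col1):

DELETE FROM t
 WHERE rowid NOT IN (SELECT MIN(rowid)
                       FROM t
                      GROUP BY col1);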

SQL
SQL

CONNECT BY is used to
Which of the following is a pseudocolumn?

SQL
SQL
SQL

SQL Loader
Which Oracle function below can only be used on the NUMBER datatype?
The GREATEST Oracle function works on

SQL

HIREDATE is a DATE column in the Employees table, and we need to select data for employees who were hired in the month of December. Which of the following can be used?
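For illustration, hedged examples of the kinds of predicates the options refer to (assuming an Oracle EMPLOYEES table with a HIREDATE column):

SELECT *
  FROM employees
 WHERE TO_CHAR(hiredate, 'MM') = '12';

SELECT *
  FROM employees
 WHERE EXTRACT(MONTH FROM hiredate) = 12;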
At least how many join conditions are needed to join 3 tables in Oracle to avoid a Cartesian product?
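A quick sketch of the rule of thumb that N tables need at least N-1 join conditions; the table and column names here are hypothetical:

SELECT *
  FROM emp e, dept d, loc l
 WHERE e.deptno = d.deptno   -- join condition 1
   AND d.locno  = l.locno;   -- join condition 2 (two conditions for three tables)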

SQL

Which statement is true regarding the EXISTS operator used in the


correlated subqueries?

SQL

A non-correlated subquery can be defined as_____

SQL

Which statement is true regarding the hierarchical query in Oracle


Database?

SQL

Which statement is true regarding views?

SQL

Which Statement best describes the GROUPING function?

SQL

Which statement is true regarding operators used with subqueries

SQL

Which statement is true regarding the ROLLUP operator specified in the


GROUP BY clause of a SQL statement?

SQL

Which statement is true regarding multiple-row subqueries?

SQL

Evaluate the following ALTER TABLE statement:

SQL
SQL

ALTER TABLE orders


SET UNUSED order_date;
Which statement is true?
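A hedged sketch of the SET UNUSED workflow the question is testing (Oracle syntax; a column marked unused no longer appears in DESCRIBE and can only be physically removed later):

ALTER TABLE orders SET UNUSED (order_date);
ALTER TABLE orders DROP UNUSED COLUMNS;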
Which statement indicates the end of a transaction?

You need to create a table with the following column specifications:


1) Employee ID (numeric data type) for each employee
2) Employee Name, (character data type) which stores the employee
name
3) Hire date, to store the date when the employee joined the
organization
4) Status (character data type). It should contain the value 'ACTIVE' if no data is
entered
5) Resume (character large object [CLOB] data type), which would
contain the resume submitted by the employee
SQL

Which is the correct syntax to create this table?

Options

UPDATE TABLE
SET DEFAULT
ALTER TABLE
ALTER COLUMN
MODIFY COLUMN

Answer Option1

UPDATE TABLE
x

A DATE field cannot have a


numeric default

The numbers generated by a


sequence can be used only for
one table

It would be created only if a


unique index is manually
created first.
The USER_SYNONYMS view can
provide information about
private synonyms

1
Duplicate Values
ROWNUM
Join two different tables
CURRVAL
is oracle utility
COUNT
NUMBER datatype

EXTRACT Function
One join condition

The outer stops evaluating the


result set of the inner query
when the first value is found

a set of sequential queries, all


of which must always return a
single value

It is possible to retrieve data


only in top-down hierarchy

A simple view in which column


aliases have been used cannot
be updated
It is used to set the order for
the groups to be used for
calculating the grand totals and
subtotals
The NOT IN operator is
equivalent to IS NULL

It produces only the subtotals


for the groups specified in the
GROUP BY clause.
They can contain group
functions

The DESCRIBE command would


still display the ORDER_DATE
column.
SELECT

CREATE TABLE EMP_1


(emp_id NUMBER(4),
emp_name VARCHAR2(25),
start_date DATE,
e_status VARCHAR2(10)
DEFAULT 'ACTIVE',
resume CLOB(200));

Option2

Option3

Option4

SET DEFAULT

ALTER TABLE

ALTER COLUMN

Number column cannot be


defined as UNIQUE

A Char field cannot have a


numeric default

A number column cannot be a


primary key

When a database instance shuts down abnormally, the sequence numbers that have been cached but not used would be available once again when the database instance is restarted
DELETE <sequence name> would remove a sequence from the database
CURRVAL is used to refer to the last sequence number that has been generated

It would be created and would use an automatically created unique index

It would be created and would use an automatically created nonunique index

The user SYSTEM owns all the base tables and user-accessible views of the data dictionary
All the dynamic performance views prefixed with V$ are accessible to all the database users

10
only one NULL
DISTINCT
to specify relation between
parent and child rows
LEVEL
is oracle command
MIN
DATE datatype

It would be created and


remains in a disabled state
because no index is specified in
the command
The USER_OBJECTS view can
provide information about the
tables and views created by
the user only

100
an error
any number of NULLS
none of the above
ROWID
none of the above
to specify connection between
two tables
none of the above
NEXTVAL
ROWNUM
is oracle function
is a third party file loading software
AVG
MAX
CHAR datatype
All of the above

TO_CHAR Function

TO_DATE Function

Option 1 & 2

Two join conditions

Three join conditions

Four join conditions

It is used to test whether the


values retrieved by the inner
query exist in the result of the
outer query

The outer query continues


evaluating the result set of the
inner query until all the values
in the result set are processed none of the above

a set of sequential queries, all


of which must return values
from the same table

a set of one or more sequential queries in which generally the result of the inner query is used as the search value in the outer query
a SELECT statement that can be embedded in a clause of another SELECT statement only

It is possible to retrieve data in top-down & bottom-up hierarchy
You cannot specify conditions when you retrieve data by using a hierarchical query

Hierarchy can have only one


root value

The WITH CHECK OPTION


constraint can be used in a
view definition to restrict the
columns displayed through the
view

Rows added through a view are


deleted from the table
automatically when the view is
dropped

The OR REPLACE option is used


to change the definition of an
existing view without dropping
and re-creating it.

It is used to form various groups to calculate total and subtotals created using ROLLUP and CUBE operators
It is used to identify if the NULL values in an expression are stored NULL values or created by ROLLUP or CUBE

It is used to specify the


concatenated group
expressions to be used for
calculating the grand totals and
subtotals

The <ANY operator means less than the maximum
The =ANY and =ALL operators have the same functionality
The IN operator cannot be used in single-row subqueries

It produces only the grand


totals for the groups specified
in the GROUP BY clause.
They always contain a
subquery within a subquery

It produces higher-level
subtotals, moving from right to
left through the list of grouping
columns specified in the
GROUP BY clause

It produces higher-level
subtotals, moving in all the
directions through the list of
grouping columns specified in
the GROUP BY clause
They use the <ALL operator to imply less than the maximum
They can be used to retrieve multiple rows from a single table only

ROLLBACK can be used to get back the ORDER_DATE column in the ORDERS table.
The ORDER_DATE column should be empty for the ALTER TABLE command to execute successfully
INSERT
DELETE

After executing the ALTER


TABLE command, you can add
a new column called
ORDER_DATE to the ORDERS
table.
COMMIT

CREATE TABLE 1_EMP


(emp_id NUMBER(4),
emp_name VARCHAR2(25),
start_date DATE,
e_status VARCHAR2(10)
DEFAULT 'ACTIVE',
resume CLOB);

CREATE TABLE 1_EMP


(emp_id NUMBER(4),
emp_name VARCHAR2(25),
start_date DATE,
e_status VARCHAR2(10)
DEFAULT "ACTIVE",
resume CLOB);

CREATE TABLE EMP_1


(emp_id NUMBER,
emp_name VARCHAR2(25),
start_date DATE,
e_status VARCHAR2(10)
DEFAULT 'ACTIVE',
resume CLOB);

Option5

Answer

MODIFY COLUMN

None of the above

1
3
3

all of the above

2
5
1
3
4

None of the above

None of the above

A subquery used in a complex


view definition cannot contain
group functions or joins

None of the above

SAVEPOINT

4
4

Topic

Question

INFORMATICA

Source Qualifier Transformation is a

INFORMATICA
INFORMATICA

You specify the Target Load Order based on


In Source Qualifier, the default query is a SELECT
statement for each column

INFORMATICA

Source Qualifier for a Relational file doesn't allow you to


enter the SQL override

INFORMATICA

Target Load Order is specified in the Designer in

INFORMATICA

Reusable Source Qualifier Transformation can be created in

INFORMATICA

A filter condition was entered in the Source Filter on the


Transformation Properties tab
When left outer and right outer joins are to be combined
with normal joins in a single source qualifier, then the
order in which they should be entered is
A Source Qualifier is connected to the Source definition
having 28 Columns, out

INFORMATICA

Expression Transformation is

INFORMATICA

A variable port in an Expression Transformation can be


connected

INFORMATICA

Variable ports in the Expression Transformations can hold


the calculated values like Averages, Sums etc.,

INFORMATICA

INFORMATICA

INFORMATICA

Expressions can be entered in any


The number of Expressions you can enter for an Output
Port in an Expression Transformation is

INFORMATICA

A mapping has 2 Targets, TableA, TableB. TableA has a


primary key and a foreign key referencing the primary key
in TableB. TableB has a primary key and a foreign key
referencing the primary key of TableA. When you configure
the session to use Constraint Based Loading, the
Informatica Server

INFORMATICA

INFORMATICA

In a certain mapping, there are 4 targets Tgt1, Tgt2, Tgt3,


Tgt4. Tgt1 has a primary key, Tgt2 and Tgt3 contain
foreign keys referencing the Tgt1 primary key. Tgt3 has a
primary key that Tgt2 and Tgt4 references as a foreign
key. Tgt2 has a foreign key referencing the primary key of
Tgt4. The order in which the Informatica Server loads the
targets

INFORMATICA

The default port types for the ports in the Joiner


Transformation after adding the Master and Detail sources

INFORMATICA

The Joiner Transformation supports

INFORMATICA

Joiner Transformation cannot be used in which of the


following situations

INFORMATICA

When a Joiner transformation occurs in a session, the


Informatica Server reads all the records from the

INFORMATICA

Both Filter and Joiner Transformations are


If the filter condition entered in the Filter Transformation
evaluates to NULL, then the

INFORMATICA

For the relational sources, filter condition in the Filter


Transformation

INFORMATICA

INFORMATICA
INFORMATICA

RANKINDEX port that the Designer automatically creates is


What are the transformation port types available in the
Sequence Generator Transformation?

INFORMATICA

After every session run for a mapping containing Sequence


Generator, the Informatica Server writes the Current Value
to
NEXTVAL, CURRVAL ports of the Sequence Generator are
by default

INFORMATICA

The default value for Number Of Cached Values for a


Reusable Sequence Generator Transformation is

INFORMATICA

INFORMATICA

INFORMATICA
INFORMATICA

To generate sequential values using Sequence Generator


Transformation, you need to connect
The number of Output Groups created when you enter 4
conditions in the Groups tab of the Router Transformation
How many types of Output Groups are there in the Router
Transformations

INFORMATICA

If a row meets 3 group filter conditions out of total 5 group


conditions entered in the Router Transformations, then the
Informatica Server passes the row
In the Router Transformation, the default port type of the
ports is

INFORMATICA

The records that fall into the Default Group

INFORMATICA

In the Router Transformation, when the last user-defined


group is deleted, then the

INFORMATICA

Lookup Transformation can be used to

INFORMATICA

Lookup Transformation is

INFORMATICA

By default the lookup cache in Lookup Transformation is


When Informatica Server encounters multiple matches
while processing a Lookup Transformation with a dynamic
cache, then it

INFORMATICA

INFORMATICA
INFORMATICA

INFORMATICA
INFORMATICA

The minimum size of the Lookup Index Cache size is


When multiple conditions are entered in the Lookup
Transformation conditions tab, the Informatica Server
evaluates each condition as
Stored Procedure Transformations are created by default
as

INFORMATICA

The Call Text attribute in the Stored Procedure


Transformation is used when
The type of Stored Procedure Transformation used when
parameters are passed to it to retrieve multiple
parameters is
If a session is created for a mapping that uses a Stored
Procedure Transformation with an error, then the
Informatica Server

INFORMATICA

The port types in Stored Procedure Transformation are

INFORMATICA

INFORMATICA

INFORMATICA

When the Forward Rejected Rows attribute in the Update Strategy


Transformation is deselected, then the Informatica Server
writes the rows rejected by the Update Strategy
Transformation to

INFORMATICA

Which of the following transformations can be used in both


the connected and unconnected form

INFORMATICA

Which of the following transformation provides an option


for rejecting the rows
Which version of Informatica supports data integration for
the cloud?
Can we use the sorted input option
for incremental aggregation?

INFORMATICA

How can you handle multiple matches in lookup


transformation?

INFORMATICA

Which transformation receives input values from the result


of a :LKP expression in another transformation?

INFORMATICA

How to create a custom join in Source Qualifier


transformation?
Explanation: When there is no PK-FK relationship
between two tables, we can specify a custom join by using the
User Defined Join in the properties tab of the Source Qualifier.
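As an illustration, the User Defined Join property typically holds only the join condition itself; the table and column names below are hypothetical:

EMPLOYEES.DEPT_ID = DEPARTMENTS.DEPT_ID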

INFORMATICA
INFORMATICA

INFORMATICA

INFORMATICA

IIF( Avg(salary) 20000,1,0)


Can I use this statement in an Aggregator transformation?
Can we use the mapping parameter or variable created in
one mapping in another reusable transformation?

INFORMATICA

What is nested aggregate function?


Which transformations are both connected and
unconnected?
How can you connect the client to your Informatica server if
the server is located in a different place, i.e., not local to the client?
Choose the correct answers.

INFORMATICA

What is single level aggregate function?

INFORMATICA

Which of these are true?

INFORMATICA

When a developer creates an XML target definition, how is the Code Page determined?

INFORMATICA
INFORMATICA

INFORMATICA

When the Informatica server reads an XML data source, how is the Code Page determined?


What is the use of PowerPlug?


Option1

Option2

Active/Connected Transformation

Passive/Connected
Transformation

Source Tables
Used in the mapping

Target Tables
Available in the Source
Qualifier

Source Analyzer

Transformation Developer

Mapping Designer
Transformation Developer
Queries the Database with the modified SQL Query and applies the condition entered in the Source Filter on the result set
Queries the Database with the modified SQL Query

Left Outer, Normal, Right Outer

Normal, Left Outer, Right


Outer
Output

to a variable port in the next Transformation

Passive/Connected
Transformation
to a variable port but of the
same datatype in the next
Transformation

Input Port

Input/Output Port

Loads the target TableA first and then TableB

Loads the target TableB first


and then TableA

Active/Connected Transformation

Tgt1, Tgt2, Tgt3, Tgt4

Tgt1, Tgt3, Tgt4, Tgt2

Input

Input/Output

Equivalent Joins

Non-Equivalent Joins

In joining a Relational and an XML sources

Both input pipelines originate


from the same original data
source

detail source and builds index and data caches


based on the detail rows

master source and builds


index and data caches based
on the master rows

Passive/Connected

Active/Connected

Must use only standard SQL that returns either


TRUE/FALSE

Can be any condition that


returns either TRUE/FALSE

Input Port

Output Port

Input

Input/Output

Source Database

Target Database

Input Ports

Output Ports

100

NEXTVAL Port to the next Transformation

CURRVAL Port to the next


Transformation

Only once to the first of the 3 output groups whose condition it meets
Only once to the last of the 3 output groups whose condition it meets
Input

Output

Cannot be connected to the next transformation or Target definition
Can only be connected to the Target definition

Get a related value

Doesn't delete but invalidates


the default group
Update Slowly changing
dimension

Connected/Passive

Unconnected/Passive

Static

Dynamic

Skips the rows entirely

Skips all the rows but forwards


only the first

1,024 bytes

1,024 Kbytes

AND

OR

Source Pre-Load

Source Post-Load

The Stored Procedure Transformation type is


Normal

The Stored Procedure


Transformation type is not
Normal

Connected

Unconnected

Fails the session

Completes the session

Input, Output, Variable

Input, Output, Return

Deletes the default group

Reject File

Session Log File

Stored Procedure & Update Strategy

Stored Procedure & Lookup

Lookup Transformation

Stored Procedure
Transformation

8.1

8.6

YES

NO

By Using "LookupMatch" option

By Using "Lookup Policy "


option

Custom

Connected SP

By using system-defined join

By using custom-defined join

YES

NO

YES
Takes only one output row (wrong)

NO
Returns only multiple output rows

StoredProcedure

SQL

through pmcmd task

through ftp

Returns more than one row

Returns only one row

Drill-down allows the users view higher level

ADVANCE DESGINER
QUESTIONS
The Code Page (also called encoding) is defined in the XML
The Code Page (also called encoding

The Code Page (also called


encoding) can be defined in
the XML source file or the DTD
(Data Type Definition) file.
The Code Page (also called encoding) is defined in the XML
For 3rd party connectors to
sap, mainframe, Peoplesoft

Option3

Option4

Active/Unconnected Transformation

Passive/Unconne
cted
Transformation
1

Source Qualifier Transformations

Repository

Available in the Source Table

None

Depends on the Source Database


Mapping Designer
Mapplet Designer
Queries the Database with the Default SQL Query
and then applies the Filter Condition entered in the
Source Filter

Answer

Depends on the
settings
configured in the
Informatica
Client
Components
2
Warehouse
Designer
3
None

None of the
above

Right Outer, Normal, Left Outer

Left Outer, Right


Outer, Normal
2

Input/Output

Variable

Active/Unconnected Transformation

Passive/Unconne
cted
Transformation
2

to any Input/Output port but of the same datatype in


the next Transformation
none

Depends on the datatype

None of the
above

Output Port

Variable Port

3,4

10

Any Number

reverts to Normal Loading

invalidates the
session

Tgt1, Tgt4, Tgt3, Tgt2

Reverts to
Normal Load

Input for Master Source and Output for Detail Source
Input for Detail Source and Output for Master Source
2

Both

Depends on the
number of Joiner
Transformations
used in the
mappings.
1

In joining a csv text file and an XML Sources

Both input
pipelines
originate from
the same Source
Qualifier
Transformation
2,4

can be configured in the session properties to either use the detail or master source
depends on which source the Informatica Server first reads
2
Active/Unconnect
Passive/Unconnected
ed
2
Depends on the datatype of the value returned by
the condition
NULL
2

Can concatenate columns from the Source to


evaluate the expression.
Input/Output Port

Evaluates the
condition if the
Source is
relational source
only.
2

Output

Variable Port
2
Input/Output/Vari
able
3

Repository

To a Flat File in
the Informatica
Server $RootDir

Input/Output Ports

Variable Ports

1,000

10,000

Both NEXTVAL and CURRVAL Ports to 2 different


Ports in the next Transformation

None

To all the 3 groups whose condition it meets

Directs it to the
Default Group

Input/Output
Can be connected to next Transformation or Target
definition

GroupBy
Are rejected by
the Informatica
Server

Connected/Active

Deletes both the


default group as
well as the
Router
Transformation
1
None of the
above
3
Unconnected/Acti
ve
1

Persistent

None

Skips all the rows but forwards only the last

Fails the session 4

1,00,000 bytes

1,000,000 bytes 1

LIKE

None

Target Pre-Load

Normal

The Stored Procedure Transformation type is


Unconnected

The Stored
Procedure
Transformation
type is
Connected

Doesn't delete but assigns NULL to the default


group condition
Both of the above

Connected/Unconnected

Source Pre-Load 3
Can be configured in the session properties sheet to either stop or continue the session
Marks the mapping/session as invalid
3
Input, Output, GroupBy

Input, Output

Bad File

None

Lookup and Update Strategy

Lookup and
Update Strategy 2

Update Strategy Transformation

Expression
Transformation

7.3

3
1

By Using " Multiple Match" option

By Using "Lookup
Policy on Multiple
Match" option
4

Unconnected Look-up

non of these

By using user-defined join

non of these

Returns only one output row


Lookup

ip address
Take input more than one row

1
Takes multiple
output row
3
External Procedure
2

vpn service
Take input one
row

2
1

The Code page for the XML target definition can be set to m
The Code Page for th1

The Code page for the XML data source can be set to match either the Code Page (also called encoding) in the XML source file or the DTD (Data Type Definition) file or the Code Page for the repository.
The Code Page for the XML data source will always be the same as the Code Page for the repository.
4

What are the methods for creating reusable transformations in Informatica PowerCenter?

Which transformation do you need when using COBOL sources as source definitions?
What are the unsupported repository objects for a mapplet?

What are the mapping parameters and mapping variables?

In Informatica, can you use the mapping parameters or variables created in one mapping in another?
What are the difference between joiner transformation and source qualifier transformation?

In Informatica, in which conditions can we not use the Joiner transformation (limitations of the Joiner transformation)?

Why use the lookup transformation?

Differences between connected and unconnected lookup? Connected lookup:

What are the types of lookup caches ?

Difference between static cache and dynamic cache

What is the Router transformation?

What is the status code?


What is the difference between STOP and ABORT options in Workflow Monitor?

What is the target load order?


Describe two levels in which update strategy transformation sets?

What is Data driven?


What are the options in the target session of update strategy transformation?

What are the types of mapping wizards that are provided in Informatica?

What are the types of mapping in the Getting Started Wizard?

What are the different types of Type2 dimension mapping?

What are the two types of processes with which Informatica runs the session?


Can you generate reports in Informatica?
What is metadata reporter?
Define mapping and sessions?
How does the Informatica server increase session performance through partitioning the source?

What are the tasks that the Load Manager process does in Informatica PowerCenter?

What is DTM process?


What are the different threads in the DTM process?

What are the data movement modes in Informatica?

What are the output files that the Informatica server creates during the session run?

How can you recover the session in sequential batches?

How to recover sessions in concurrent batches?

How can you complete unrecoverable sessions?

What are the circumstances in which the Informatica server results in an unrecoverable session?

Explain about perform recovery?

Explain about Recovering sessions?

What is tracing level and what are the types of tracing level?

What are the scheduling options to run a session?


What is incremental aggregation in Informatica?

What is the difference between Power Center & Power Mart?

What are the various tools? - Name a few

What are snapshots? What are materialized views?

What is partitioning? What are the types of partitioning?

What is a staging area? Do we need it? What is the purpose of a staging area?

How to determine what records to extract?

What are the various methods of getting incremental records or delta records from the source systems?

Can we use procedural logic inside Informatica? If yes, how; if not, how can we use external procedural logic?

How do we call shell scripts from informatica?

How do we extract SAP data Using Informatica? What is ABAP? What are IDOCS?

Where do we use semi and non-additive facts?

What is Full load & Incremental or Refresh load?

In Informatica there are two methods:

I. Design it in the Transformation Developer.
II. Promote a standard transformation from the Mapping Designer. After you add a transformation to the mapping, you can promote it to a reusable
transformation at any time. If you change the properties of a reusable transformation in a mapping, we can revert it to the original reusable transformation.

Normalizer transformation is used to normalize the data. Since COBOL sources often consist of denormalized data, the Normalizer transformation is used to normalize the
COBOL source definition.
Joiner transformations
Normalizer transformations
Non reusable sequence generator transformations.
Pre or post session stored procedures
Target definitions
Power mart 3.5 style Look Up functions
XML source definitions
IBM MQ source definitions

In Informatica, a mapping parameter represents a constant value that you can define before running a session. A mapping parameter is declared
in a mapping or mapplet; you then define the value of the parameter in a parameter file for the session. Unlike a mapping parameter, a mapping variable represents a value that can change through the session. The Informatica server saves the value of a mapping
variable to the repository at the end of the session run and uses that value the next time you run the session.

NO. In Informatica, we can use mapping parameters or variables in any transformation of the same mapping or mapplet only.
In Informatica you can join heterogeneous data sources in the Joiner transformation, which we cannot achieve in the Source Qualifier transformation.
You need matching keys to join two relational sources in the Source Qualifier transformation, whereas you do not need matching keys in the Joiner transformation.
Two relational sources should come from the same data source in the Source Qualifier; you can join relational sources from different data sources in the Joiner transformation.
In Informatica, the Joiner transformation cannot be used when any of the following is true:

Both pipelines begin with the same original data source.


Both input pipelines originate from the same Source Qualifier transformation.
Both input pipelines originate from the same Normalizer transformation.
Both input pipelines originate from the same Joiner transformation.
Either input pipelines contains an Update Strategy transformation.
Either input pipelines contains a connected or unconnected Sequence Generator transformation.

In Informatica, the Lookup transformation is used to perform the following tasks:


Get a related value. For example, if your source table includes employee ID, but you want to include the employee name in your target table to make summary data easier to read.
Perform a calculation. Many normalized tables include values used in a calculation, such as gross sales per invoice or sales tax, but not the calculated value (such as net sales).
Update slowly changing dimension tables. You can use a Lookup transformation to determine whether rows already exist in the target.
Receives input values directly from the pipe line.
You can use a dynamic or static cache
Cache includes all lookup columns used in the mapping
Support user defined default values
Unconnected lookup
In Informatica
Receives input values from the result of a Lookup expression in another transformation.
You can use a dynamic or static cache
Cache includes all lookup out put ports in the lookup condition and the lookup/return port.
Does not support user defined default values

Persistent cache: You can save the lookup cache files and reuse them the next time the informatica se
Recache from database: If the persistent cache is not synchronized with he lookup table, You can confi
Static cache: You can configure a static or read-only cache for only lookup table. By default informatic
transformation. When the lookup condition is true, the informatica server does not update the cache w
Dynamic cache: If You want to cache the target table and insert new rows into cache and the target, Y
Shared cache: You can share the lookup cache between multiple transactions. You can share unnamed

Static cache
In Informatica
You cannot insert or update the cache.
The informatica server returns a value from the lookup table or cache when the condition is true. When the condition is not true, the
informatica server returns the default value for connected transformations and NULL for unconnected transformations.

Dynamic cache In Informatica


You can insert rows into the cache as you pass them to the target. The informatica server inserts rows into the cache
when the condition is false. This indicates that the row is not in the cache or target table.
You can pass these rows to the target table.

A Router transformation is similar to a Filter transformation because both transformations allow you t
meet the condition. A Router transformation tests data for one or more conditions and gives you the o
If you need to test the same input data based on multiple conditions, use a Router Transformation in a

In Informatica Status code provides error handling for the informatica server during the session. The s
When we issue the STOP command on the executing session task, the Integration Service stops readin
processing and committing data, we can issue the abort command.

In contrast ABORT command has a timeout period of 60 seconds. If the Integration Service cannot fini

You specify the target load order based on source qualifiers in a mapping. If u have the multiple sourc
Within a session. When you configure a session, you can instruct the Informatica Server to either treat
records for different database operations.
Within a mapping. Within a mapping, you use the Update Strategy transformation to flag records for in

The informatica server follows instructions coded into update strategy transformations within the sess
Insert
Delete
Update
Update as update
Update as insert
Update else insert
Truncate table

The Designer provides two mapping wizards to help you create mappings quickly and easily. Both wiza
Getting Started Wizard. Creates mappings to load static fact and dimension tables, as well as slowly g
the amount of historical dimension data you want to keep and the method you choose to handle histo

Simple Pass-Through mapping:


Loads a static fact or dimension table by inserting all rows. Use this mapping when you want to drop a
Slowly Growing target :
Loads a slowly growing fact or dimension table by inserting new rows. Use this
mapping to load new
Type2 Dimension/Version Data Mapping: In this mapping the updated dimension in the
primary key.

sourc

Type2 Dimension/Flag current Mapping: This mapping is also used for slowly changing dimensions. In
Flag indicates the dimension is new or newly updated. Recent dimensions will gets saved with current

Type2 Dimension/Effective Date Range Mapping: This is also one flavor of Type2 mapping used for slow
effective date range for each version of each dimension.

Load manager Process: Starts the session, creates the DTM process, and sends post-session email wh
The DTM process. Creates threads to initialize the session, read, write, and transform data, and handle

Yes. By using Metadata reporter we can generate reports in informatica.


It is a web based application that enables you to run reports against repository metadata. With a meta
Mapping: It is a set of source and target definitions linked by transformation objects that define the ru
Session: It is a set of instructions that describe how and when to move data from source to targets.

For relational sources informatica server creates multiple connections for each partition of a single sou
concurrently. Similarly for loading also informatica server creates multiple connections to the target an
For XML and file sources, informatica server reads multiple files concurrently. For loading the data info

Manages the session and batch scheduling: When you start the informatica server, the load manager launches. The
load manager maintains a list of sessions and session start times; when you start a session, the load manager fetches the session information from the repository.
Locking and reading the session: When the informatica server starts a session, the load manager locks and reads the session.
Reading the parameter file: If the session uses a parameter file, the load manager reads the parameter file.
The load manager checks whether or not the user has privileges to run the session.
Creating log files: The load manager creates a log file that contains the status of the session.

After the load manger performs validations for session, it creates the DTM process. DTM is to create a
Master thread: Creates and manages all other threads
Mapping thread: One mapping thread will be creates for each session. Fetches session and mapping in
Pre and post session threads: This will be created to perform pre and post session operations.
Reader thread: One thread will be created for each partition of a source.It reads data from source.
Writer thread: It will be created to load data to the target.
Transformation thread: It will be created to transform data.

Data movement modes determine how the informatica server handles character data. You choose the data movement mode in the server configuration.
Two types of data movement modes available in informatica.
I. ASCII mode
II. Unicode mode.

Informatica server log: Informatica server(on unix) creates a log for all status and error messages(def
Session log file: Informatica server creates session log file for each session. It writes information abou
load summary. The amount of detail in session log file depends on the tracing level that u set.
Session detail file: This file contains load statistics for each target in mapping. Session detail includes
window
Performance detail file: This file contains information known as session performance details which help
Reject file: This file contains the rows of data that the writer does not write to targets.
Control file: Informatica server creates control file and a target file when U run a session that uses the
external loader.
Post session email: Post session email allows U to automatically communicate information about a ses
session fails.
Indicator file: If u use the flat file as a target, U can configure the informatica server to create indicato
reject.
Output file: If session writes to a target file, the informatica server creates the target file based on file
Cache files: When the informatica server creates memory cache it also creates cache files. For the foll
Aggregator transformation
Joiner transformation
Rank transformation
Lookup transformation

If you configure a session in a sequential batch to stop on failure, you can run recovery starting with t
property
To recover sessions in sequential batches configured to stop on failure:
1. In the Server Manager, open the session property sheet.
2. On the Log Files tab, select Perform Recovery, and click OK.
3. Run the session.
4. After the batch completes, open the session property sheet.
5. Clear Perform Recovery, and click OK.
If you do not clear Perform Recovery, the next time you run the session, the Informatica Server attemp
If you do not configure a session in a sequential batch to stop on failure, and the remaining sessions in

If multiple sessions in a concurrent batch fail, you might want to truncate all targets and run the batch
session as a standalone session.
To recover a session in a concurrent batch:
1. Copy the failed session using Operations-Copy Session.
2. Drag the copied session outside the batch to be a standalone session.
3. Follow the steps to recover a standalone session.
4. Delete the standalone copy.

Under certain circumstances, when a session does not complete, you need to truncate the target table
The source qualifier transformation does not use sorted ports.
If you change the partition information after the initial session fails.
Perform recovery is disabled in the informatica server configuration.
If the sources or targets changes after initial session fails.
If the mapping consists of sequence generator or normalizer transformation.
If a concurrent batch contains multiple failed sessions.

When the Informatica Server starts a recovery session, it reads the OPB_SRVR_RECOVERY table and no
processing from the next row ID. For example, if the Informatica Server commits 10,000 rows before th
By default, Perform Recovery is disabled in the Informatica Server setup. You must enable Recovery in
OPB_SRVR_RECOVERY table.

If you stop a session or if an error causes a session to stop, refer to the session and error logs to deter
the properties of the mapping, session, and Informatica Server configuration.
Use one of the following methods to complete the session:
Run the session again if the Informatica Server has not issued a commit.
Truncate the target tables and run the session again if the session is not recoverable.
Consider performing recovery if the Informatica Server has issued at least one commit.

Tracing level represents the amount of information that the informatica server writes in a log file. Types of tracing levels:
Normal
Verbose
Verbose init
Verbose data

You can schedule a session to run at a given time or interval, or you can manually run the session.
Different options of scheduling:
Run only on demand: the Informatica server runs the session only when the user starts the session explicitly.
Run once: Informatica server runs the session only once at a specified date and time.
Run every: Informatica server runs the session at regular intervals as configured.
Customized repeat: Informatica server runs the session at the dates and times specified in the repeat

When using incremental aggregation, you apply captured changes in the source to aggregate calculat
Power Mart is designed for:
Low range of warehouses, only local repositories, mainly a desktop environment.
Power Mart: we can connect to only a single Repository.
Power Center is designed for:
High-end warehouses, global as well as local repositories, ERP support.
Power Center: we can connect to single and multiple Repositories; generally used in big Enterprises.

The various ETL tools are as follows.


Informatica
Data stage
Business Objects Data Integrator
OLAP tools are as follows.
Cognos
Business Objects

Materialized view:

Answer 1. A materialized view is a view in which data is also stored in some temp table, i.e., if we go with the view, the data is also
stored in some temp tables.
Answer 2. Materialized view means it stores pre calculated data, it is a physical representation and it'
Snapshot:

Answer 1. A snapshot is a table that contains the results of a query of one or more tables or views, oft
Answer 2.Snapshot is a specific interval of data,
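A minimal Oracle sketch of a materialized view that stores precalculated data (the names are hypothetical and the refresh options shown are just one possible configuration):

CREATE MATERIALIZED VIEW sales_mv
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT product_id, SUM(amount) AS total_amount
  FROM sales
 GROUP BY product_id;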

Partitioning is a part of physical data warehouse design that is carried out to improve performance an
components because it:
1. Reduces work involved with addition of new data.
2. Reduces work involved with purging of old data.
Two types of partitioning are:
1. Horizontal partitioning.
2. Vertical partitioning (reduces efficiency in the context of a data warehouse).
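A hedged example of horizontal (range) partitioning in Oracle, using a hypothetical SALES table partitioned by year:

CREATE TABLE sales (
  sale_id   NUMBER,
  sale_date DATE,
  amount    NUMBER)
PARTITION BY RANGE (sale_date) (
  PARTITION p2010 VALUES LESS THAN (TO_DATE('01-01-2011', 'DD-MM-YYYY')),
  PARTITION p2011 VALUES LESS THAN (TO_DATE('01-01-2012', 'DD-MM-YYYY')),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE));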

Staging area is place where you hold temporary tables on data warehouse server. Staging tables are c
before loading the data into warehouse.

In the absence of a staging area, the data load will have to go from the OLTP system to the OLAP syste
staging area. In addition, it also offers a platform for carrying out data cleansing.
According to the complexity of the business rule, we may require staging area, the basic need of stagi

Data modeler will provide the ETL developer, the tables that are to be extracted from various sources.
When addressing a table some dimension key must reflect the need for a record to get extracted. Mos
be adding an archive flag to record, which gets reset when record changes.
Draw the inference if it is a slowly changing dimension, based on the Type 1/2/3 tables defined.

Getting incremental records from source systems to target can be done


by using incremental aggregation transformation
One foolproof method is to maintain a field called 'Last Extraction Date' and then impose a condition i
Using mapping parameters and variable or type1 we can easily define from where parameter will star
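A sketch of the 'Last Extraction Date' approach described above, assuming a hypothetical LAST_UPDATED_DATE audit column and a mapping parameter holding the previous run's extraction date:

SELECT *
  FROM source_table
 WHERE last_updated_date > TO_DATE('$$LAST_EXTRACTION_DATE', 'DD-MM-YYYY HH24:MI:SS');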
We can use External Procedure Transformation to use external procedures. Both COM and Informatica
Can we override a native sql query within Informatica? Where do we do it? How do we do it?
We can override a SQL query in the SQL override property of a Source Qualifier.
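For illustration, a hedged example of the kind of query one might place in the Source Qualifier's SQL Query (SQL override) property; the table and column names are hypothetical:

SELECT e.emp_id, e.emp_name, d.dept_name
  FROM employees e, departments d
 WHERE e.dept_id = d.dept_id
   AND e.status  = 'ACTIVE';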

You can use a Command task to call the shell scripts, in the following ways:
1. Standalone Command task. You can use a Command task anywhere in the workflow or worklet to ru
2. Pre- and post-session shell command. You can call a Command task as the pre- or post-session shel
There is a task named command task, using that you can write or call Shell script, DOS commands or

To extract SAP DATA.


Go to the Source Analyzer, click on Sources; now you will get the option 'Import from SAP'.
Click on this now give your SAP access user, client, password and filter criteria as table name (so it wi
Now one important thing after finishing the map save it and generate ABAP Code for the map. Then o
Additive: a measure that can participate in arithmetic calculations using all or any dimensions.
Ex: Sales profit
Semi-additive: a measure that can participate in arithmetic calculations using only some dimensions.
Ex: Sales amount
Non-additive: a measure that can't participate in arithmetic calculations using dimensions.
Ex: temperature

By Full Load or One-time load we mean that all the data in the Source table(s) should be processed. Th
came after one-time load.
Full Load is the entire data dump load taking place the very first time.
Gradually, to synchronize the target data with source data, there are a further 2 techniques: Refresh load - where the existing data is truncated and reloaded completely.
Incremental - Where delta or difference between target and source data is dumped at regular interval
Full Load: completely erasing the contents of one or more tables and reloading with fresh data.
Incremental Load: applying ongoing changes to one or more tables based on a predefined schedule.
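A rough SQL sketch of the two loading styles described above (hypothetical staging and target tables; in practice this logic usually sits inside the ETL tool rather than hand-written SQL):

-- Full / refresh load: wipe and reload everything
TRUNCATE TABLE target_table;
INSERT INTO target_table SELECT * FROM staging_table;

-- Incremental load: apply only the delta since the last run
INSERT INTO target_table
SELECT *
  FROM staging_table s
 WHERE s.load_date > (SELECT MAX(load_date) FROM target_table);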

Continuation fragments from the answers above:
... seen by the user; it is only used by the Informatica server to determine whether to continue running the session or stop.
... the Informatica server ignores all Update Strategy transformations in the mapping.
... underlying tables in the repository.
... the other threads.
... recovery, or when running recovery might result in inconsistent data.
... process only those changes. This allows the Informatica Server to update your target incrementally, rather than forcing it to process the entire source and recalculate the same calculations each time you run the session.
