Sunday, January 15, 2017

Usage of $FLEX$

$FLEX$ is a special bind variable that can be used to base a parameter's value on the value of another parameter (dependent parameters).

Syntax –     :$FLEX$.Value_Set_Name
Value_Set_Name is the name of the value set of a prior parameter in the same parameter window that you want your parameter to depend on.
Some scenarios where $FLEX$ can be used:

Example1:
Say you have a concurrent program with the below 2 parameters, both driven by value sets:
Parameter1 is Department
Parameter2 is Employee Name
Let's say there are 100 departments and each department has 200 employees. Therefore we have 20,000 employees altogether.
If we display all department names in the value set of Parameter1 and all employee names in the value set of Parameter2, performance suffers badly and it is hard for a user to select an employee from a list of 20,000 entries.
The better solution is to let the user select the department from the Department value set first. Based on the department selected in Parameter1, you can then display in Parameter2 only the employees that belong to that department.
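In practice this dependence is expressed in the Where Clause window of the table-type value set behind Parameter2. A minimal sketch, where the value set name XX_DEPT_VS and the table XX_EMPLOYEES are assumed names for illustration only:

```sql
-- Where Clause of the Employee Name table value set.
-- XX_DEPT_VS is the (assumed) value set name of the Department
-- parameter; XX_EMPLOYEES is a hypothetical employee table.
WHERE department_id = :$FLEX$.XX_DEPT_VS
ORDER BY employee_name
```

With this clause in place, the employee list is restricted to the department chosen in Parameter1 each time the list of values is opened.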

Example2:
Say you have a concurrent program with the below 2 parameters:
parameter1: directory path
parameter2: filename
Parameter1 and parameter2 are dependent on each other. If the user doesn't enter a directory path, there is no point in enabling parameter2, i.e. the filename; in such a case the parameter should be disabled. This can be achieved using $FLEX$.

Working Example of how to use $FLEX$:
Let's take the standard concurrent program "AP Withholding Tax Extract" to explain how to use $FLEX$.
This program has 7 parameters such as "Date From", "Date To", "Supplier From", "Supplier To" etc.
The requirement is to add an additional parameter called "File Name" in which the user names the flat file that the tax extract will be written to. Instead of typing in the name of the file every time the program is run, the file name should default to the value that the user provides for the parameter "Date From" plus ".csv", the file extension. Let us now see how this can be achieved using $FLEX$.

Navigation:
Application Developer responsibility > Concurrent > Program
Query up the concurrent program "AP Withholding Tax Extract".

Click “Parameters” Button

Add the parameter "File Name":
  • Seq: 80 (something not already assigned to another parameter; it is always better to enter sequences in multiples of 5 or 10, so that you can insert additional parameters in between later)
  • Parameter: File Name
  • Description: File Name
  • Value set: 240 Characters
  • Prompt: File Name
  • Default Type: SQL Statement
  • Default Value: SELECT :$FLEX$.FND_STANDARD_DATE || '.csv' FROM dual

Here FND_STANDARD_DATE is the value set name of the parameter "Date From" as seen in the above screenshot.
:$FLEX$.FND_STANDARD_DATE gets the value that the user enters for the parameter "Date From".
SELECT :$FLEX$.FND_STANDARD_DATE || '.csv' FROM dual therefore returns the "Date From" parameter value with '.csv' appended.

Save your work.
Now go to the respective responsibility and run the concurrent program.

When you enter the value of “Date From” and hit tab, File Name parameter will automatically be populated as shown in the below screenshot.

$FLEX$ and $PROFILES$ Details

What are $FLEX$ and $PROFILES$?
$FLEX$ and $PROFILES$ are special variables in the Oracle Apps environment and are used to hold values at runtime.
Every Oracle Apps technical consultant will be familiar with the terms $FLEX$ and $PROFILES$. Whenever we work with value sets, we use them to modify the data fetched, to base the value of one parameter on another, and to fetch profile option values.
Segregated by functionality:

$FLEX$: used for basing the value of a parameter on another parameter.
$PROFILES$: used for fetching the value of the specified profile option that is currently active.

Where is it used?
Both these variables are used in the parameter form of a concurrent program, typically in the Where Clause window of a table-type value set.

Syntax:
:$FLEX$.previous_value_set_name
Important:
  • $FLEX$ must always be in capitals.
  • A ‘:’ must precede the declaration of $FLEX$.
  • The previous value set name must have already been assigned and saved on a different parameter.

:$PROFILES$.Profile_option_name
Important:
  • $PROFILES$ must be always in capitals.
  • ‘:’ must always precede the declaration.
  • Profile_option_name is the profile option's internal name, not to be confused with its user profile name.
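As an illustration of $PROFILES$, a table value set's Where Clause can restrict rows by the current value of a profile option. A hedged sketch (the table and column are hypothetical; ORG_ID is a commonly referenced profile option name):

```sql
-- Where Clause window of a hypothetical table value set:
-- show only rows belonging to the operating unit held in the
-- ORG_ID profile option at runtime.
WHERE org_id = :$PROFILES$.ORG_ID
```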

Some uses of these special variables are shown below:

Pre-Requisites:

Create an executable and a concurrent program registered in the Application Object Library. The query for the report is as below:

SELECT e.ename, e.empno, e.sal, e.hiredate, e.deptno, d.dname, d.loc
FROM emp e, dept d
WHERE e.deptno = d.deptno
AND d.deptno = :x_deptno;
The name of the concurrent program is taken as "XX_Checking_flex_and_profile_use".
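To see $FLEX$ at work with this report, the department parameter itself could be made dependent on a prior parameter. A hedged sketch, assuming a prior Location parameter whose value set is named XX_LOC_VS (an assumed name, for illustration only):

```sql
-- Where Clause of a table value set over the DEPT table:
-- list only the departments at the location chosen in the
-- prior parameter (value set name XX_LOC_VS is assumed).
WHERE loc = :$FLEX$.XX_LOC_VS
```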

Oracle Conversion and Interface Strategy

  1. Data Mapping
During the data mapping process, all the data sets and data elements that need to be moved into the Oracle tables as part of conversion are identified. Data mapping tables are prepared as part of this activity; they show which data elements the target system needs to meet the business requirements, and where in the old system they will be extracted from.

  2. Download Programs
After the conversion data mapping is complete, download programs are developed to extract the identified conversion data elements from the current systems in the form of ASCII flat files. The structure of each flat file must match the structure of the Oracle standard interface tables. The generated flat files may be plain text, comma or space delimited, in variable or fixed format.

  3. Upload Program
Once the data has been extracted to a flat file, it is moved to the target file system, and the data from the file is loaded into user-defined staging tables in the target database using SQL*Loader or the UTL_FILE utility. Programs are then written and run to validate the data in the staging tables and insert it into the Oracle-provided standard interface tables.
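A hedged sketch of this validate-and-insert step; every table and column name below is hypothetical, standing in for the custom staging table and the interface table:

```sql
-- Flag staging rows that fail a validation (unknown supplier here),
-- then move the clean rows onward. All object names are illustrative,
-- not Oracle-seeded tables.
UPDATE xx_invoice_stg s
SET    s.process_flag  = 'E',
       s.error_message = 'Unknown supplier'
WHERE  NOT EXISTS (SELECT 1
                   FROM   xx_suppliers v
                   WHERE  v.supplier_num = s.supplier_num);

INSERT INTO xx_invoice_interface (supplier_num, invoice_num, invoice_amount)
SELECT s.supplier_num, s.invoice_num, s.invoice_amount
FROM   xx_invoice_stg s
WHERE  NVL(s.process_flag, 'N') <> 'E';
```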

  4. Interface Program

Once the interface tables are populated, the respective interface program (each data element interface has a specific interface program to run) is submitted. The interface programs validate the data, derive and assign the default values and ultimately populate the production base tables.

Oracle SQL Loader Process Details

Overview:
SQL*Loader is an Oracle utility used to load data into a table, given a data file containing the records to be loaded. SQL*Loader reads the data file, driven by a control file, and inserts the data into the table. When a control file is executed, it can create three files: a log file, a bad (reject) file, and a discard file.
  • The log file tells you the state of the tables and indexes and the number of logical records already read from the input data file. This information can be used to resume a load where it left off.
  • The bad (or reject) file holds the records that were rejected because of formatting errors or because they caused Oracle errors.
  • The discard file holds the records that did not meet any of the loading criteria, i.e. records that failed every WHEN clause specified in the control file. These differ from rejected records.

Structure of the data file:
The data file can be in fixed record format or variable record format.
Fixed record format looks like the below. In this case each data field sits at a specific position where the control file can expect it:
7369      SMITH     CLERK         7902      12/17/1980     800
7499      ALLEN     SALESMAN      7698      2/20/1981      1600
7521      WARD      SALESMAN      7698      2/22/1981      1250
7566      JONES     MANAGER       7839      4/2/1981       2975
7654      MARTIN    SALESMAN      7698      9/28/1981      1250
7698      BLAKE     MANAGER       7839      5/1/1981       2850
7782      CLARK     MANAGER       7839      6/9/1981       2450
7788      SCOTT     ANALYST       7566      12/9/1982      3000
7839      KING      PRESIDENT               11/17/1981     5000
7844      TURNER    SALESMAN      7698      9/8/1981       1500
7876      ADAMS     CLERK         7788      1/12/1983      1100
7900      JAMES     CLERK         7698      12/3/1981      950
7902      FORD      ANALYST       7566      12/3/1981      3000
7934      MILLER    CLERK         7782      1/23/1982      1300

Variable record format looks like the below, where the data fields are separated by a delimiter.
Note: the delimiter can be anything you like. In this case it is "|".
1196700|9|0|692.64
1378901|2|3900|488.62
1418700|2|2320|467.92
1418702|14|8740|4056.36
1499100|1|0|3.68
1632800|3|0|1866.66
1632900|1|70|12.64
1637600|50|0|755.5

Structure of a Control file:
Sample CTL file for loading a Variable record data file:
OPTIONS (SKIP = 1)                    -- the first row in the data file is skipped (not loaded)
LOAD DATA
INFILE '$FILE'                        -- the data file path and name
APPEND                                -- type of loading (INSERT, APPEND, REPLACE, TRUNCATE)
INTO TABLE "APPS"."BUDGET"            -- the table to be loaded into
FIELDS TERMINATED BY '|'              -- the delimiter, for a variable format data file
OPTIONALLY ENCLOSED BY '"'            -- data field values may be enclosed in double quotes
TRAILING NULLCOLS                     -- columns not present in the record are treated as null
(ITEM_NUMBER    "TRIM(:ITEM_NUMBER)", -- SQL functions can be applied to fields
QTY             DECIMAL EXTERNAL,
REVENUE         DECIMAL EXTERNAL,
EXT_COST        DECIMAL EXTERNAL TERMINATED BY WHITESPACE "(TRIM(:EXT_COST))",
MONTH           "to_char(LAST_DAY(ADD_MONTHS(SYSDATE,-1)),'DD-MON-YY')",
DIVISION_CODE   CONSTANT "AUD"        -- a constant value instead of a data file field
)

The OPTIONS clause precedes the LOAD DATA statement and allows you to specify runtime arguments in the control file rather than on the command line. The following arguments can be specified using OPTIONS:
SKIP = n: number of logical records to skip (default 0)
LOAD = n: number of logical records to load (default all)
ERRORS = n: number of errors to allow (default 50)
ROWS = n: number of rows in the conventional path bind array, or between direct path data saves (default: conventional path 64, direct path all)
BINDSIZE = n: size of the conventional path bind array in bytes (system-dependent default)
SILENT = {HEADER | FEEDBACK | ERRORS | DISCARDS | PARTITIONS | ALL}: suppress messages during the run
DIRECT = {TRUE | FALSE}: use the direct path (default FALSE)
PARALLEL = {TRUE | FALSE}: perform a parallel load (default FALSE)
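For instance, several of these arguments can be combined on a single OPTIONS line at the top of the control file (the values here are illustrative only, not recommendations):

```sql
-- Skip one header record, allow up to 100 errors, size the bind
-- array at 1000 rows, and use the direct path.
OPTIONS (SKIP=1, ERRORS=100, ROWS=1000, DIRECT=TRUE)
```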

The LOAD DATA statement is required at the beginning of the control file.

INFILE: the INFILE keyword is used to specify the location of the data file or files.
INFILE * specifies that the data is found in the control file and not in an external file. INFILE '$FILE' can be used to pass the file path and file name as a parameter when the loader is registered as a concurrent program.
INFILE '/home/vision/kap/import2.csv' specifies the file path and file name.
Example where datafile is an external file:
LOAD DATA
INFILE '/home/vision/kap/import2.csv'
INTO TABLE kap_emp
FIELDS TERMINATED BY ","
( emp_num, emp_name, department_num, department_name )

Example where datafile is in the Control file:
LOAD DATA
INFILE *
INTO TABLE kap_emp
FIELDS TERMINATED BY ","
( emp_num, emp_name, department_num, department_name )
BEGINDATA
7369,SMITH,7902,Accounting
7499,ALLEN,7698,Sales
7521,WARD,7698,Accounting
7566,JONES,7839,Sales
7654,MARTIN,7698,Accounting

Example where file name and path is sent as a parameter when registered as a concurrent program
LOAD DATA
INFILE '$FILE'
INTO TABLE kap_emp
FIELDS TERMINATED BY ","
( emp_num, emp_name, department_num, department_name )
TYPE OF LOADING:
INSERT: can be used if the table you are loading is empty.
APPEND: if data already exists in the table, SQL*Loader appends the new rows to it; if the table is empty, the new rows are simply loaded.
REPLACE: all rows in the table are deleted and the new data is loaded.
TRUNCATE: SQL*Loader uses the SQL TRUNCATE command before loading.
INTO TABLE is required to identify the table to be loaded into. In the above example, INTO TABLE "APPS"."BUDGET", APPS refers to the schema and BUDGET is the table name.
FIELDS TERMINATED BY specifies how the data fields are terminated in the data file (comma delimited, pipe delimited, etc).
OPTIONALLY ENCLOSED BY '"' specifies that data fields may also be enclosed by quotation marks.
TRAILING NULLCOLS tells SQL*Loader to treat any relatively positioned columns that are not present in the record as null columns.

Loading a fixed format data file:
LOAD DATA
INFILE 'sample.dat'
INTO TABLE emp
(empno         POSITION(01:04)   INTEGER EXTERNAL,
ename          POSITION(06:15)   CHAR,
job            POSITION(17:25)   CHAR,
mgr            POSITION(27:30)   INTEGER EXTERNAL,
sal            POSITION(32:39)   DECIMAL EXTERNAL,
comm           POSITION(41:48)   DECIMAL EXTERNAL,
deptno         POSITION(50:51)   INTEGER EXTERNAL)

Steps to Run the SQL* LOADER from UNIX:
At the prompt, invoke SQL*Loader as follows:

sqlldr USERID=scott/tiger CONTROL=<control filename> LOG=<Log file name>
SQL*Loader loads the table, creates the log file, and returns you to the system prompt. You can check the log file to see the results of the load.

Register as concurrent Program:
Place the Control file in $CUSTOM_TOP/bin.
Define the Executable. Give the Execution Method as SQL*LOADER.
Define the Program. Add the Parameter for FILENAME.

Skip columns:
You can skip columns using the 'FILLER' keyword.
LOAD DATA
INFILE 'eg.dat'            -- data file name assumed, for illustration
INTO TABLE emp             -- target table assumed, for illustration
FIELDS TERMINATED BY ","
TRAILING NULLCOLS
(
name FILLER,
empno,
sal
)
Here the field name is read but skipped; it is not loaded into any column.

SQL*Loader is a very powerful tool that lets you load data from a delimited or position-based data file into Oracle tables. We have received many questions about SQL*Loader features from users, so here is a brief explanation of them.
Please note that basic knowledge of SQL*Loader is required to understand this article.
This article covers the below topics:
1. Load multiple data files into a single table
2. Load a single data file into multiple tables
3. Skip a column while loading using “FILLER” and Load field in the delimited data file into two different columns in a table using “POSITION”
4. Usage of BOUNDFILLER
5. Load the same record twice into a single table
6. Using WHEN to selectively load the records into the table
7. Run SQLLDR from SQL PLUS
8. Default path for Discard, bad and log files

1) Load multiple files into a single table:
SQL*Loader lets you load multiple data files at once into a single table, but all the data files should be of the same format.

Here is a working example:
Say you have a table named EMP which has the below structure:
Column            Data Type
emp_num           Number
emp_name          Varchar2(25)
department_num    Number
department_name   Varchar2(25)

You are trying to load the below comma delimited data files named eg.dat and eg1.dat:
eg.dat:
7369,SMITH,7902,Accounting
7499,ALLEN,7698,Sales
7521,WARD,7698,Accounting
7566,JONES,7839,Sales
7654,MARTIN,7698,Accounting

eg1.dat:
1234,Tom,2345,Accounting
3456,Berry,8976,Accounting

The Control file should be built as below:
LOAD DATA
INFILE 'eg.dat'   -- file 1
INFILE 'eg1.dat'  -- file 2
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ","
( emp_num, emp_name, department_num, department_name )

2) Load a single file into multiple tables:
SQL*Loader lets you load a single data file into multiple tables using multiple "INTO TABLE" clauses.

Here is a working example:
Say you have two tables named EMP and DEPT which have the below structure:
Table   Column            Data Type
EMP     emp_num           Number
EMP     emp_name          Varchar2(25)
DEPT    department_num    Number
DEPT    department_name   Varchar2(25)

You are trying to load the below comma delimited data file named eg.dat. Its columns emp_num and emp_name need to be loaded into table EMP, and its columns department_num and department_name into table DEPT, using a single CTL file.
eg.dat:
7369,SMITH,7902,Accounting
7499,ALLEN,7698,Sales
7521,WARD,7698,Accounting
7566,JONES,7839,Sales
7654,MARTIN,7698,Accounting

The Control file should be built as below:
LOAD DATA
INFILE 'eg.dat'
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ","
( emp_num, emp_name )
INTO TABLE dept
FIELDS TERMINATED BY ","
(department_num, department_name)
You can further use WHEN clause to selectively load the records into the tables which will be explained later in this article.

3) Skip a column while loading using "FILLER" and load a field of a delimited data file into two different table columns using "POSITION"
SQL*Loader lets you skip unwanted fields in the data file by using the "FILLER" clause. FILLER was introduced in Oracle 8i.
SQL*Loader also lets you load the same field into two different columns of the table.
If the data file is position based, loading the same field into two different columns is straightforward: you can use the POSITION(start_pos:end_pos) keyword.
If the data file is delimited and has a header included in it, this can be achieved by referring to the field preceded by ":", e.g. description "(:emp_name)".
If the data file is delimited without a header included in it, POSITION(start_pos:end_pos) and "(:field)" will not work. This can be achieved using the POSITION(1) clause, which takes the scan back to the beginning of the record.

Here is a working example:
The requirement is to load the field emp_name in the data file into two columns of the table EMP: emp_name and description.

Say you have a table named EMP which has the below structure:
Column            Data Type
emp_num           Number
emp_name          Varchar2(25)
description       Varchar2(25)
department_num    Number
department_name   Varchar2(25)

You are trying to load the below comma delimited data file named eg.dat which has 4 fields that need to be loaded into 5 columns of the table EMP.
eg.dat:
7369,SMITH,7902,Accounting
7499,ALLEN,7698,Sales
7521,WARD,7698,Accounting
7566,JONES,7839,Sales
7654,MARTIN,7698,Accounting

Control File:
LOAD DATA
INFILE 'eg.dat'
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ","
(emp_num,
emp_name,
desc_skip FILLER POSITION(1),
description,
department_num,
department_name)

Explanation of how SQL*Loader processes the above CTL file:
· The first field in the data file is loaded into column emp_num of table EMP.
· The second field in the data file is loaded into column emp_name of table EMP.
· The field desc_skip makes SQL*Loader start scanning the current record from the beginning again, because of the POSITION(1) clause. SQL*Loader re-reads the first delimited field and skips it, as directed by the FILLER keyword.
· SQL*Loader then reads the second field again and loads it into the description column of table EMP.
· SQL*Loader next reads the third field in the data file and loads it into column department_num of table EMP.
· Finally, the fourth field is loaded into column department_name of table EMP.

4) Usage of BOUNDFILLER
BOUNDFILLER is available in Oracle 9i and above and can be used if the skipped field's value will be required again later.

Here is an example:
The requirement is to load the first two fields, concatenated with the third field, into emp_num of table EMP, and the fourth field into emp_name.

Data File:
1,15,7369,SMITH
1,15,7499,ALLEN
1,15,7521,WARD
1,18,7566,JONES
1,20,7654,MARTIN

The requirement can be achieved using the below control file:
LOAD DATA
INFILE 'C:\eg.dat'
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ","
(
rec_skip BOUNDFILLER,
tmp_skip BOUNDFILLER,
emp_num "(:rec_skip || :tmp_skip || :emp_num)",
emp_name
)

5) Load the same record twice into a single table:
SQL*Loader lets you load a record twice using the POSITION clause, but you have to take into account whether the constraints defined on the table allow duplicate rows.

Below is the Control file:
LOAD DATA
INFILE 'eg.dat'
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ","
( emp_num, emp_name, department_num, department_name )
INTO TABLE emp
FIELDS TERMINATED BY ","
(emp_num POSITION(1), emp_name, department_num, department_name)

SQL*Loader processes the above control file this way:
The first "INTO TABLE" clause loads the 4 fields of each record into the respective columns (emp_num, emp_name, department_num, department_name).
Field scanning does not start over from the beginning of the record when SQL*Loader encounters the second INTO TABLE clause in the CTL file; instead, scanning continues where it left off. The clause emp_num POSITION(1) forces SQL*Loader to go back to the beginning of the same record and load the first field into the emp_num column again. The remaining fields of the record are then loaded again into emp_name, department_num and department_name. Thus the same record can be loaded multiple times into the same table using INTO TABLE clauses.

6) Using WHEN to selectively load the records into the table
The WHEN clause directs SQL*Loader to load a record only when the condition specified in the WHEN clause is TRUE. A WHEN clause can have any number of comparisons joined by AND; SQL*Loader does not allow the use of OR in the WHEN clause.

Here is a working example which illustrates how to load the records into 2 tables EMP and DEPT based on the record type specified in the data file.
Below is the delimited data file eg.dat, in which the first field is the record type. The requirement here is to load all records with record type 1 into table EMP, all records with record type 2 into table DEPT, and not to load the record with record type 3, which happens to be the trailer record.
1,7369,SMITH
2,7902,Accounting
1,7499,ALLEN
2,7698,Sales
1,7521,WARD
2,7698,Accounting
1,7566,JONES
2,7839,Sales
1,7654,MARTIN
2,7698,Accounting
3,10

Control File:
LOAD DATA
INFILE 'eg.dat'
APPEND
INTO TABLE emp
WHEN (01) = '1'
FIELDS TERMINATED BY ","
( rec_skip FILLER POSITION(1), emp_num, emp_name )
INTO TABLE dept
WHEN (01) = '2'
FIELDS TERMINATED BY ","
( rec_skip FILLER POSITION(1), department_num, department_name )

Let's now see how SQL*Loader processes the CTL file:
· SQL*Loader loads a record into table EMP only when the first position (01) of the record, which happens to be the record type, is '1', as directed by
INTO TABLE emp
WHEN (01) = '1'
· If the condition WHEN (01) = '1' holds true for the current record, SQL*Loader goes back to the beginning of the record as directed by POSITION(1) and skips the first field, which is the record type.
· It then loads the second field into emp_num and the third field into emp_name in table EMP.
· SQL*Loader loads a record into table DEPT only when the first position (01) of the record is '2', as directed by
INTO TABLE dept
WHEN (01) = '2'
· If the condition WHEN (01) = '2' holds true for the current record, SQL*Loader goes back to the beginning of the record as directed by POSITION(1) and skips the first field, which is the record type.
· It then loads the second field into department_num and the third field into department_name in table DEPT.
· Records with record type '3' are not loaded into any table.
Thus you can selectively load the necessary records into various tables using the WHEN clause.

7) Run SQLLDR from SQL PLUS
SQL*Loader can be invoked from SQL*Plus using the "host" command as shown below:
host sqlldr userid=username/password@host control=C:\eg.ctl log=eg.log

8) Default path for Discard, bad and log files

If the bad and discard file paths are not specified in the CTL file, and the SQL*Loader program is registered as a concurrent program, the files are created in the directory where concurrent programs' output files normally reside. The paths of the discard and bad files are also recorded in the log file of the SQL*Loader concurrent request.