Thursday, 30 March 2017

OBIEE 11g Config and log file locations

bi_server1.log  &  bi_server1-diagnostic.log 
[$FMW_HOME]/user_projects/domains/bifoundation_domain/servers/bi_server1/logs
nqserver.log    &    nqquery.log
[$FMW_HOME]/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
sawlog.log
[$FMW_HOME]/instances/instance1/diagnostics/logs/OracleBIPresentationServicesComponent/coreapplication_obips1
NQSConfig.INI   &  DBFeatures.INI
[$FMW_HOME]/instances/instance1/config/OracleBIServerComponent/coreapplication_obis1
instanceconfig.xml      &     credentialstore.xml
[$FMW_HOME]/instances/instance1/config/OracleBIPresentationServicesComponent/coreapplication_obips1
config.xml
[$FMW_HOME]/user_projects/domains/bifoundation_domain/config
system-jazn-data.xml
[$FMW_HOME]/user_projects/domains/bifoundation_domain/config/fmwconfig

Tuesday, 28 February 2017

mount windows folder in linux

mount -t smbfs -o username=pc,password=**** //192.168.1.15/share /home/mount-point
Note: the Windows folder path (//server/share) comes before the local mount point; on newer kernels use -t cifs instead of -t smbfs
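To make the mount survive a reboot, an /etc/fstab entry of the same shape can be used (server, share name and mount point below are illustrative; newer kernels use the 'cifs' filesystem type rather than 'smbfs'):

```
# /etc/fstab entry (hypothetical server/share; credentials as in the mount command)
//192.168.1.15/share   /home/mount-point   cifs   username=pc,password=****   0   0
```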

BASIC UNIX COMMANDS

1. tty - reveals the current terminal
2. whoami - reveals the currently logged-in user
3. which - reveals where in the search path a program is located
4. echo - prints to the screen
 a. echo $PATH - dumps the current path to STDOUT
 b. echo $PWD - dumps the contents of the $PWD variable (the current directory)
 c. echo $OLDPWD - dumps the most recently visited directory

5. clear - clears the screen or terminal
6. history - reveals your command history
 a. command history is maintained on a per-user basis via:
  ~/.sh_history
~ = user's $HOME directory
7. pwd - prints the working directory
8. cd - changes directory to desired directory
 a. 'cd ' with no options changes to the $HOME directory
 b. 'cd ~' changes to the $HOME directory
 c. 'cd /' changes to the root of the file system
 d. 'cd ..' changes us one-level up in the directory tree
 e. 'cd ../..' changes us two-levels up in the directory tree

9. ls - lists files and directories
 a. ls / - lists the contents of the '/' mount point
 b. ls -l - lists the contents of a directory in long format:
 Includes: permissions, links, ownership, size, date, name
 c. ls -ld /etc - lists properties of the directory '/etc', NOT the contents of '/etc'
 d. ls -ltr - sorts chronologically, oldest first (newest at the bottom)
 e. ls -a - reveals hidden files. e.g. '.sh_history'
Note: files/directories prefixed with '.' are hidden. e.g. '.sh_history'

10. cat - concatenates files
 a. cat 123.txt - dumps the contents of '123.txt' to STDOUT
 b. cat 123.txt 456.txt - dumps both files to STDOUT
 c. cat 123.txt 456.txt > 123456.txt - creates a new concatenated file

11. mkdir - creates a new directory
 a. mkdir test - creates a 'test' directory

12. cp - copies files
 a. cp 123.txt test/

13. mv - moves files
 a. mv 123456.txt test/ - moves the file, preserving timestamp

14. rm - removes files/directories
 a. rm 123.txt
 b. rm -rf 456.txt - removes recursively and forcefully, with no confirmation prompt

15. touch - creates blank file/updates timestamp
 a. touch test.txt - will create a zero-byte file, if it doesn't exist
 b. touch 123456.txt - will update the timestamp
 c. touch -t 201003221530 123456.txt - changes timestamp

16. stat - reveals statistics of files
 a. stat 123456.txt - reveals full attributes of the file

17. find - finds files using search patterns
 a. find / -name 'fstab'
Note: 'find' can search for fields returned by the 'stat' command

18. alias - returns/sets aliases for commands
 a. alias - dumps current aliases
 b. alias copy='cp'


###Unix Redirection & Pipes###
Features:
 1. Ability to control input and output

Input redirection '<':
 1. cat < 123.txt
Note: Use input redirection when a program does NOT accept a file name as an argument and reads STDIN instead

Output redirection '>':
 1. cat 123.txt > onetwothree.txt
Note: Default nature is to:
 1. Clobber the target file
 2. Populate with information from input stream


Append redirection '>>':
 1. cat 123.txt >> numbers.txt - creates 'numbers.txt' if it doesn't exist, or appends if it does

 2. cat 456.txt >> numbers.txt


Pipes '|':
Features: Connects the output stream of one command to the input stream of a subsequent command

 1. cat 123.txt | sort
 2. cat 456.txt 123.txt | sort
 3. cat 456.txt 123.txt | sort | grep 3
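The pipelines above assume '123.txt' and '456.txt' already exist; a self-contained run of the same idea (file contents are made up for illustration):

```shell
printf '3\n1\n2\n' > 123.txt     # sample files (contents are illustrative)
printf '6\n4\n5\n' > 456.txt

cat 456.txt 123.txt | sort            # merges and sorts: 1 2 3 4 5 6
cat 456.txt 123.txt | sort | grep 3   # second stage keeps only lines containing '3'
```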


###Command Chaining###
Features:
 1. Permits the execution of multiple commands in sequence
 2. Also permits execution based on the success or failure of a previous command

 1. cat 123.txt ; ls -l - runs the first command, then the second, without regard for the exit status of the first

 2. cat 123.txt && ls -l - runs the second command only if the first succeeds
 3. cat 1234.txt && ls -l

 4. cat 123.txt || ls -l - runs the second command only if the first fails
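The same operators can be seen with commands whose exit status is known in advance ('true' always succeeds, 'false' always fails):

```shell
echo "one" ; echo "two"            # ';'  runs both commands, regardless of exit status
true  && echo "first succeeded"    # '&&' runs the second only on success (exit status 0)
false || echo "first failed"       # '||' runs the second only on failure (non-zero status)
```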


20. more|less - paginators, which display text one page at a time
 1. more /etc/fstab
 2. less 1thousand.txt

21. seq - echoes a sequence of numbers
 a. seq 1000 > 1thousand.txt - creates a file with numbers 1-1000

22. su - switches users
 a. su - with no options attempts to log in as 'root'

23. head - displays opening lines of text files
 a. head /var/log/messages

24. tail - displays the closing lines of text files
 a. tail /var/log/messages

25. wc - counts words and optionally lines of text files
 a. wc -l /var/log/messages
 b. wc -l 123.txt

26. file - determines file type
 a. file /var/log/messages
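Commands 21 and 23-26 can be tried together on one sample file built with seq (file name is illustrative):

```shell
seq 1000 > 1thousand.txt       # sample file: numbers 1-1000, one per line

head -3 1thousand.txt          # first three lines: 1 2 3
tail -3 1thousand.txt          # last three lines: 998 999 1000
wc -l 1thousand.txt            # line count: 1000 1thousand.txt
file 1thousand.txt             # reports the type, e.g. ASCII text
```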


###Tar, Gzip###
Features:
 1. Compression utility (gzip)
 2. File roller (the ability to represent many files as one - tar)


Gzip:
Includes:
 1. gzip - compresses/decompresses files
 2. gunzip - decompresses gzip files

Tasks:
 1. compress '1million.txt' file using gzip
  a. gzip -c 1million.txt > 1million.txt.gz

Note: By default, gzip replaces the original file; '-c' writes to STDOUT instead, so the original is kept

  b. gzip -l 1million.txt.gz - returns status information
  c. gunzip 1million.txt.gz - dumps to file, and removes compressed version
  d. gzip -d 1million.txt.gz
  e. zcat 1million.txt.gz - dumps the contents to STDOUT
  f. less 1million.txt.gz - dumps the contents of gzip files to STDOUT
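A full round trip of the tasks above, with the sample file built by seq (file name is illustrative):

```shell
seq 1000000 > 1million.txt               # sample file: numbers 1-1000000

gzip -c 1million.txt > 1million.txt.gz   # -c writes to STDOUT, so the original is kept
gzip -l 1million.txt.gz                  # compressed/uncompressed sizes and ratio
gunzip -c 1million.txt.gz | tail -3      # peek at the end without writing a new file
```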


Tar :

 1. tar -cvf filename.tar path/ - creates a non-compressed archive
 2. tar -cvf 1million.txt.tar 1million.txt
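Listing and extracting follow the same flag pattern; a round trip with a hypothetical payload directory:

```shell
mkdir -p path && echo hello > path/a.txt   # sample payload (names are illustrative)

tar -cvf filename.tar path/        # c=create, v=verbose, f=name the archive file
tar -tvf filename.tar              # t=list the archive's contents without extracting
tar -czvf filename.tar.gz path/    # z adds gzip compression while archiving

mkdir -p restore && tar -xvf filename.tar -C restore   # x=extract, into ./restore
```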

###GREP###
Features:
 1. The ability to parse lines based on text and/or RegExes
 2. Post-processor
 3. Searches case-sensitively, by default
 4. Searches for the text anywhere on the line


1. grep 'linux' grep1.txt
2. grep -i 'linux' grep1.txt - case-insensitive search
3. grep '^linux' grep1.txt - uses '^' anchor to anchor searches at the beginning of lines
4. grep -i '^linux' grep1.txt
5. grep -i 'linux$' grep1.txt - uses '$' anchor to anchor searches at the end of lines

Note: Anchors are RegEx characters (meta-characters). They're used to match at the beginning and end of lines

6. grep '[0-9]' grep1.txt - returns lines containing at least 1 number
7. grep '[a-z]' grep1.txt


8. grep -v sshd messages - performs an inverted search (all but 'sshd' entries will be returned)
9. grep -v sshd messages | grep -v gconfd
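The searches above assume 'grep1.txt' exists; a self-contained run of the same patterns (file contents are made up for illustration):

```shell
printf 'linux rocks\nI use Linux\n123 numbers\nends with linux\n' > grep1.txt

grep 'linux' grep1.txt       # case-sensitive: lines 1 and 4 only
grep -i 'linux' grep1.txt    # case-insensitive: also matches 'I use Linux'
grep '^linux' grep1.txt      # anchored at line start: only 'linux rocks'
grep -i 'linux$' grep1.txt   # anchored at line end: only 'ends with linux'
grep '[0-9]' grep1.txt       # only '123 numbers'
grep -v 'linux' grep1.txt    # inverted: every line WITHOUT 'linux'
```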

Friday, 17 February 2017

DAC errors and solutions while doing Full Load first time for the BI Apps (OBIA)

Changes for the Errors :
1.       SDE_ORA_GL_LinkageInformation_Extract
Cause : Temp Tablespace Problem
Solution : Added 30GB to the temp tablespace in both (Apps and Data Warehouse) databases for 1 year of data (more temp space is required when loading more years)

2.       SDE_ORA_GLSegmentDimension
Error : Value larger than specified precision allowed for this column.
Temp Solution : Alter the SEGEMENT_VAL_CODE column size from 50 to 150.
Solution : Check the SR for the solution

3.       SDE_ORA_CodeDimension_FND_Flex_Mcat
Error : 1- The task was not added to the parameters file
            2- $$PROD_CAT_SET_ID1 - Inventory
Solution : Add the task to the parameter file (copy the parameter entry of a similarly named task and paste it with the required name).
And set $$PROD_CAT_SET_ID = 1 and $$INV_PROD_CAT_SET_ID = 1

4.       SDE_ORA_EmployeeExpenseFact_FP_Full
Need Patch on EBS – Patch 11i.FIN_PR/11i.OIE.I family Pack (Not required)
Solution – Inactivate the Employee Expense subject area

5.        SDE_ORA_PeggingDetailsFact and SDE_ORA_PeggingDetailsFact_ASCP
Error : Invalid Identifier
Solution : Inactivate the task

6.       SDE_ORA_Project_Hierarchy and SDE_ORA_Project
(Subject Area – Planning Analysis)
Error : Invalid Identifier
Solution : Inactivate the task

7.       SIL_GLSegmentDimension
Error : Index creation failed
Solution : Create Index manually and mark the task as completed



8.       Subject Area : Supply Chain – BOM Item
Tasks : SDE_ORA_BOMHeaderDimension
                SDE_ORA_BOMHeader_Auxillary
                SDE_ORA_BOMItem_Fact
ERROR : Invalid Identifier
Solution : Patch required on EBS

9.       SDE_ORA_ProductDimension_Derive_Full
Error : Value too large
Solution : Double the column length in mapping and in target table.

10.   SDE_ORA_CustomerAccountDimension_Full
Error : Value too large
Solution : Double the column length in mapping and in target table.

11.   SDE_ORA_SalesProductDimension_Full
Error : Value too large
Solution : Double the column length in mapping and in target table

12.   SDE_ORA_CustomerFinincialProfileDimension_ProfileAmts_Full
Error : Value too large
Solution : Double the column length in mapping and in target table

13.   SIL_BOM_ItemFact_Full
Error : An error occurred executing the stored procedure
                PLS – 00201 : identifier ‘COMPUTE_BOUNDS_ORA11i’ must be declared
Solution : Navigate to the OracleBI\dwrep\Informatica\Stored_Procedure folder, copy the code from COMPUTE_BOUNDS_ORA11i.sql, and compile (run) it in the Data Warehouse database.

14.   $$SET_OF_BOOKS_TYPE_LIST = ‘N’

15.   Subject Area - SDE_ORA_ProductDimension_Full
Task : SDE_ORA_ProductCategories_Resolve
Error : [Error(transformation error)]
Solution : Remove the default value [Error(transformation error)] from the ports created_by_ID and change_by_ID in the update strategy Upd_Strategy_W_ORA_Product_DS_Tmp_Ins_upd

16.   SDE_ORA_GLBalanceFact
Error : ORA-01488  Value larger than specified precision allowed for this column

17.   SDE_ORA_UOMConversionGeneral_InterClass
Cause : Takes a long time to run (40-50 min) and then fails without any error
Source table – MTL_UNITS_OF_MEASURE
                                MTL_UOM_CLASS_CONVERSIONS


Changes as per SR :
Cause : Succeeded in Informatica monitor but failed in DAC
Solution : Delete the Parameter $$YEAR_START from mappings and from the Tasks
a.       PLP_ARXactGroupAccount_A2_Load
b.      PLP_GLCogsGroupAccount_A2_Load
c.       PLP_GLOtherGroupAccount_A2_Load
d.      PLP_GLRevenGroupAccount_A2_Load

How to set an OBIEE logging level

You might want to diagnose performance or data issues by setting a temporary log level for a query. You can enable query logging for a specific query by preceding your Select statement with the following:
Set Variable LOGLEVEL=n;
This instruction sets the loglevel system session variable only for this SQL.
See the following example:
Set Variable LOGLEVEL=5; select year, product, sum(revenue) from time, products, facts
For this query, the logging level of five is used regardless of the value of the underlying LOGLEVEL variable.
Available in 11G
You can set a default logging level in the repository options:

This option determines the default query logging level for the internal BISystem user.
A query logging level of 0 (the default) means no logging. Set this logging level to 2 to enable query logging for internal system processes like event polling and initialization blocks.
By setting the loglevel system session variable via an initialization block, you can control the loglevel of each user.
The loglevel must be an integer datatype
The session variable LOGLEVEL overrides a user's logging level. For example, if the Oracle BI Administrator has a logging level defined as 4 and LOGLEVEL is defined as default 0 (zero) in the repository, the Oracle BI Administrator's logging level will be 0.
1. In the Administration Tool, select Manage > Security (10g) or Identity (11g). The Security Manager dialog box appears.
2. Double-click the user's user ID. The User dialog box appears.
3. Set the logging level by clicking the Up or Down arrows next to the Logging Level field.


You cannot configure a logging level for a group.

Commands to export and import databases and user schemas

The following example shows how to export a full database
$exp USERID=scott/tiger FULL=y FILE=myfull.dmp

To export objects stored in particular schemas, run the export utility with the following arguments:
$exp USERID=scott/tiger OWNER=(SCOTT,ALI) FILE=exp_own.dmp

As a DBA user, perform a full export from the source database, for example:
> exp system/manager FULL=y FILE=myfullexp.dmp

As a DBA user, perform a full import with the IGNORE parameter enabled:
> imp system/manager FULL=y IGNORE=y FILE=myfullexp.dmp
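The same pair of utilities can also move a single schema between databases; imp accepts FROMUSER/TOUSER to remap the owner (schema names below are illustrative, and a reachable Oracle instance is assumed):

```
$exp USERID=scott/tiger OWNER=SCOTT FILE=exp_scott.dmp
$imp USERID=system/manager FROMUSER=SCOTT TOUSER=ALI FILE=exp_scott.dmp IGNORE=y
```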