Author Archives: Asaf Tal

A short reminder – ORA-19279: XPTY0004 – XQuery dynamic type mismatch: expected singleton sequence – got multi-item sequence

ORA-19279: XPTY0004 - XQuery dynamic type mismatch: expected singleton sequence - got multi-item sequence
19279. 00000 -  "XQuery dynamic type mismatch: expected singleton sequence - got multi-item sequence"

A short reminder: this message may look intimidating, but this is one of the rare cases where the Oracle error text actually explains the problem well:

*Cause: The XQuery sequence passed in had more than one item.
*Action: Correct the XQuery expression to return a single item sequence.

Basically, it means that there is more than one instance of the tag you are looking for at the level you are looking at. Or, in plain English: duplicate nodes.

For example, mapping a column in an XMLTABLE query with

ITEMS varchar2(10) PATH '//*:ITEM'

fails when the document contains more than one ITEM node at the matched level:

ORA-19279: XPTY0004 - XQuery dynamic type mismatch: expected singleton sequence - got multi-item sequence
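The original XML sample was lost to the blog's code editor, so here is a hedged reconstruction (the ROOT/CODES/ITEM element names and the XMLTABLE shape are assumptions) that reproduces the error:

```sql
-- Hypothetical document: two CODES nodes, each holding one ITEM.
-- Driving XMLTABLE from the document root makes '//*:ITEM' match
-- both ITEM nodes for a single row, which raises ORA-19279.
select x.items
from   xmltable('/*:ROOT'
         passing xmltype('<ROOT>
                            <CODES><ITEM>A</ITEM></CODES>
                            <CODES><ITEM>B</ITEM></CODES>
                          </ROOT>')
         columns items varchar2(10) path '//*:ITEM') x;
```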

One solution is to point the XQuery_string to the correct level. In our example, all you need to do is start the query at the CODES level.




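Under the same assumed document, a hedged sketch of the fix is to make XMLTABLE iterate at the CODES level, so each row sees exactly one ITEM:

```sql
-- One row per CODES node; the same PATH now matches a single ITEM per row.
select x.items
from   xmltable('//*:CODES'
         passing xmltype('<ROOT>
                            <CODES><ITEM>A</ITEM></CODES>
                            <CODES><ITEM>B</ITEM></CODES>
                          </ROOT>')
         columns items varchar2(10) path '//*:ITEM') x;
-- returns two rows: A and B
```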


Informatica – FATAL ERROR : Signal Received: SIGSEGV (11)

While running a relatively simple Informatica mapping I got the following error.
Severity: FATAL
Message Code:
Message: *********** FATAL ERROR : Caught a fatal signal or exception. ***********
Message Code:
Message: *********** FATAL ERROR : Aborting the DTM process due to fatal signal or exception. ***********
Message Code:
Message: *********** FATAL ERROR : Signal Received: SIGSEGV (11) ***********

Basically, errors like this mean that something went terribly wrong and the Informatica server did not know how to handle it. In many cases you will need to contact Informatica support for a solution.

Various reasons can cause this kind of error:

The tnsnames.ora file used by the Oracle client installed on the PowerCenter server machine can get corrupted. If this is the case, other workflows using the same TNS entry would probably break as well. The solution is to recreate the tnsnames.ora file (or contact Oracle support).

Another cause for the SIGSEGV fatal error crash is a problem with the DB drivers on the Informatica server. Many users report this issue with Teradata and Oracle drivers.

The above causes would probably be system-wide and would create problems in many workflows. In my case, only the process I was working on had this problem while all other workflows were running without a problem, so I could assume the problem was limited to something specific in my mapping. I looked at the log and found that the crash happened immediately after the following entry:

Message Code: PMJVM_42009
Message: [INFO] Created Java VM successfully.

This led me to the conclusion that this problem is Java related.

It is possible that the Oracle libraries/symbols are loaded first and there is a conflict of symbols between the Java libraries and the Oracle client libraries.
In this case, Informatica suggests setting the following environment variable:
LD_PRELOAD = $INFA_HOME/java/jre/lib/amd64/

The problem with setting environment variables is that it is a system-wide change, and there is always a risk of breaking something else. Changing a setting at the workflow or session level is always preferred; that way, if something breaks, the problem is contained.

While looking for a workflow-level solution, I came across the following article suggesting that the problem is a result of a mismatch between the Linux Java version and the PowerCenter version.

“This is an issue with the Oracle JDK 1.6 in combination with RHEL Linux. Java txn, and in turn the session is impacted.

The JDK has a Just In Time (JIT) compiler component that works towards optimizing Java bytecode to underlying native assembly code. Assembly code is native and is faster than Java byte code.

The crash occurs in this JIT generated assembly code, which means that native debuggers will not be able to resolve the symbols.”

There are two solutions to the problem. One is forcing the session (DTM) to spawn the Java transformation with JDK 1.7 rather than JDK 1.6. There are details in the above link but, again, this is a major change with many possible implications.

The second (and simpler) option is disabling the JDK's Just-In-Time compiler.

You can do this at the session level by adding the following JVM option as a custom property to the IS (in Edit task – Config Object – Custom Properties):
JVMOption1 = -Xint
It is important to understand that this means no optimization is performed on the Java transformation's bytecode, so a performance hit is expected in the Java transformation alone. Therefore, if the Java transformation is performance sensitive, you might want to consider the first solution instead.

Informatica – [PCSF_46002] Failed to communicate with the server

After serving me well for several years, my Informatica client suddenly stopped working.
I got the following error when trying to connect to one of my domains.

Error: [PCSF_46002] Failed to communicate with the server at [http://[server]:[port]/coreservices/DomainService].

Since it took me a good few hours to find the solution, I will describe all my failed attempts as well as the actual solution in my case. I hope it will save someone (probably me) some time in the future. (TL;DR – delete the proxy environment variables.)

Verify that the server is working – In my case, the domain I was trying to connect to was production, so the first thing I did was verify that the server was up and the integration service was running. Very quickly I found that all the target tables were still being updated as expected. Knowing that the server was running allowed me to proceed with less stress.

Verify that the Informatica Administration console is available.
Since I was able to connect to the Informatica Administration web console at http://server:port/administrator/#admin, I concluded that the problem was on the client side. Luckily, I was able to confirm this by connecting to the domain from one of my colleagues' workstations. Only then did I understand that I could not connect to any domain, so the problem was on my machine without a doubt.

On the client, I followed the Informatica KB suggestion, clicked on "Configure Domains", and tried to delete the domain entries and recreate them. This time I got an almost identical message:

Unable to save information for domain .
Error: [PCSF_46002] Failed to communicate with the server at [http://[server]:[port]/coreservices/DomainService].

Since the message started with "Unable to save", the next suspect was a permissions problem in the OS. I verified that I had permission to write to the INFA_HOME location (\clients\PowerCenterClient\domains.infa in my case). After some more googling, I also tried to manually edit the domains.infa file and created an INFA_DOMAINS_FILE environment variable pointing to it. Still no success.

The next step was to check network connectivity. I verified that I was able to ping and telnet to the server. Since pinging the computer was successful, my only remaining option was to verify that no firewall or proxy was blocking me. While searching the knowledge base for how to check the proxy settings, I came across this article,
from which I learned that setting the following environment variables can interfere with the attempts to connect to the domain:
• http_proxy
• https_proxy
Only then did I remember that, as part of the Docker installation on my PC, I had set the HTTP_PROXY environment variable. I deleted it (on Windows: My Computer – Properties – Advanced system settings – Environment Variables), and after a restart my Informatica client came back to life. I wish the error messages were a little clearer, but I hope this blog post will help.

Short and sweet way to increase Sequence value to be higher than the max value in the table

Many times during development there is a need to copy/export/backup table data between environments or stages. While it is easy to export the data using SQL inserts, it is also very easy to forget the sequences related to the tables.
Sometimes this results in a DUP_VAL_ON_INDEX exception (ORA-00001) when you use NEXTVAL in your insert: the table already contains high values but the sequence is lower than the max value in the table.
Therefore, it is important to remember to increment each sequence to a value higher than the max value in the table.

The most common way to increase the sequence value to match the table is to:

1) Alter the sequence increment to the difference between the current value of the sequence and the max value in the table:

alter sequence sequence-name increment by difference;

2) Issue a dummy NEXTVAL request:

select sequence-name.nextval from dual;

3) Alter the sequence increment back to the original increment (assuming it was 1):

alter sequence sequence-name increment by 1;
Another option, of course, is to drop the sequence and recreate it with the required START WITH value, but this may invalidate objects referencing it, and it will force you to re-apply all grants and permissions.
Both methods above will work, but not everyone has permission to alter the sequence and, in addition, DDL operations are always risky. So, if you are looking for a short and sweet (or quick and dirty) way to increase the sequence value, look no further:

select level, sequence-name.nextval
from dual
connect by level <= (select max(column-using-the-sequence) from table-name);
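If your Oracle version rejects the subquery inside the CONNECT BY clause (ORA-01473), a hedged variant with hypothetical names (orders_seq, orders, order_id) hoists the MAX() into an inline view:

```sql
-- Same trick, but MAX() is computed in the FROM clause so the
-- CONNECT BY condition references a plain column, not a subquery.
select level, orders_seq.nextval
from   (select max(order_id) as mx from orders)
connect by level <= mx;
```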

Run a function with a table of User Defined Type (UDT) as a parameter using SQLPlus

If you want to send multiple records to an Oracle function, a common practice is to create the function with one or more parameters typed as a table of an object type.

In the function itself, you can loop through the collection or select from it using the TABLE operator.

Another option to send an array of multiple records is to use XML (xmltype) as the function parameter but this will be addressed in a separate post.

While I have used this method for years, I somehow find myself struggling with the relatively simple syntax time and again. This post will serve me (and maybe some of you) as a short, very simplistic template to copy from.

Basically, what you need to do is create a user-defined object type (UDT):

create or replace type order_type as object (
  num_of_units number
, cost_of_unit number);
/

Then you create a table type of this user-defined type:

create or replace type order_tbl as table of order_type;
/

A simple function:

create or replace function calc_total (i_orders order_tbl) return number is
  res number;
begin
  select sum(num_of_units * cost_of_unit)
    into res
    from table(i_orders);

  return res;
end;
/

Testing this function from SQL*Plus is a little challenging. The simplest way to send a table of a UDT to a function is to cast a MULTISET to the user-defined table type.

SQL>   select  calc_total(
  2                      cast(MULTISET(
  3                        select * from
  4                        (
  5                        select 9 num_of_units, 2 cost_of_unit from dual union all
  6                        select 6, 3  from dual
  7                        ) ) as order_tbl   ))
  8    from dual;

The reason for the additional inline view is a weird Oracle bug that causes the query to hang (and sometimes crash with ORA-03113 or ORA-00600) when CAST and MULTISET are used together with UNION. The inline view is a fast workaround.
Another way to work around this problem is to use MULTISET UNION:

SQL>    select  calc_total( (cast(multiset(
  2                               select 9,2 from dual
  3                                MULTISET UNION
  4                               select 6,3 from dual
  5                           ) as order_tbl   ))
  6                       )
  7     from dual;
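If the SQL-level CAST/MULTISET syntax keeps tripping you up, a hedged alternative (using the same types and function defined above, with made-up values) is to build the collection in an anonymous PL/SQL block instead:

```sql
set serveroutput on

declare
  -- constructor syntax: one order_type instance per record
  l_orders order_tbl := order_tbl(order_type(9, 2), order_type(6, 3));
begin
  dbms_output.put_line(calc_total(l_orders));  -- prints 36 (9*2 + 6*3)
end;
/
```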