
Thread: How to increase the throughput of records being read from Oracle Source DB

  1. #1
    akanksha.1787 — Junior Member
    Join Date: Nov 2016
    How to increase the throughput of records being read from Oracle Source DB

    Hi Experts,

    Could you please help us understand how we can increase the throughput while reading records from the source Oracle database?
    The throughput we see varies highly, with peaks such as 20k, 4k, 12k, 0, 9k, 0, 21k, 2k, etc.

    The source view has 23 million records, and the target is Azure SQL DW.
    We have only been able to replicate the data successfully when we chose to create a .csv file of size 2 GB on the Blob storage.

    The task has failed for all file sizes between 200 and 1000 MB with the error 'Error in request handler'.

    Your response would be really helpful.
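    One way to narrow the problem down is to check whether the Oracle source itself can sustain a steady read rate, independently of Replicate. Below is a minimal diagnostic sketch for that. It is not part of Replicate; `measure_throughput` is a hypothetical helper name, and the cursor/connection details in the usage comment are placeholders you would substitute with your own.

    ```python
    import time

    def measure_throughput(fetch_batch, batch_limit=None):
        """Drain batches from fetch_batch() and return (rows, rows_per_second).

        fetch_batch is any zero-argument callable that returns a list of rows,
        or an empty list when exhausted -- for example, a bound
        cursor.fetchmany from the python-oracledb driver.
        """
        rows = 0
        batches = 0
        start = time.perf_counter()
        while True:
            batch = fetch_batch()
            if not batch:
                break
            rows += len(batch)
            batches += 1
            if batch_limit is not None and batches >= batch_limit:
                break
        elapsed = time.perf_counter() - start
        return rows, rows / elapsed if elapsed > 0 else float("inf")

    # Example usage against Oracle (placeholders, not runnable as-is):
    #   import oracledb
    #   conn = oracledb.connect(user="...", password="...", dsn="dbhost/orclpdb1")
    #   cur = conn.cursor()
    #   cur.arraysize = 5000          # larger fetch batches usually help
    #   cur.execute("SELECT * FROM src_view")
    #   rows, rate = measure_throughput(cur.fetchmany)
    #   print(f"{rows} rows at {rate:.0f} rows/s")
    ```

    If the raw fetch rate from the view is itself bursty, the bottleneck is on the Oracle side (view complexity, I/O, network) rather than in the Replicate/CloudBeam path.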


  2. #2
    stevenguyen — Senior Member
    Join Date: May 2014
    Hello Akanksha,

    In another thread you posted that CSV files between 200 and 500 MB are working, but files over 500 MB are not.

    In this thread you post that 2000 MB is working and anything below is not.

    1. It looks like you have a connection issue between your Replicate server, the Attunity CloudBeam server, and Azure; please double-check all of your connections.

    2. What version of Replicate are you running?

    3. What version of Replicate Client is running on the Attunity Cloudbeam Azure server?
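    For point 1, a quick sanity check is to verify that each hop (Replicate server → CloudBeam server → Azure endpoint) is reachable on its service port. A minimal sketch, assuming nothing about Replicate itself — the host names and ports in the example call are placeholders for your environment:

    ```python
    import socket

    def check_tcp(host, port, timeout=5.0):
        """Return True if a TCP connection to host:port succeeds within timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            # Covers DNS failure, connection refused, and timeout.
            return False

    # Example (placeholder host/port -- substitute your CloudBeam server):
    #   print(check_tcp("cloudbeam-vm.example.net", 443))
    ```

    A consistent `False` on one hop points at a firewall/NSG rule or a stopped service, which would also explain tasks failing only at certain file sizes if transfers time out mid-stream.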
