Integrations and APIs for Service



    Kevin Ngo
    Issue with limitation of number records returned using execute report
    Topic posted April 30, 2010 by Kevin Ngo, last edited October 29, 2011
    2866 Views, 23 Comments


    I am aware of the issue with the limit on the number of rows returned from a report, and I get around it by making multiple calls to the report based on a sorted, filtered unique column with a condition of ID > max ID. However, is there a way to deal with the Prod2Quote table, which has no primary key or ID and has multiple quote IDs and product IDs? Please help.
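      (For reference, the ID-based paging workaround described above amounts to a loop like the following sketch. `run_report` is a hypothetical stand-in for whatever call executes the report with an `ID > max ID` filter; it is not a real RightNow API.)

```python
def fetch_all_rows(run_report, page_size=1000):
    """Page through a report by filtering on a sorted, unique ID column.

    run_report(min_id, limit) is a hypothetical stand-in for the call that
    executes the report with a filter of "ID > min_id", sorted ascending by
    ID, returning at most `limit` rows as (id, data) tuples.
    """
    all_rows = []
    last_id = 0
    while True:
        batch = run_report(last_id, page_size)
        if not batch:
            break
        all_rows.extend(batch)
        last_id = batch[-1][0]  # highest ID seen becomes the new floor
        if len(batch) < page_size:
            break  # a short page means we have reached the end
    return all_rows
```

      This is exactly why a table without a unique ID column (like Prod2Quote) is a problem: there is no column to sort on and advance past.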




    • willy

      Here's my superduper awesome secret I just figured out on a project I am working on. While Ryan's idea of looping and making multiple requests to get the data works just fine (I have used it in some of our GA code), it is slow. Very slow.

      It dawned on me that you might be able to concatenate all the column data into just one column, and return that. This would allow you to get all of the "cells" in a particular row. The magic guy here is: concat_ws()

      Once I got that working, it dawned on me that if you can concatenate multiple columns together for the row, perhaps you can concatenate multiple rows together and return the entire result set as the first item. The magic guy here is: group_concat()

      Between the two of these, you can get the entire result set in one fast request. Obviously there is some parsing you have to do on the data, but in my testing I went from 12.04 seconds to retrieve the items I needed using the "looping" method to 0.6 seconds using a combination of group_concat and concat_ws.

      As an example, perhaps you wanted to get the report definition, i.e., the column names and types for a specific report. Using the two MySQL functions above, you might write a query like:

              string _GetReportDefinitionForReport = "SELECT (SELECT group_concat(concat_ws('|', val, type)) " +
                                                                            "FROM ac_columns " +
                                                                            "WHERE ac_id = '{0}') val FROM ac_columns";

      And then in your code, you'd parse it like (the line that executes the SQL was garbled in the original post; ExecuteSQL below is a placeholder name for the RNOWUtil helper that runs the query):

              // ExecuteSQL is a placeholder: the actual helper name was
              // lost from the original post.
              object value = RNOWUtil.ExecuteSQL(
                  String.Format(_GetReportDefinitionForReport, reportId),
                  RNOWUtil.SQLReturnType.STRING);
              if (value != null)
              {
                  // group_concat joins rows with ',', concat_ws joined the cells with '|'
                  string[] columnsRaw = value.ToString().Split(',');
                  foreach (string columnRaw in columnsRaw)
                  {
                      string[] columnInfo = columnRaw.Split('|');

                      string columnName = columnInfo[0];
                      string columnType = columnInfo[1];   // This actually doesn't work. I learned after the fact that you can't get this info...
                  }
              }



      (Now that I've shared my ultra-awesome secret, please don't remove that functionality, RightNow!)
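      (The parsing willy describes is just two nested splits. A minimal sketch in Python, assuming the default group_concat row separator `,` and the `|` cell separator from the query above:)

```python
def parse_concat_result(raw, row_sep=',', cell_sep='|'):
    """Split a group_concat()/concat_ws() result back into rows of cells.

    group_concat joins rows with ',' by default, and concat_ws joined the
    cells with '|' in the query above. Note this breaks if the data itself
    contains either separator; choosing unlikely separators (group_concat
    supports SEPARATOR '...') is the usual mitigation.
    """
    return [row.split(cell_sep) for row in raw.split(row_sep)]

# e.g. parse_concat_result("name|string,created|datetime")
#   -> [["name", "string"], ["created", "datetime"]]
```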

    • ryszard antonczyk

      That's really awesome!

      Thank you.



    • Ryan McCullough

      Keep in mind there is a 4k character max on the return value of

    • willy

      "Keep in mind there is a 4k character max on the return value of"

      ...Now that is a good point.

    • Ryan McCullough

      It is actually a 4k byte limit, so if you are using UTF8 characters, it would be potentially less than 4k characters returned.

      I believe the server will truncate the return value at 4k bytes of data.
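      (Ryan's distinction between bytes and characters matters as soon as non-ASCII data shows up. A quick illustration, assuming "4k" means 4096 bytes:)

```python
MAX_BYTES = 4096  # "4k" assumed to mean 4096 bytes; the server truncates past this

def fits_in_limit(s, max_bytes=MAX_BYTES):
    """A string of N characters may occupy more than N bytes once UTF-8 encoded."""
    return len(s.encode('utf-8')) <= max_bytes

ascii_text = 'a' * 4096   # exactly 4096 bytes: fits
accented   = '\u00e9' * 4096   # each 'é' is 2 bytes in UTF-8: 8192 bytes, would be truncated
```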

    • willy

      Still- for relatively small queries, this is a very efficient way to go.

    • Kevin Ngo


      Speaking of the limit on characters returned, what limit or exact number of rows should I set for execute report? I have tried to run a report in Report Analytics with a few columns, and it successfully returns 100,000 rows. However, if a report has more than 10 columns, it causes a problem.



    • Ryan McCullough

      There is a section in the RN Connect Data Integration developer guide talking about server side limitations:

      3.10 Server Limitations- CPU time and memory limits

      The server has memory and CPU time limits (256mb of memory or 5 minutes of CPU time) in place to protect against runaway processes and abuse. If the limit is reached, the server will kill the process. When a process created as part of a RightNow Connect request is killed, the server will return an incomplete response.

      It is rare to hit the memory limit in real-time mode. If you are in “Batch Mode”, you should take care to break your operations into multiple transaction blocks. As the server processes your request, it keeps the current transaction block as well as the previous one (if applicable) in memory. The server imposes a hard limit of 1000 operations per transaction block, but we recommend you limit your operations to 50 or less per transaction block to avoid exceeding the server memory limit.
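      (The batch-mode advice above amounts to chunking your operation list; a sketch of splitting operations into transaction blocks of at most 50, per the guide's recommendation:)

```python
def chunk_operations(operations, block_size=50):
    """Split a flat list of operations into transaction blocks.

    50 per block is the guide's recommendation; 1000 is the hard server
    limit on operations per transaction block.
    """
    assert block_size <= 1000, "server imposes a hard limit of 1000 ops/block"
    return [operations[i:i + block_size]
            for i in range(0, len(operations), block_size)]
```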

      Reaching the CPU time limit is rare.

      So, it could be that you are hitting the 256 MB memory limit and Apache is killing the process.