Big Data

Topic

    Pavel Brunarskiy
    BigDataSQL memory consumption
    Topic posted October 25, 2019 by Pavel Brunarskiy, tagged Big Data SQL, Hadoop 
    Title:
    BigDataSQL memory consumption
    Summary:
    How to limit BigDataSQL shared memory consumption?
    Content:

    We tried to install BD SQL 4.0 on a CDH cluster that has about 642 GB of memory on each Data Node. The installation process fails when it tries to start BD_CELL with the following error:

     

    Failed to allocate EXTRA SysV segment of 248545 MB, exceeding system SHMALL limit of 197923328 pages (773138 MB) or SMMNI limit of 4096 segments.

    [RS] Monitoring process /opt/oracle/cell/cellsrv/bin/bdsqlrsomt (pid: 3755286) returned with error: 161

    Errors in file /opt/oracle/cell/log/diag/bdsql/cell/amaterasu2/trace/bdsqlsrvtrc_3755288_main.trc  (incident=185):

    ORA-00600: internal error code, arguments: [main::ocl_lib_serv_init2], [30], [Shared memory create failure], [28], [No space left on device], [ocl_shmem.c], [1456], [], [], [], [], []
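
    For reference, the SHMALL value in the message is expressed in kernel pages. The sketch below is our own diagnostic (not part of the BDS tooling; the 248545 MB figure is simply copied from the trace above). It reads the SysV shared memory limits from /proc and converts SHMALL to MB, so the numbers can be checked directly on a Data Node:

    import os

    PAGE_SIZE = os.sysconf("SC_PAGE_SIZE")        # typically 4096 bytes on x86_64

    def read_int(path):
        # Kernel SysV IPC limits are plain integers under /proc/sys/kernel
        with open(path) as f:
            return int(f.read().strip())

    shmall = read_int("/proc/sys/kernel/shmall")  # max total shared memory, in pages
    shmmax = read_int("/proc/sys/kernel/shmmax")  # max size of a single segment, in bytes
    shmmni = read_int("/proc/sys/kernel/shmmni")  # max number of segments

    print("SHMALL: %d pages = %d MB" % (shmall, shmall * PAGE_SIZE // 2**20))
    print("SHMMAX: %d MB per segment" % (shmmax // 2**20))
    print("SHMMNI: %d segments" % shmmni)

    # The EXTRA segment the cell failed to allocate, taken from the trace above
    requested_mb = 248545
    print("Requested EXTRA segment: %d MB" % requested_mb)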

     

    We tried to set

    "memory" : {
            "min_hard_limit" : 16384,
            "max_percentage" : 30
    }

    in bds-config.json, but it did not help.
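
    To see how much SysV shared memory the cell actually allocates after changing bds-config.json, we run a small sketch like the one below on a Data Node. Again, this is our own diagnostic, not BDS tooling; it just sums the segment sizes listed in /proc/sysvipc/shm, the same data that ipcs -m shows:

    def total_sysv_shm_mb(path="/proc/sysvipc/shm"):
        # Each data row describes one SysV segment; the "size" column is in bytes
        total_bytes = 0
        with open(path) as f:
            size_col = f.readline().split().index("size")
            for line in f:
                fields = line.split()
                if fields:
                    total_bytes += int(fields[size_col])
        return total_bytes // 2**20

    print("SysV shared memory currently allocated: %d MB" % total_sysv_shm_mb())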

    Is there any way to reduce BDS memory consumption?

    Version:
    Big Data SQL 4.0