• M
    mihairosu

    Okay awesome, I'll play around with stuff, see what works well.

    posted in User help
  • M
    mihairosu

    Okay thanks for everything, I can certainly handle it from here! If anything, I'll have to split it up, like you said, though I would have hoped to be able to get more out of this method. I hope it's more than I can get out of the watchlist page, haha.

    posted in User help
  • M
    mihairosu

    OK, I found the watch list for a test watch list XID, WL_883061:

    {
       "xid":"WL_883061",
       "data":null,
       "query":null,
       "readPermission":null,
       "name":"Troubleshooting",
       "dataPoints":[
          "boosterPumpsFlowRate",
          "DHWBsmntReturnRate"
       ],
       "folderIds":null,
       "params":null,
       "type":"static",
       "user":"admin",
       "editPermission":null
    }
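    For reference, the point XIDs an export would need are just the entries of the dataPoints array in that JSON. A quick sanity check in plain Node (the object below is copied from the response above, trimmed to the fields used; this is a sketch, not an official client):

    ```javascript
    // Watch-list JSON from above, trimmed to the fields we use here.
    const watchList = {
      xid: "WL_883061",
      name: "Troubleshooting",
      dataPoints: ["boosterPumpsFlowRate", "DHWBsmntReturnRate"],
      type: "static"
    };

    // The point XIDs a download would iterate over are the dataPoints entries.
    const pointXids = watchList.dataPoints;
    console.log(pointXids.join(", "));
    ```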
    

    The page appears to work fine:

    0_1583345621374_watch list download page.png

    Data begins approximately Feb 7, 2017, so I set a test download from Feb 7, 2017 to March 4, 2020 with a 1-day rollup interval:

    0_1583345750582_b2a128af-be91-4116-98e7-bbcda9e61382-image.png

    While it ran, I was getting "lost connectivity" and "restored" notifications.

    Anyway, after the server finishes whatever it is doing (with a high processor load the whole time), I get an error for the CSV download:

    0_1583348255792_de8facaa-05ec-4d97-b91a-3a88f54b3d25-image.png

    Any idea what the issue may be?

    I tried a smaller time period for the download and that had no problems.

    I assume it's some limit in some sub-system that I may have to adjust values for?

    edit: I think I see what the issue is. ma/log shows this error:

    ERROR 2020-03-04T12:36:28,716 (com.serotonin.m2m2.rt.maint.BackgroundProcessingImpl$RejectableWorkItemRunnable.run:632) - Error in work item
    java.lang.OutOfMemoryError: GC overhead limit exceeded
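    "GC overhead limit exceeded" means the JVM spent nearly all its time in garbage collection before exhausting the heap, so a multi-year export is simply too large for the configured heap size. A larger -Xmx is the usual remedy; where the flag is set depends on the install, and the path in the comment below is illustrative rather than a confirmed Mango default:

    ```shell
    # Illustrative sketch: give the Mango JVM more heap.
    # In some Mango installs the memory flags live in a script under
    # MA_HOME/bin/ext-enabled/ - check your own start scripts.
    JAVA_OPTS="$JAVA_OPTS -Xms1g -Xmx4g"
    ```

    Splitting the export into smaller time ranges, which the successful smaller test download already suggests, also keeps each request under the heap limit.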
    
    
    

    posted in User help
  • M
    mihairosu

    @mattfox

    What parts of this script should I be changing to get the points I am looking for?

    <ma-watch-list-get ng-model="designer.watchList" parameters="designer.parameters" on-points-change="designer.points = $points" id="29305bf3-cc2a-4bac-9650-3cbf64d738a9" watch-list-xid="WL_Flow"></ma-watch-list-get>
    <div class="ma-designer-root" id="1a2a6980-f5bb-4acc-a41a-f8f69d5c12d2" style="width: 1366px; height: 768px; position: relative;">
    
        <ma-calc input="designer.points | filter:{name:'Flow'}:true | maFirst" output="point"></ma-calc>
        
        <md-button id="2585b1fa-a630-4255-8670-41884b8754e2" style="position: absolute; left: 260px; top: 70px;" class="md-raised" ng-href="/rest/v2/point-values/time-period/{{point.xid}}?fields=TIMESTAMP&fields=VALUE&fields=ANNOTATION&from={{dateBar.from.toISOString()}}&to={{dateBar.to.toISOString()}}&format=csv2" download="{{point.name}}.csv">Download values</md-button>
    </div>
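    For what it's worth, only two spots in that markup select which points get fetched: the watch-list-xid attribute on ma-watch-list-get, and the name filter in ma-calc. A minimal sketch (the XID and point name below are placeholders, not values from this system):

    ```html
    <!-- Placeholders: substitute your own watch list XID and point name. -->
    <ma-watch-list-get ng-model="designer.watchList"
        on-points-change="designer.points = $points"
        watch-list-xid="WL_YourWatchList"></ma-watch-list-get>

    <!-- Picks the first fetched point whose name matches exactly. -->
    <ma-calc input="designer.points | filter:{name:'YourPointName'}:true | maFirst"
        output="point"></ma-calc>
    ```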
    

    I'll re-create the data source, because that's easy in BACnet, but I am still trying to figure out how to download the data easily.

    I don't fully understand what XID I need to match to something in my own system.

    There's no point XID in there, as far as I can tell.

    posted in User help
  • M
    mihairosu

    I am getting these errors when trying to import data point configuration for every point:

    Data point with XID 'boosterPumpsOutputPress' does not already exist and references a data source that does not exist. Ignored.
    

    I'm probably not following your instructions 100% somehow, I'll try to understand better.

    posted in User help
  • M
    mihairosu

    Okay cool I'll give that a try and report back. Thanks Matt.

    posted in User help
  • M
    mihairosu

    So right now I'm trying to back up the data itself.

    We've got 23 data points that have been collecting data over many years.

    I can't fit enough data into one watch list graph, even for a single point, to be able to download it all.

    And downloading 5000 values at a time across 23 data points isn't practical.

    Is there another reliable way to download all of this data?
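    One workable pattern, sketched below in plain Node, is to split the full history into fixed-size time windows and request one CSV per window per point, so no single request has to hold years of values in memory. The endpoint shape is taken from the download button earlier in this thread; the 90-day chunk size and the use of the boosterPumpsFlowRate XID are arbitrary choices for illustration:

    ```javascript
    // Split [fromMs, toMs) into windows of at most chunkMs milliseconds.
    function splitRange(fromMs, toMs, chunkMs) {
      const chunks = [];
      for (let start = fromMs; start < toMs; start += chunkMs) {
        chunks.push([start, Math.min(start + chunkMs, toMs)]);
      }
      return chunks;
    }

    const DAY = 24 * 60 * 60 * 1000;
    const from = Date.UTC(2017, 1, 7); // Feb 7, 2017
    const to = Date.UTC(2020, 2, 4);   // Mar 4, 2020

    // One URL per 90-day window; repeat for each of the 23 point XIDs.
    const urls = splitRange(from, to, 90 * DAY).map(([a, b]) =>
      `/rest/v2/point-values/time-period/boosterPumpsFlowRate` +
      `?from=${new Date(a).toISOString()}&to=${new Date(b).toISOString()}&format=csv2`
    );
    console.log(urls.length);
    ```

    Each resulting URL stays well within the limits the single big export hit, and the chunks can be concatenated afterwards.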

    posted in User help
  • M
    mihairosu

    Hey Terry,

    This one is also throwing a Java out-of-bounds exception, but I don't understand why:

    /*
    //Script by Phil Dunlap to automatically generate lost history
    if( my.time + 60000 < source.time ) { //We're at least a minute after
      var metaEditDwr = new com.serotonin.m2m2.meta.MetaEditDwr();
      metaEditDwr.generateMetaPointHistory(
          my.getDataPointWrapper().getId(), my.time+1, CONTEXT.getRuntime(), false);
      //Arguments are, dataPointId, long from, long to, boolean deleteExistingData
      //my.time+1 because first argument is inclusive, and we have value there
    }
    //Your regular script here.*/
    
    //Get daily run starts
    return runs.past(MINUTE, 1440)['data'].get(true).starts;
    

    posted in Scripting general Discussion
  • M
    mihairosu

    Hey Matt,

    Actually this is all on the same internal network, I only need to migrate the data source between the two devices. Network communication is not a problem.

    Thanks for the very helpful instructions; as soon as I get a chance, I'll see if I can work it out.

    posted in User help
  • M
    mihairosu

    Looks to me like this is something we would be interested in pursuing in case we want to change the data source:

    It would be possible to achieve what you want by terminating mango and performing some operations on the database to change the dataSourceId column of the data point to the new source's id. This is not recommended but is possible if you know what you are doing.

    With regards to this:

    There are some migration tools with the NoSQL module, assuming you use that for your point value store. Those tools coupled with the JSON Import/Export feature would likely work. The other option is to publish the data to the enterprise server, assuming you are ok with the point becoming a Persistent TCP type data point and it won't have to be attached to a BACnet data source.

    I may not have explained myself well, or I misunderstood your instructions; I want to make sure we're on the same page. We already publish this data point to the Enterprise server, so there's no need to move the actual data from the ES. I'll attach an image; that might explain it best:

    0_1574786287338_migrate bacnet.png

    Thanks

    posted in User help