How Do I Connect the Get Page / JSON to XML Operators to the Read XML Operator?

apmishra Member Posts: 3 Contributor I
edited June 2019 in Help
Hello,
I have what seems like a fairly simple problem, but I can't get around it.

I am reading a JSON feed from a remote URL using the "Get Page" operator. I need to convert this to an example set so I can score the output. The obvious path seemed to be converting the JSON output to XML and then transforming that into an example set with the Read XML operator.
I found that connecting the Read XML operator to the output of JSON to XML is not possible. I also tried Documents to Data, but that produces a single text attribute containing my entire document, so that did not work either.

I could write the output to a file and read it back, though this may not scale to many requests. Should I be writing the document to a database instead? I basically want to score the output "in situ" rather than storing it in files.

So how do I process URLs that produce XML or JSON within RapidMiner and RapidAnalytics?
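
To make the goal concrete, here is roughly the equivalent of what I want in plain Python (just a sketch for illustration; the feed URL is a placeholder for my real one):

import json
import urllib.request

FEED_URL = "https://example.com/feed.json"  # placeholder for my real feed

# Fetch the JSON feed and parse it directly, with no intermediate file.
with urllib.request.urlopen(FEED_URL) as response:
    payload = json.load(response)

# Each record in the feed should become one row of the example set.
records = payload if isinstance(payload, list) else [payload]
for record in records:
    print(record)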

Eagerly looking forward to your assistance.

Answers

  • apmishra Member Posts: 3 Contributor I
    Found the solution with the Open File operator. Basically, I abandoned the Document approach and went with reading URLs via the Open File operator, whose output is then read by other operators to generate the example set (Read CSV, Read XML, etc.).

    Though it would still be good to know how to convert a "doc" into an example set.
  • DeZepTup Member Posts: 1 Contributor I
    Hi,
    Get Page => JSON to XML seems to work fine in 5.2. Open File produces a "File", not a "Doc", so JSON to XML doesn't like it and throws an error.
    Could you please show how you converted the JSON to Excel?
    My data example:

    {
      "project": {"name": "test", "slug": "test"},
      "area1": {"cells": [
        {"field1": 1891363026, "field2": 52293, "field3": "TestSubject_1.1", "field4": 1617500, "field5": 1617500, "field6": 1, "field7": "VERY_LONG"},
        {"field1": 1891102016, "field2": 4611, "field3": "TestSubject_1.2", "field4": 250075, "field5": 290000, "field6": 1, "field7": "LONG"},
        {"field1": 1892882651, "field2": 774, "field3": "TestSubject_1.3", "field4": 96790, "field5": 119495, "field6": 1, "field7": "VERY_LONG"},
        {"field1": 1892198261, "field2": 8193, "field3": "TestSubject_1.4", "field4": 83062, "field5": 90000, "field6": 1, "field7": "VERY_LONG"}
      ]},
      "area2": {"cells": [
        {"field1": 1892897717, "field2": 67061, "field3": "TestSubject_2.1", "field4": 504899, "field5": 509999, "field6": 1, "field7": "VERY_LONG"},
        {"field1": 1892485359, "field2": 1205, "field3": "TestSubject_2.2", "field4": 10000, "field5": 12020, "field6": 2, "field7": "VERY_LONG"},
        {"field1": 1891501931, "field2": 12365, "field3": "TestSubject_2.3", "field4": 87500, "field5": 0, "field6": 20, "field7": "VERY_LONG"},
        {"field1": 1892105389, "field2": 39896, "field3": "TestSubject_2.4", "field4": 331125, "field5": 550000, "field6": 1, "field7": "VERY_LONG"}
      ]}
    }
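
    For reference, here is roughly how I expect this structure to flatten into rows, outside RapidMiner (a quick Python sketch, assuming the JSON above is saved as data.json):

    import json

    with open("data.json") as f:   # the sample shown above
        doc = json.load(f)

    rows = []
    for area in ("area1", "area2"):
        for cell in doc[area]["cells"]:
            row = dict(cell)       # field1 .. field7 become columns
            row["area"] = area     # remember which block the row came from
            rows.append(row)

    print(len(rows), "rows")       # 8 rows for the sample above
    print(sorted(rows[0]))         # area, field1 .. field7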
    Also, converting even 5.6 MB of data is "eating" 1.8 GB of RAM and sometimes freezing RapidMiner, which is pretty weird: it is not a big amount of data, 51,000 rows by 7 columns.
    What was your memory utilization while processing the JSON?
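
    For comparison, the cost of the parsing step alone can be checked with the standard library (a Python sketch; data.json again stands in for the real feed):

    import json
    import tracemalloc

    tracemalloc.start()
    with open("data.json") as f:
        doc = json.load(f)
    current, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()

    # Peak allocation while parsing, in MB.
    print(f"peak: {peak / (1024 * 1024):.1f} MB")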

    Best regards.