Python Forum

Full Version: Unable to get the data
Hi All,

I am trying to get a value named "TotalNumberOfProducts" from one of our APIs. The API is:
http://xx:8080/maintenance/job?jobType=IMPORT&resultLimit=30
I was able to get other parameters like recordsProcessed, jobId etc., and tried the code below to get the required data:
#totalProducts=responseurl['statistics'].get('totalProducts')[0]
#totalProducts=responseurl[1]

Neither way gives any result. Could you please assist, as I am new to this language?

import json
import requests

responseurl = requests.get('http://xx:8080/maintenance/job?jobType=IMPORT&resultLimit=30')
if responseurl.ok:
    jData = json.loads(responseurl.content)
    # print(jData)
    if jData:
        for responseurl in jData['response']:
            starttime = responseurl['statistics']['startTime']
            jobId = responseurl['jobId']
            status = responseurl['status']
            catalogId = responseurl['catalogId']
            recordsProcessed = responseurl['statistics']['recordsProcessed']
            recordsFailed = responseurl['statistics']['recordsFailed']
            fileName = responseurl['fileName']
            duration = responseurl['statistics']['duration']
            throughput = responseurl['statistics']['throughput']
API output

{u'response': [{u'status': u'COMPLETE', u'statistics': {u'estimatedTimeToFinish': u'0 seconds', u'workExpected': 1613, u'recordsFailed': 0, u'throughput': 322, u'percentComplete': 100, u'startTime': u'2019-05-22T11:55:01.642Z', u'duration': u'5 seconds', u'recordsProcessed': 1613, u'endTime': u'2019-05-22T11:55:07.126Z'}, u'expirationDatetime': u'2019-05-25T11:55:01.639Z', u'completedSteps': [{u'statistics': {u'workExpected': 1, u'recordsFailed': 0, u'throughput': 0, u'percentComplete': 100, u'startTime': u'2019-05-22T11:55:01.642Z', u'duration': u'0 seconds', u'recordsProcessed': 1, u'endTime': u'2019-05-22T11:55:01.653Z'}, u'workExpected': 1, u'catalogId': u'xx', u'header': {u'language': u'xx', u'totalProducts': 1332, u'country': u'xx', u'catalogId': u'xx', u'currency': u'xx', u'version': u'xx', u'partnerCode': u'xx'}, u'concreteType': u'CreateCatalogStep', u'workType': u'CREATE'}, {u'statistics': {u'workExpected': 256, u'recordsFailed': 0, u'throughput': 0, u'percentComplete': 100, u'startTime': u'2019-05-22T11:55:01.653Z', u'duration': u'0 seconds', u'recordsProcessed': 256, u'endTime': u'2019-05-22T11:55:01.757Z'}, u'workExpected': 256, u'fileName': u'/xx', u'catalogId': u'xx', u'concreteType': u'CategoryImportFromFileStep', u'workType': u'CATEGORY_IMPORT'}, {u'workExpected': 1, u'statistics': {u'workExpected': 1, u'recordsFailed': 0, u'throughput': 0, u'percentComplete': 100, u'startTime': u'2019-05-22T11:55:01.757Z', u'duration': u'0 seconds', u'recordsProcessed': 1, u'endTime': u'2019-05-22T11:55:01.781Z'}, u'concreteType': u'EvictCacheStep', u'workType': u'EVICT_CACHE', u'catalogId': u'xx'}, {u'statistics': {u'estimatedTimeToFinish': u'0 seconds', u'workExpected': 1332, u'recordsFailed': 0, u'throughput': 266, u'percentComplete': 100, u'startTime': u'2019-05-22T11:55:01.781Z', u'duration': u'5 seconds', u'recordsProcessed': 1332, u'endTime': u'2019-05-22T11:55:07.023Z'}, u'workExpected': 1332, u'bsinsRequiringSupplierSelection': [], u'catalogId': u'xx', 
u'startFrom': 0, u'merchantsProcessed': [u'xx'], u'fileName': u'xx', u'concreteType': u'ImportProductsFromFileStep', u'workType': u'IMPORT'}, {u'merchants': [u'xx'], u'statistics': {u'workExpected': 23, u'recordsFailed': 0, u'throughput': 0, u'percentComplete': 100, u'startTime': u'2019-05-22T11:55:07.023Z', u'duration': u'0 seconds', u'recordsProcessed': 23, u'endTime': u'2019-05-22T11:55:07.117Z'}, u'workExpected': 23, u'lastImportDateTime': u'2019-05-22T11:55:01.781Z', u'catalogId': u'xx', u'bsinsRequiringSupplierSelection': [], u'concreteType': u'ExpireMerchantProductsStep', u'workType': u'EXPIRE'}, {u'statistics': {u'workExpected': 0, u'recordsFailed': 0, u'throughput': 0, u'percentComplete': 0, u'startTime': u'2019-05-22T11:55:07.117Z', u'duration': u'0 seconds', u'recordsProcessed': 0, u'endTime': u'2019-05-22T11:55:07.125Z'}, u'workExpected': 0, u'catalogId': u'xx', u'concreteType': u'SelectPreferredSupplierStep', u'multiSupplierProducts': [], u'workType': u'SELECT_SUPPLIER'}], u'jobId': 7620, u'startFrom': 0, u'pendingSteps': [], u'fileName': u'/xx', u'catalogId': u'xx', u'jobType': u'IMPORT'},
If you look at the structure of that output, 'totalProducts' is in responseurl['completedSteps'][0]['header']['totalProducts'] (but not in any of the other items in that list).
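A minimal, self-contained sketch of that lookup, using a trimmed copy of the posted output (in the real script this dict comes from json.loads(responseurl.content)):

```python
# Trimmed sample mirroring the posted API output; only the first
# completed step carries a 'header' with 'totalProducts'.
jData = {
    'response': [{
        'jobId': 7620,
        'completedSteps': [
            {'header': {'totalProducts': 1332}, 'workType': 'CREATE'},
            {'workType': 'CATEGORY_IMPORT'},  # later steps have no 'header'
        ],
    }]
}

job = jData['response'][0]
# Direct path to the value, as described above.
totalProducts = job['completedSteps'][0]['header']['totalProducts']
print(totalProducts)  # 1332
```

If a step might lack the 'header' key, `job['completedSteps'][0].get('header', {}).get('totalProducts')` returns None instead of raising a KeyError.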
So I copied the API output you posted into Notepad++ and installed the JSTool plugin ("Plugins -> JSTool -> JSFormat") to format the output nicely, and it is clear the JSON output is incomplete, which is why it doesn't load.
Error:
json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
I am not sure whether you posted incomplete output or the response itself was incomplete, but it's hard to help practically without a complete sample.
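Worth noting: that exact error also appears whenever a printed Python dict is fed back into json.loads, because Python's repr uses single quotes (and the u prefixes), which are not valid JSON. A small demonstration:

```python
import json

data = {'status': 'COMPLETE', 'jobId': 7620}

# str(data) gives the Python repr with single-quoted keys -- not valid JSON,
# so json.loads raises the same "Expecting property name" error.
try:
    json.loads(str(data))
except json.JSONDecodeError as e:
    print('failed:', e)

# json.dumps() produces real JSON with double-quoted keys, which round-trips.
text = json.dumps(data)
print(json.loads(text) == data)  # True
```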
Use this code:
response = requests.get('http://xx:8080/maintenance/job?jobType=IMPORT&resultLimit=30')
if response.ok:
    data = response.json()
    with open('data.json', 'w') as f:
        json.dump(data, f, indent=4)
to save the JSON response to a file, nicely formatted, e.g. without the u prefix in front of the keys.
The solution given by ichabod801 has worked. Thanks.
responseurl['completedSteps'][0]['header']['totalProducts']
(May-22-2019, 02:35 PM)pythonFresher Wrote: the solution given by ichabod801 has worked.thanks
It's unclear whether totalProducts is present only in the first element or in some others as well (definitely not all). You would know better if you are familiar with the data.
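If you are not sure which steps carry it, a safer sketch is to scan every completed step and collect whatever totalProducts values exist (sample data trimmed from the posted output; the extra steps stand in for the ones without a header):

```python
# Trimmed sample: only some completed steps include a 'header'.
jData = {
    'response': [{
        'jobId': 7620,
        'completedSteps': [
            {'header': {'totalProducts': 1332}, 'workType': 'CREATE'},
            {'workType': 'CATEGORY_IMPORT'},
            {'workType': 'IMPORT'},
        ],
    }]
}

for job in jData['response']:
    # Chain .get() calls so steps without a 'header' are skipped
    # instead of raising a KeyError.
    totals = [
        step['header']['totalProducts']
        for step in job.get('completedSteps', [])
        if 'totalProducts' in step.get('header', {})
    ]
    print(job['jobId'], totals)  # 7620 [1332]
```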