Python Forum

Full Version: Help batch converting .json chosen file to MySQL
Hey there everyone! I am currently manually copying data fields from .json files on GitHub for Supreme Court of the United States opinions from the 1700s to the 2000s, from the Free Law Project.

GitHub repository here: https://github.com/brianwc/bulk_scotus

Example .json file

(SCOTUS Opinion - 1700 - 1754 - 84581.json)
GitHub link: https://github.com/brianwc/bulk_scotus/b...84581.json

The fields I am currently entering manually into a MySQL database are the following:

case_name VARCHAR(250)
federal_cite VARCHAR(100)
date_filed VARCHAR(100)
judge_name VARCHAR(100)
citation_count VARCHAR(100)
precedential_status VARCHAR(100)
download_url VARCHAR(200)
html_with_citations LONGTEXT
html LONGTEXT
html_lawbox LONGTEXT
I know this would take me 3+ months to do manually, and I know Python has the power to take every single one of my .json files and write the chosen fields either to a CSV and then into a MySQL table schema, or directly into a MySQL table schema, bypassing the CSV conversion entirely.

What would you do in my position?

Thank you in advance!

Best Regards,

Brandon Kastning
Clone the entire repository (button on the repository page),
or do a sparse clone, see: https://gist.github.com/hofnerb/a5ebf4e0c89caffc40ee
(Mar-14-2020, 04:07 PM)Larz60+ Wrote: Clone the entire repository (button on the repository page),
or do a sparse clone, see: https://gist.github.com/hofnerb/a5ebf4e0c89caffc40ee

Larz60,

I have the entire GitHub repository downloaded. There are 64,000+ .json files.