Mar-14-2020, 03:41 PM
Hey there everyone! I am currently copying data fields by hand from .json files on GitHub containing Supreme Court of the United States opinions from the 1700s to the 2000s, published by the Free Law Project.
GitHUB here: https://github.com/brianwc/bulk_scotus
Example .json file
(SCOTUS Opinion - 1700 - 1754 - 84581.json)
GitHUB Link: https://github.com/brianwc/bulk_scotus/b...84581.json
The fields I am currently entering by hand into a MySQL database are the following:
case_name VARCHAR(250)
federal_cite (100)
date_filed (100)
judge_name (100)
citation_count (100)
precedential_status (100)
download_url (200)
html_with_citations (longtext)
html (longtext)
html_lawbox (longtext)

Entering these by hand will take me 3+ months. Python, though, could read every one of my .json files and write the chosen fields to a CSV that I then load into a MySQL table, or insert the fields directly into a MySQL table and bypass the CSV step entirely.
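A minimal sketch of the JSON-to-CSV route, using only the standard library. The field names are taken from the column list above; the actual keys inside the bulk_scotus JSON files may be spelled differently (e.g. the citation field), so check one file and adjust `FIELDS` to match:

```python
import csv
import glob
import json

# Column names copied from the MySQL schema above; verify these against
# the real JSON keys in one of the bulk_scotus files before running.
FIELDS = [
    "case_name", "federal_cite", "date_filed", "judge_name",
    "citation_count", "precedential_status", "download_url",
    "html_with_citations", "html", "html_lawbox",
]

def extract_fields(record):
    """Pull the chosen fields out of one parsed JSON record.

    Missing keys become empty strings so every CSV row has 10 columns.
    """
    return [record.get(field, "") for field in FIELDS]

def json_dir_to_csv(pattern, csv_path):
    """Read every .json file matching the glob pattern and write one CSV row each."""
    with open(csv_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(FIELDS)  # header row matching the table columns
        for path in sorted(glob.glob(pattern)):
            with open(path, encoding="utf-8") as f:
                writer.writerow(extract_fields(json.load(f)))

# Example (hypothetical paths):
# json_dir_to_csv("bulk_scotus/**/*.json", "scotus_opinions.csv")
```

From there you can bulk-load the CSV with MySQL's `LOAD DATA INFILE`, or skip the CSV entirely by feeding the same `extract_fields` rows to a parameterized `INSERT` via `cursor.executemany` from the mysql-connector-python package.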
What would you do in my position?
Thank you in advance!
Best Regards,
Brandon Kastning