Sep-10-2020, 08:44 PM
Hi, I have a RADIUS server which stores 400MB of text files, so I have millions of log entries, and I am wondering if it is possible to filter the data by hour, date, etc. In the log file, every line is an entry, and I would like to be able to run searches on the file and find specific information according to my needs. What I am getting in the file is something like this:
IP,user,date,hour,device,etc,etc,etc,etc,etc
IP,user,date,hour,device,etc,etc,etc,etc,etc
IP,user,date,hour,device,etc,etc,etc,etc,etc
IP,user,date,hour,device,etc,etc,etc,etc,etc
IP,user,date,hour,device,etc,etc,etc,etc,etc
IP,user,date,hour,device,etc,etc,etc,etc,etc
I think the main task for achieving what I need is indexing the data line by line (probably using a for loop and loading it into a dataframe) and then writing the code needed to filter on the columns I want to use as reference, something like the sketch below.
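To make this more concrete, here is a minimal sketch of what I have in mind, assuming pandas is available; the file name radius.log, the column names, and the filter values are just placeholders for illustration:

import pandas as pd

# Column names are placeholders matching the layout shown above.
columns = ["ip", "user", "date", "hour", "device",
           "etc1", "etc2", "etc3", "etc4", "etc5"]

# Read the 400MB file in chunks so it never has to fit in memory all at
# once, keeping only the rows that match the filter (here: a date and user).
matches = []
for chunk in pd.read_csv("radius.log", names=columns, chunksize=100_000):
    matches.append(chunk[(chunk["date"] == "2020-09-10")
                         & (chunk["user"] == "someuser")])

result = pd.concat(matches, ignore_index=True)
print(result)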
This thread is just to get, if possible, a general view of how you would approach this scenario. Do you think it is possible to achieve this using only Python? Thank you very much in advance.