Posts: 41
Threads: 20
Joined: Apr 2023
May-05-2023, 05:18 PM
(This post was last modified: May-05-2023, 05:18 PM by SuchUmami.)
Hi.
I have a piece of code that gathers a lot of information. My problem is that I only want to use one piece of that data. It loops through currency pairs and collects a lot of info, but I'm having trouble isolating one piece of it for further processing.
Is it possible to do?
Posts: 6,778
Threads: 20
Joined: Feb 2020
May-05-2023, 05:29 PM
(This post was last modified: May-05-2023, 05:29 PM by deanhystad.)
Yes.
If you'd like more detail, please provide more detail about your problem. What kind of information are you sifting through? How do you identify the information you want to retrieve? What details of the information do you want to retain?
Example code would be a good start, even a small, not completely working example.
Posts: 41
Threads: 20
Joined: Apr 2023
(May-05-2023, 05:29 PM)deanhystad Wrote: Yes.
If you'd like more detail, please provide more detail about your problem. What kind of information are you sifting through? How do you identify the information you want to retrieve? What details of the information do you want to retain?
Example code, a small, not completely working example, would be a good start.
Hi, thanks for your response. I didn't include the code because it might be a bit much. Basically, I am cycling through currency pairs and time frames to extract different moving averages of those currency pairs. What I want to do is use the information provided to give each currency pair a score, to help me understand which might be better trades. I've managed to get exactly the data I am looking for, but I'm having difficulty figuring out how to score that data.
Here is the code:
import krakenex
import pandas as pd
api = krakenex.API()
# Define currency pairs and their corresponding names
currency_pairs = [
    ('XETHXXBT', 'ETH/BTC'), ('XXBTZEUR', 'BTC/EUR'), ('XXBTZUSD', 'BTC/USD'),
    ('XXBTZGBP', 'BTC/GBP'), ('XETHZEUR', 'ETH/EUR'), ('XETHZGBP', 'ETH/GBP'),
    ('XETHZUSD', 'ETH/USD'), ('EURGBP', 'EURGBP'), ('ZEURZUSD', 'EUR/USD'),
    ('ZGBPZUSD', 'GBP/USD'), ('PAXGXBT', 'PAXG/BTC'), ('PAXGETH', 'PAXG/ETH'),
    ('PAXGEUR', 'PAXG/EUR'), ('PAXGUSD', 'PAXG/USD'), ('XXRPXXBT', 'XRP/BTC'),
    ('XRPETH', 'XRP/ETH'), ('XXRPZEUR', 'XRP/EUR'), ('XRPGBP', 'XRP/GBP'),
    ('XXRPZUSD', 'XRP/USD'),
]
time_frames = ['1', '5', '15', '30', '60', '240', '1440', '10080']
for pair, pair_name in currency_pairs:
    for tf in time_frames:
        print(f"{tf} minute m/a ({pair_name})")
        # Fetch OHLC data for the given pair and time frame
        ohlc_data = api.query_public('OHLC', {'pair': pair, 'interval': tf})
        # Convert the data to a pandas dataframe
        df = pd.DataFrame(ohlc_data['result'][pair], columns=['time', 'open', 'high', 'low', 'close', 'vwap', 'volume', 'count'])
        df['time'] = pd.to_datetime(df['time'], unit='s')
        df.set_index('time', inplace=True)
        # Calculate the moving averages
        last_20_candles = df.iloc[-21:-1]
        last_20_close_prices = last_20_candles['close'].astype(float).tolist()
        avg_last_20_close_price = sum(last_20_close_prices) / len(last_20_close_prices)
        print(f"20 period m/a: {avg_last_20_close_price}")
        last_50_candles = df.iloc[-51:-1]
        last_50_close_prices = last_50_candles['close'].astype(float).tolist()
        avg_last_50_close_price = sum(last_50_close_prices) / len(last_50_close_prices)
        print(f"50 period m/a: {avg_last_50_close_price}")
        last_200_candles = df.iloc[-201:-1]
        last_200_close_prices = last_200_candles['close'].astype(float).tolist()
        avg_last_200_close_price = sum(last_200_close_prices) / len(last_200_close_prices)
        print(f"200 period m/a: {avg_last_200_close_price}")
        print("\n")
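As an aside, the manual slicing above can also be expressed with pandas' built-in rolling mean. A minimal sketch on made-up close prices (the series here is a placeholder, not Kraken data):

```python
import pandas as pd

# Placeholder close prices standing in for one pair/time frame
closes = pd.Series([float(x) for x in range(1, 26)])

# Manual average of the last 20 closed candles, excluding the current one,
# exactly as in the loop above
manual = closes.iloc[-21:-1].mean()

# The same value via pandas' rolling mean, read one candle back
rolling = closes.rolling(window=20).mean().iloc[-2]

print(manual, rolling)  # both 14.5
```

With `rolling`, changing the window size is a one-character edit rather than three new lines of slicing.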
Posts: 6,778
Threads: 20
Joined: Feb 2020
How is this different from your other thread?
https://python-forum.io/thread-39914.html
And don't your two posts here contradict each other? You know how to isolate the information you are looking for. What you don't know is how to interpret that information. Is that correct?
Posts: 41
Threads: 20
Joined: Apr 2023
I didn't get any help in the other thread; I thought that was because I had worded it badly or something.
Quote:And don't your two posts here contradict each other? You know how to isolate the information you are looking for. What you don't know is how to interpret that information. Is that correct?
No. If you run the code, you will see that a lot of information is being extracted, but I'd like this information to be processed further. For instance, I'd like all the BTC/USD time frames to be taken together so I can rate them. The piece of code I wrote works very well for getting the information, but I'm finding it much harder to process it further.
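One common pattern for this is to collect the results into a dictionary keyed by pair and time frame instead of only printing them. A rough sketch, where `get_averages` is a hypothetical stand-in for the fetch-and-average code in the loop:

```python
# Hypothetical stand-in for the fetch-and-average code; the real version
# would call api.query_public('OHLC', ...) and compute the moving averages
def get_averages(pair, tf):
    return {'ma20': 1.0, 'ma50': 2.0, 'ma200': 3.0}  # placeholder values

currency_pairs = [('XXBTZUSD', 'BTC/USD'), ('XETHZUSD', 'ETH/USD')]
time_frames = ['15', '60']

results = {}
for pair, pair_name in currency_pairs:
    for tf in time_frames:
        results[(pair_name, tf)] = get_averages(pair, tf)

# All BTC/USD time frames can now be taken together for rating
btc_usd = {tf: mas for (name, tf), mas in results.items() if name == 'BTC/USD'}
print(sorted(btc_usd))  # ['15', '60']
```

Once the data is in `results`, a separate scoring step can pull out any slice of it without re-running the API loop.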
Posts: 41
Threads: 20
Joined: Apr 2023
I'm not asking for someone to do this for me, but I'd like to know if it's possible and to be pointed in the general direction of how to do it.
Posts: 6,778
Threads: 20
Joined: Feb 2020
If you can provide a good description of what you want to do, you will get more responses. I know that can be difficult, but when I look at your post I have no idea where you are in this process. Is this a python question (1)? Is it a question about finding or using a package for doing some particular processing (2)? Do you just have a bunch of data and are looking for guidance about what to do with it (3)? You will get a lot of responses for type 1 and 2 questions, but not so much for type 3 unless you are lucky enough to run across someone who's done or seen something similar.
So what are you thinking of doing with last_20, last_50 and last_200?
Posts: 41
Threads: 20
Joined: Apr 2023
May-06-2023, 10:27 AM
(This post was last modified: May-06-2023, 10:27 AM by SuchUmami.)
(May-05-2023, 08:34 PM)deanhystad Wrote: If you can provide a good description of what you want to do, you will get more responses. I know that can be difficult, but when I look at your post I have no idea where you are in this process. Is this a python question (1)? Is it a question about finding or using a package for doing some particular processing (2)? Do you just have a bunch of data and are looking for guidance about what to do with it (3)? You will get a lot of responses for type 1 and 2 questions, but not so much for type 3 unless you are lucky enough to run across someone who's done or seen something similar.
So what are you thinking of doing with last_20, last_50 and last_200?
Thank you for your patience and for explaining why people are having difficulty helping me. As you can probably see, I am quite new to coding.
I think my problem is closer to option (1) that you described. Basically, my code cycles through currency pairs and time frames and finds the moving averages (last_20, last_50 and last_200) of each pair on each time frame. What I would like to do is write a new module that uses the data from this code to rate it in terms of profitability. I definitely could do this if I broke the code up to only look at one currency pair on one time frame, but I thought this code would make it a lot easier and more condensed for me (as there are 133 different modules this code would break down into).
The trouble I am having is that now that I want to write this new module to rate the data as individual pieces (for instance, rate the BTC/USD moving averages on the 15 minute time frame), I am not sure how to isolate these pieces of data in order to rate them.
I am thinking of defining each moving average more clearly, so that each time the code loops through, the correct definition will come up. So just like how I've written out the currency pairs and time frames in a sort of list (not sure of the programming term), I will do that with the moving averages, and because I have now defined them, I will be able to draw upon that data in my new module. Is that how I should go about accomplishing it?
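That listing idea can work: putting the window lengths in a list lets one loop compute and label each moving average. A minimal sketch on placeholder data (the series here is made up, not Kraken output):

```python
import pandas as pd

# Placeholder close prices; the real values would come from the Kraken loop
closes = pd.Series([float(x) for x in range(1, 301)])

windows = [20, 50, 200]
averages = {}
for w in windows:
    # Average of the last w closed candles, excluding the current one
    averages[f'ma{w}'] = closes.iloc[-(w + 1):-1].mean()

print(averages)
```

Adding a fourth moving average then means adding one number to `windows`, not another block of code.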
Again, I really appreciate the help.
Posts: 7,312
Threads: 123
Joined: Sep 2016
May-06-2023, 05:53 PM
(This post was last modified: May-06-2023, 05:53 PM by snippsat.)
(May-06-2023, 10:27 AM)SuchUmami Wrote: I definitely could do this if I break the code up to only look for one currency pair on one timeframe but I thought this code will make it a lot easier and condensed for me (as there's 133 different modules this code would break down into).
The trouble I am having is that now that I want to write this new module to rate the data as individual pieces (for instance, rate btc/usd moving averages on the 15 minute time frame), that I am not sure how to isolate these pieces of data in order to rate them.
You should break it down to the basics: the code you have written could be one function, and the new stuff could go in another function.
It's much harder to add to and test things out when you have written code that does a lot of stuff in one big loop.
So, as an example:
# kr4.py
import krakenex
import pandas as pd
# create a Kraken API client
api = krakenex.API()
# set the currency pair and interval
pair = 'BTC/USD'
interval = 15
# make the API request for OHLC data
ohlc_data = api.query_public('OHLC', {'pair': pair, 'interval': interval})
# Convert the data to a pandas dataframe
df = pd.DataFrame(ohlc_data['result'][pair], columns=['time', 'open', 'high', 'low', 'close', 'vwap', 'volume', 'count'])
df['time'] = pd.to_datetime(df['time'], unit='s')
df.set_index('time', inplace=True)
# Calculate the moving averages
last_20_candles = df.iloc[-21:-1]
last_20_close_prices = last_20_candles['close'].astype(float).tolist()
avg_last_20_close_price = sum(last_20_close_prices) / len(last_20_close_prices)
print(f"20 period m/a: {avg_last_20_close_price}")
What I often do now is run this code interactively when testing; if you don't have an editor that does this, you can use the -i flag.
(dl_env) G:\div_code\dl_env
λ python -i kr4.py
20 period m/a: 28792.299999999996
# The original DataFrame
>>> df
open high low close vwap volume count
time
2023-04-29 06:00:00 29383.4 29405.9 29383.3 29383.3 29387.6 8.66207226 136
2023-04-29 06:15:00 29383.4 29394.9 29341.9 29342.1 29381.4 14.18612946 172
2023-04-29 06:30:00 29342.0 29353.2 29341.2 29342.3 29345.5 2.62617868 104
2023-04-29 06:45:00 29342.3 29360.4 29322.0 29343.5 29337.8 13.15491601 182
2023-04-29 07:00:00 29343.5 29390.4 29343.4 29369.7 29372.0 8.08884765 155
... ... ... ... ... ... ... ...
2023-05-06 16:45:00 28710.1 28767.1 28710.0 28760.1 28743.2 7.60574838 257
2023-05-06 17:00:00 28760.1 28760.1 28733.0 28733.1 28742.0 10.26238430 210
2023-05-06 17:15:00 28733.1 28890.0 28733.0 28878.4 28819.0 79.55429626 573
2023-05-06 17:30:00 28875.2 28883.2 28789.2 28803.8 28817.6 16.42856836 238
2023-05-06 17:45:00 28803.8 28810.9 28800.0 28807.4 28803.7 0.22000144 52
[720 rows x 7 columns]
>>> last_20_close_prices
[29141.2, 29104.5, 29069.1, 28982.2, 28979.2, 28740.1, 28670.1, 28662.8, 28647.7, 28590.1, 28626.6, 28666.5, 28630.3, 28710.0, 28740.1, 28710.1, 28760.1, 28733.1, 28878.4, 28803.8]
>>> last_20_candles
open high low close vwap volume count
time
2023-05-06 12:45:00 29109.2 29142.5 29051.6 29141.2 29084.2 39.59896757 326
2023-05-06 13:00:00 29139.6 29139.6 29046.2 29104.5 29083.6 141.26224316 665
2023-05-06 13:15:00 29104.5 29104.5 29015.0 29069.1 29047.3 196.75223375 881
2023-05-06 13:30:00 29062.1 29062.1 28949.6 28982.2 28983.4 112.89161354 632
2023-05-06 13:45:00 28982.0 29043.0 28919.6 28979.2 28979.1 40.68199757 451
2023-05-06 14:00:00 28976.2 28976.2 28602.0 28740.1 28709.3 303.95728877 2030
2023-05-06 14:15:00 28730.7 28733.6 28628.1 28670.1 28673.5 79.46272297 617
2023-05-06 14:30:00 28669.4 28734.1 28610.0 28662.8 28656.8 92.00957394 729
2023-05-06 14:45:00 28656.4 28693.8 28629.6 28647.7 28658.5 16.64881712 319
2023-05-06 15:00:00 28643.2 28643.2 28338.0 28590.1 28469.0 233.08117238 1223
2023-05-06 15:15:00 28590.1 28640.6 28540.6 28626.6 28583.3 54.44542700 442
2023-05-06 15:30:00 28629.4 28675.0 28590.0 28666.5 28629.3 12.56857437 373
2023-05-06 15:45:00 28664.4 28670.5 28629.0 28630.3 28650.6 29.61014138 361
2023-05-06 16:00:00 28630.3 28712.0 28597.7 28710.0 28651.1 28.06378914 340
2023-05-06 16:15:00 28710.1 28740.1 28661.7 28740.1 28698.2 34.03577445 354
2023-05-06 16:30:00 28740.1 28740.1 28700.0 28710.1 28714.5 34.78157198 351
2023-05-06 16:45:00 28710.1 28767.1 28710.0 28760.1 28743.2 7.60574838 257
2023-05-06 17:00:00 28760.1 28760.1 28733.0 28733.1 28742.0 10.26238430 210
2023-05-06 17:15:00 28733.1 28890.0 28733.0 28878.4 28819.0 79.55429626 573
2023-05-06 17:30:00 28875.2 28883.2 28789.2 28803.8 28817.6 16.42856836 238
As you can see, you can look at all the data outside of the loop, which makes it much easier to start testing things out.
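Following that advice, here is a rough sketch of the two-function split. `fetch_ohlc` is a hypothetical placeholder so the sketch runs offline (the real one would call `api.query_public('OHLC', ...)`), and the rating rule is only an illustration, not a trading recommendation:

```python
import pandas as pd

def fetch_ohlc(pair, interval):
    # Placeholder data; the real function would query the Kraken API
    return pd.DataFrame({'close': [float(x) for x in range(1, 60)]})

def moving_average(df, window):
    # Average of the last `window` closed candles, excluding the current one
    return df['close'].iloc[-(window + 1):-1].mean()

def rate_pair(pair, interval):
    df = fetch_ohlc(pair, interval)
    ma20 = moving_average(df, 20)
    ma50 = moving_average(df, 50)
    # Illustrative rule only: short MA above long MA counts as positive
    return 1 if ma20 > ma50 else -1

print(rate_pair('XXBTZUSD', 15))  # 1 with this placeholder data
```

Because fetching and rating are separate functions, each can be tested on its own before being dropped back into the big pair/time-frame loop.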
Posts: 41
Threads: 20
Joined: Apr 2023
May-07-2023, 01:44 PM
(This post was last modified: May-07-2023, 01:46 PM by SuchUmami.)
In future, I will not put too many functions in my code.
Thank you.