Calculate next rows based on previous values of array
#1
Hello everyone,
I am trying to calculate the values of the next row of an array from the values of the previous row. The calculation works like this:
1. I have two arrays like this:
[Image: screenshot-01.png]

2. I call each row of this array a depth; for example, the first row (row = 0) is called depth = 0 and is written as:
[Image: screenshot-02.png]

At this depth, there is 1 group and 16 rows (a small sketch of this group layout is shown right after this list).

3. For the next depth (depth = 1), it is constructed as:
[Image: screenshot-03.png]

This depth has 2 groups and 8 rows. Its values are calculated from the previous depth (depth = 0) with the equation:
[Image: screenshot-04.png]

4. At depth = 2, it becomes:
[Image: screenshot-05.png]

Same as before, the values of this depth are calculated from the previous depth (depth = 1) using:
[Image: screenshot-06.png]

This continues until depth = 4, but I am not showing how to calculate the remaining depths because it would take too many figures.
(If the rest of the process is needed, I will update my post with the calculation up to depth = 4.)
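
To make the group layout concrete, here is a small sketch that only prints how many groups and rows each depth has for N = 16; it does not do any calculation, the actual equations are the ones in the screenshots above:
import numpy as np
N = 16
n = int(np.log2(N))

for depth in range(n + 1):
    node = 2 ** (n - depth)      # rows per group at this depth
    node_group = 2 ** depth      # number of groups at this depth
    print(f"depth = {depth}: {node_group} group(s) of {node} row(s)")
This prints 1 group of 16 at depth = 0, 2 groups of 8 at depth = 1, and so on down to 16 groups of 1 at depth = 4.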

Right now I am stuck on how to reference the values of the previous row in this calculation. Here is the code I have worked out so far:

Method 1: L and B are sliced into ranges based on the node group
import numpy as np
N = 16
n = int(np.log2(N))
L0 = np.arange(N)
B0 = np.arange(N)[::-1]
L = np.zeros((n + 1, N))
B = L.copy()

L[0, :] = L0  # insert the values for L0
B[0, :] = B0  # insert the values for B0

for depth in range(n + 1):
    node = 2 ** (n - depth)      # rows per group at this depth
    node_group = 2 ** depth      # number of groups at this depth
    for i in range(node_group):
        Ln = L[depth, node * i : node * i + node]   # slice out group i of the current depth
        l1, l2 = Ln[::2], Ln[1::2]                  # split the group into even- and odd-indexed rows
        Bn = B[depth, node * i : node * i + node - 1]
        # the next depth (depth + 1) should be filled in here from Ln and Bn,
        # but this is where I am stuck on referring back to the previous values
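To show where I think the previous depth would feed into the next one with Method 1's slicing, here is a sketch. The pairwise sum/difference I use as the update rule is only a placeholder I made up, not the actual equation from the screenshots, and only L is shown (B would follow the same pattern):
import numpy as np
N = 16
n = int(np.log2(N))
L = np.zeros((n + 1, N))
L[0, :] = np.arange(N)

for depth in range(n):  # stop before the last depth so depth + 1 stays in bounds
    node = 2 ** (n - depth)      # rows per group at the current depth
    node_group = 2 ** depth      # number of groups at the current depth
    for i in range(node_group):
        Ln = L[depth, node * i : node * i + node]  # one group of the current depth
        l1, l2 = Ln[::2], Ln[1::2]
        # placeholder update (NOT the real equation): each group splits into two
        # half-size groups at the next depth, one from sums and one from differences
        L[depth + 1, node * i : node * i + node // 2] = l1 + l2
        L[depth + 1, node * i + node // 2 : node * i + node] = l1 - l2
print(L)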
Method 2: L and B are reshaped based on the depth, node group, and node
import numpy as np
N = 16
n = int(np.log2(N))
L0 = np.arange(N)
B0 = np.arange(N)[::-1]
L = np.zeros((n + 1, N))
B = L.copy()

L[0, :] = L0  # insert the values for L0
B[0, :] = B0  # insert the values for B0

for depth in range(n + 1):
    node = 2 ** (n - depth)      # rows per group at this depth
    node_group = 2 ** depth      # number of groups at this depth
    Ln = np.reshape(L[depth], (node_group, node))  # one row per group
    Bn = np.reshape(B[depth], (node_group, node))
    print(Ln)
(I don't know which method is best; I am showing both to get opinions on them.)
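
The same placeholder idea written with Method 2's reshape (again, the sum/difference update is made up just to show the indexing, not the real equation):
import numpy as np
N = 16
n = int(np.log2(N))
L = np.zeros((n + 1, N))
L[0, :] = np.arange(N)

for depth in range(n):
    node = 2 ** (n - depth)
    node_group = 2 ** depth
    Ln = np.reshape(L[depth], (node_group, node))  # one row per group of the current depth
    l1, l2 = Ln[:, ::2], Ln[:, 1::2]               # even- and odd-indexed rows within each group
    # placeholder update (NOT the real equation): sums then differences per group
    next_depth = np.hstack([l1 + l2, l1 - l2])     # shape (node_group, node)
    L[depth + 1] = next_depth.reshape(N)
print(L)
As far as I can tell, both sketches fill L the same way, so the difference between them is mainly readability.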

Please give me some advice on this, and forgive me if I am not posting in the correct category. Even though it uses NumPy, I think this is only array manipulation, so I did not post it in Data Science.
