how to make iterative search more efficient
#1
Hello,

I am trying to solve an optimization problem. Due to the complexity of the objective function, it is not convenient to use solver packages, so I tried to solve the problem with a brute-force algorithm.

I have a list f which includes n elements. Each element can take 10 different values, so there are 10^n possible combinations of f. I would like to find the optimal f among these 10^n candidates.

I wrote the following lines to do the task. First, with itertools.product, I generate a candidate list, then check whether it is feasible or not. If the candidate f is feasible, I calculate its revenue. If that revenue is greater than the previous optimum, the code stores the new optimal f and revenue pair.

The code worked well when n was small (like n=10 or n=12). However, now I have to work with lists of 120 elements, so there are 10^120 possible combinations of f. Iterating 10^120 times would take an unacceptably long time (I guess weeks), which makes this approach unusable. Is there any way to make it more efficient and converge to a solution more quickly?
I am not an expert in coding but use it occasionally for my research, so I would really appreciate any help.

import itertools

value_list = range(10)
f_possible = 120
rev_opt = float('-inf')  # best revenue found so far
fob = None               # best f found so far

# generates all possible lists 'f' of f_possible elements; each element can
# take 10 different values, hence 10**f_possible different candidates
my_iter = itertools.product(value_list, repeat=f_possible)
for f in my_iter:
    if check_feas(f) == 1:  # checks feasibility of the candidate f
        rev = cal_rev(f)    # calculates revenue if f is feasible
        if rev > rev_opt:
            rev_opt = rev
            fob = f
#2
In the last 10 billion years, which is roughly twice the age of the Earth, there were only about 3e26 nanoseconds. That means iterating 1e120 times may take longer than you think.
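Back-of-the-envelope, in Python (just to show the scale; the exact constants don't matter):

SECONDS_PER_YEAR = 3.15e7                        # roughly 365 days
ns_in_10_billion_years = 1e10 * SECONDS_PER_YEAR * 1e9
print(f"{ns_in_10_billion_years:.1e}")           # about 3e+26 nanoseconds

# even at one candidate per nanosecond, 1e120 iterations would need roughly
# 3e93 times those 10 billion years
print(f"{1e120 / ns_in_10_billion_years:.1e}")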
#3
Without knowing more about check_feas() and cal_rev(), it's difficult to provide adequate and specific advice.

Off-hand, you could split the search space into smaller chunks and hand them to multiple worker processes in parallel (threads won't help much here, since the work is CPU-bound and limited by the GIL); though that can be challenging and there are other potential issues. A rough sketch is shown below.
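Something along these lines might work, splitting on the value of the first element so that each worker process exhausts one tenth of the space. This is only an untested sketch: check_feas and cal_rev here are toy placeholders standing in for your real functions, and N is kept small for demonstration; no amount of parallelism makes exhaustive search of 10^120 candidates feasible.

import itertools
from multiprocessing import Pool

VALUE_LIST = range(10)
N = 6  # small for demonstration; the real problem has n = 120

def check_feas(f):   # placeholder feasibility test
    return sum(f) <= 20

def cal_rev(f):      # placeholder revenue function
    return sum(v * v for v in f)

def best_in_chunk(first_value):
    """Best feasible candidate whose first element equals first_value."""
    best_rev, best_f = float('-inf'), None
    for rest in itertools.product(VALUE_LIST, repeat=N - 1):
        f = (first_value,) + rest
        if check_feas(f):
            rev = cal_rev(f)
            if rev > best_rev:
                best_rev, best_f = rev, f
    return best_rev, best_f

if __name__ == '__main__':
    with Pool() as pool:
        results = pool.map(best_in_chunk, VALUE_LIST)
    rev_opt, fob = max(results, key=lambda pair: pair[0])
    print(rev_opt, fob)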

I'm thinking that check_feas could use some optimization to speed it up. Again, more information would be needed.

Have you tried collections.Counter? I speculate that the order of the numbers is insignificant, in which case you would only need to know how many of each value are present; a collections.Counter captures exactly that and could speed things up, as in the sketch below.
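If that speculation holds (a big if, since we don't know what check_feas and cal_rev actually do), the search could run over multisets instead of ordered tuples, shrinking the space from 10^n to C(n+9, 9) candidates, roughly 2e13 for n = 120 instead of 1e120. An untested sketch under that assumption, again with toy placeholder functions:

import itertools
from collections import Counter

VALUE_LIST = range(10)
N = 6  # small for demonstration; the real problem has n = 120

def check_feas(counts):  # placeholder: e.g. allow at most two 9s
    return counts[9] <= 2

def cal_rev(counts):     # placeholder revenue from value counts
    return sum(value * count for value, count in counts.items())

rev_opt, fob = float('-inf'), None
for combo in itertools.combinations_with_replacement(VALUE_LIST, N):
    counts = Counter(combo)  # how many of each value this candidate uses
    if check_feas(counts):
        rev = cal_rev(counts)
        if rev > rev_opt:
            rev_opt, fob = rev, counts

print(rev_opt, fob)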