2016-12-14, 22:20  #12  
Sep 2003
5·11·47 Posts 
Quote:
I have 7127, 7621 and 8291 with B1=100,000,000,000, and most of the others to B1=10,000,000,000 or better (the range above M17,000 still has a few more days to complete). I haven't manually reported them yet because I will do stage 2 later. For these small exponent ranges, stage 2 is best done with GMP-ECM, and I have already done some. For instance, I already did 10399 and others with B1=10,000,000,000 and B2=505,851,529,607,770, with no factors found. This was done in the cloud with a bit more than 32 GB of memory. Unfortunately, negative GMP-ECM results can't be reported to PrimeNet; I think it's necessary to email George or something.

Even though I'm working on these myself, I'm somewhat pessimistic. I think Bob Silverman was of the opinion that it was more useful to throw more ECM at them instead of taking P−1 to very high levels. Recall that some factors are basically invisible to P−1 (the ones that happen to have nonsmooth k). I'm finding a bunch of second, third and higher factors, but any exponents that still have no known factors have already been tested with ECM to a pretty large depth without success.

I'm not sure if 16 GB is enough. I'm generally using between 32 and 64 GB on cloud instances. I am looking forward to the availability of the recently announced AWS Batch, which will allow running batch jobs in the cloud: you just specify the number of cores and the amount of memory, and the job gets scheduled for you without you having to manage the virtual machine instances yourself. 
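To illustrate the "nonsmooth k" point above: a prime factor q of M_p always has the form q = 2kp + 1, and P−1 can only reveal q when q − 1 is B1-smooth apart from at most one prime between B1 and B2 (caught in stage 2). A minimal Python sketch of that condition, with my own (hypothetical) helper names and a simplified model that ignores prime-power subtleties:

```python
def prime_factors(n):
    """Prime factors of n with multiplicity, by trial division (ascending)."""
    factors = []
    f = 2
    while f * f <= n:
        while n % f == 0:
            factors.append(f)
            n //= f
        f += 1
    if n > 1:
        factors.append(n)
    return factors

def pm1_would_find(q, B1, B2):
    """Simplified model: P-1 with bounds (B1, B2) reveals the factor q
    if every prime factor of q - 1 is <= B1, except possibly the
    largest, which may be <= B2 (the stage 2 prime)."""
    fs = prime_factors(q - 1)
    return fs[-1] <= B2 and all(f <= B1 for f in fs[:-1])

# M23 = 47 * 178481, and 178480 = 2^4 * 5 * 23 * 97, so finding the
# factor 178481 needs stage 2 to reach the prime 97:
print(pm1_would_find(178481, 50, 100))  # True  (97 caught in stage 2)
print(pm1_would_find(178481, 50, 50))   # False (97 out of reach)
```

If k itself has a large prime factor, that prime sits inside q − 1 and no practical B1/B2 will cover it — which is why ECM, whose group order varies per curve, can still succeed where P−1 cannot.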

2016-12-15, 03:31  #13  
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
3×7×227 Posts 
Quote:
P.S. Does GMP-ECM make a BIG difference for stage 2 for exponents of this size? 

2016-12-15, 05:19  #14 
Romulan Interpreter
Jun 2011
Thailand
10011000110100_{2} Posts 
Can you KEEP the stage 1 files? Is there a server available where we can collect them? That would make it much easier to extend the bounds later, avoiding a lot of duplicated effort. I have many of those from past efforts, but looking at the numbers in your table, I am far behind, so mine can't really be used.

2016-12-15, 07:34  #15 
Dec 2012
2×139 Posts 
If anyone would like the save files for the 25 numbers that I worked on which petrw1 listed, I can PM you. It will not save you much though, maybe 35 GHz-days in all.

2016-12-15, 09:23  #16  
Sep 2003
5031_{8} Posts 
Quote:
Note, however, that if you have the standard NumBackupFiles=3 in your prime.txt, then mprime will also retain the .bu and .bu2 files (frankly, this seems like a bug), and these can be huge, presumably because they reflect the high memory usage during stage 2. To avoid this, I always set NumBackupFiles=1 in any working directory where I'm doing P−1 testing. At the moment the files aren't set up for public retrieval, but it would be possible to do so by creating a static website on Amazon S3 for that purpose. 

2016-12-15, 11:00  #17  
Sep 2003
5×11×47 Posts 
Quote:
For example, James Hintz this past September did a P−1 test of M1619 (no known factors) using B1=3,790,662,020,300 and B2=303,252,961,624,000 (about 3.0 × 10^{14}). Using GMP-ECM for stage 2 instead of mprime, you could easily surpass this B2 value in fairly short order, and if you threw a really large amount of memory at it, say up to 64 GB, you could even take it to B2 = 10^{17} in a few hours. It might be worth contacting him, if anyone knows how, to see if he still has the save files. Nevertheless, the odds of finding a factor with P−1 are extremely small, given the amount of ECM testing that has already been done (to the t=60 level). The same principle applies to ECM testing: it's best to use GMP-ECM for stage 2. 
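As a quick sanity check on those quoted bounds (my own arithmetic, not from the post): the B2 value is exactly 80 times B1, suggesting stage 2 was run at a fixed multiple of B1 rather than at an independently chosen limit.

```python
# P-1 bounds for M1619 as quoted in the post above
B1 = 3_790_662_020_300
B2 = 303_252_961_624_000

ratio, remainder = divmod(B2, B1)
print(ratio, remainder)  # 80 0 -- B2 is exactly 80 * B1
```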

2016-12-15, 22:18  #18  
"GIMFS"
Sep 2002
Oeiras, Portugal
2·3^{2}·83 Posts 
Quote:
I'm still running ECM curves on small exponents, and I will continue doing so throughout next year. I have already managed to find a first factor for a 15K exponent and another for a 20K exponent, and several others for exponents under 100K, and my goal for next year is to find a first factor for a sub-10K exponent. It's probably too ambitious, as I have only an i5-750 with 16 GB of DDR3-1600 memory dedicated to the job, but hey, I've been with GIMPS for nearly 15 years, have done lots of DC, P−1 and some first-time LL in the past, not to mention large amounts of TF on GPUs, and now it's time to do something different. After all, the most important thing is to have fun, isn't it? 

2016-12-15, 22:44  #19 
1976 Toyota Corona years forever!
"Wayne"
Nov 2006
Saskatchewan, Canada
1001010011111_{2} Posts 
How much is "... a lot more memory ..."? I have as little as 4 GB on some of the PCs I want to run this on.

2016-12-15, 22:56  #20  
Nov 2008
1F5_{16} Posts 
Quote:
I'm testing 4007, and with B1=2.9B I need to use maxmem=14000 to keep it from crashing when it tries to use more than 16 GB. 

2016-12-16, 02:12  #21 
Sep 2003
101000011001_{2} Posts 
Yes, but in doing so you lose some of the benefit of using GMP-ECM.
Let's say you want to do an exponent to B2 = 100,000 but you only have enough memory to do B2 = 1000 (I'm using ridiculously low values in order to simplify the example). No problem, you say, I'll just run GMP-ECM 100 times, specifying a different range for B2 each time: 0–1000, 1000–2000, 2000–3000, ..., 99000–100000, and that will take 100 times as long as doing only B2 = 1000.

However, because GMP-ECM uses an O(sqrt(N)) algorithm for stage 2, if you had enough memory to do B2 = 100,000 in one shot instead of in a hundred separate runs, then that would only take 10 times as long as doing B2 = 1000. So in that scenario, the low-memory machine either takes ten times longer than the higher-memory machine, or has to use a B2 limit that is ten times smaller.

Last fiddled with by GP2 on 2016-12-16 at 02:28 
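The arithmetic above can be sketched with a toy cost model (my own illustration, not GMP-ECM's actual cost function) in which the cost of one stage 2 run grows with the square root of the B2 span it covers:

```python
import math

def stage2_cost(span, unit=1000):
    """Toy model: relative cost of one stage 2 run covering a B2 range
    of width `span`, in units of a single run covering `unit`."""
    return math.sqrt(span / unit)

one_shot = stage2_cost(100_000)                       # one run to B2 = 100,000
chunked = sum(stage2_cost(1000) for _ in range(100))  # 100 runs of width 1000
print(one_shot, chunked)  # 10.0 100.0
```

The one-shot run costs 10 units while the hundred chunked runs cost 100, reproducing the factor-of-ten gap described in the post.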
2016-12-16, 02:24  #22  
Sep 2003
5·11·47 Posts 
Quote:
From memory and off the top of my head: if you are running GMP-ECM and you increase B2 by 10 times while leaving everything else the same, both the memory usage and the run time increase by about 2.5 to 3 times. 
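That rule of thumb is roughly consistent with the O(sqrt(N)) stage 2 scaling mentioned a few posts up (my own back-of-the-envelope check, not a measurement):

```python
import math

# If stage 2 cost scales like sqrt(B2), then a 10x larger B2 should
# cost about sqrt(10) times as much:
print(math.sqrt(10))  # ~3.16, in line with the observed 2.5-3x
```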
