Opened 10 years ago
Closed 10 years ago
#728 closed defect (fixed)
'Cannot allocate memory' situation for large decays
Reported by: | Juergen Reuter | Owned by: | kilian |
---|---|---|---|
Priority: | P1 | Milestone: | v2.2.7 |
Component: | core | Version: | 2.2.6 |
Severity: | major | Keywords: | |
Cc: |
Description
Lukas Mitzka reported that, for a SARAH-generated model file of an extended SUSY model, the C system library could not allocate memory during event generation. This is possibly due to an overflow of a matrix element counter; cf. below.
Attachments (1)
Change History (5)
comment:1 Changed 10 years ago by
comment:3 Changed 10 years ago by
@tho: thanks for the debugging info.
This is not a 2->32 process; it is the state matrix for a decay chain.
First, the decay algorithm accumulates all possible branches and thus lets the state matrix grow and grow. However, a size of 69632 by itself should not pose a problem, at most an inefficiency.
But then the pairing_proto array is allocated with the number of matrix elements squared. Crash.
It should be possible to avoid an array of quadratic size. I'll try.
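For scale, a rough back-of-the-envelope check (Python; assumes 8 bytes per cache entry, and is not actual WHIZARD code) shows why an allocation quadratic in the number of matrix elements has to fail here:

```python
# Hypothetical size estimate for an n_me x n_me pairing cache.
n_me = 69632                 # state-matrix size reported in the ticket

n_pairs = n_me ** 2          # entries in the squared-size cache
print(n_pairs)               # 4848615424

# This count alone already overflows a 32-bit signed integer counter:
print(n_pairs > 2**31 - 1)   # True

# At an assumed 8 bytes per entry, the allocation would need ~36 GiB:
print(n_pairs * 8 / 2**30)   # 36.125
```

So even before the allocator gives up with 'Cannot allocate memory', the entry count itself exceeds the range of a default 32-bit integer, consistent with the suspected counter overflow.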
comment:4 Changed 10 years ago by
Resolution: | → fixed |
---|---|
Status: | new → closed |
Should be fixed in r7009. Simplified the caching for the pairing algorithm in evaluator_init_qn_sum.
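The commit itself is not shown here, but a common way to avoid a quadratic cache is to store only the groupings actually encountered, keyed by their quantum numbers, instead of preallocating an n-by-n array. A minimal Python sketch of that idea (names and interface are hypothetical, not WHIZARD's):

```python
def sum_quantum_numbers(matrix_elements, qn_key):
    """Group matrix-element indices that share identical quantum numbers.

    Memory grows with the number of distinct groups, not with n**2:
    each element is hashed into its group instead of being paired
    against every other element.
    """
    groups = {}  # quantum-number key -> list of matrix-element indices
    for i, me in enumerate(matrix_elements):
        key = qn_key(me)  # hashable summary of the quantum numbers
        groups.setdefault(key, []).append(i)
    return list(groups.values())

# Hypothetical usage: elements 0 and 2 carry the same quantum numbers.
print(sum_quantum_numbers(["a", "b", "a", "c"], lambda qn: qn))
# [[0, 2], [1], [3]]
```

This is linear in the number of matrix elements, which is the kind of behavior the state matrix of a long decay chain needs.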
In qft.nw, the following code lines:
are screwed up: the variable
N_ME_OLD
gets the value 69632, which leads to a crash. Hence, get_n_matrix_elements()
doesn't deliver the correct values. Debug output: