# [ot][spam][random][crazy][random][crazy]

Undescribed Horrific Abuse, One Victim & Survivor of Many gmkarl at gmail.com
Tue Nov 15 02:18:53 PST 2022

```
NOTE: I recall I was thinking the max_freq should slide up to higher
than 0.5 when there is a min_freq higher than 1/n (and no max_freq
specified).

notes

- the test code indexes by floor to produce the signal
- the comparison code transforms the signal out of time sequence, then
back in at a different rate

considering transformation:
- the freq2time transformation produces a matrix where columns are
time, and rows are frequencies
- inverting this matrix and multiplying time data by it, produces
frequency data, where each offset in the vector is a frequency and the
vector contains magnitude
- multiplying the frequency vector by the matrix recovers the time data
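
# The round trip described above can be sketched as follows (my
# reconstruction with a plain complex-exponential basis and numpy's 1/n
# scaling, not the code under debug):
import numpy as np
n = 8
freqs = np.fft.fftfreq(n)                    # one frequency per row
offsets = np.arange(n)                       # one time offset per column
mat = np.exp(2j * np.pi * np.outer(freqs, offsets)) / n   # freq2time matrix
time_data = np.random.random(n)
freq_data = time_data @ np.linalg.pinv(mat)  # time -> frequency via the inverse
recovered = (freq_data @ mat).real           # frequency -> time recovers the data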

I've set np.random.seed(0) so I can reproduce the same data.

198         # extract a repeating wave
199         longvec = np.empty(len(randvec)*100)
200 B       short_count = np.random.random() * 4 + 1
201         short_duration = len(longvec) / short_count
202  ->     for idx in range(len(longvec)):
203             longvec[idx] = randvec[int((idx / len(longvec) * short_duration) % len(randvec))]
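
# The loop above can be vectorized like this (my equivalent sketch,
# reusing randvec/longvec/short_duration from the excerpt; note the index
# advances by short_duration/len(longvec) ~= 0.231 per output sample, so
# each randvec entry is held for about 4.3 consecutive samples):
import numpy as np
idx = np.arange(len(longvec))
positions = (idx / len(longvec) * short_duration) % len(randvec)
longvec = randvec[positions.astype(int)]   # astype(int) floors nonnegative values, like int()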

(Pdb) p short_count
4.330479382191752
(Pdb) p short_duration
369.47410639563054
(Pdb) p len(longvec)
1600
(Pdb) p len(randvec)
16

The data is stretched to a very low frequency (16 samples -> 369
samples), so it should be easy to extract.

(Pdb) p ((np.array([0,1,2])/len(longvec) * short_duration) % len(randvec))
array([0.        , 0.23092132, 0.46184263])
(Pdb) p [int(x) for x in ((np.array([0,1,2])/len(longvec) * short_duration) % len(randvec))]
[0, 0, 0]
(Pdb) p randvec[0]
0.5488135039273248

54                 min_freq = 1 / repetition_time
55             elif repetition_samples:
56  ->             min_freq = freq_sample_rate / repetition_samples
57             else:
58                 min_freq = freq_sample_rate / freq_count

(Pdb) p repetition_samples
369.47410639563054
(Pdb) p freq_sample_rate
1
(Pdb) p freq_sample_rate / repetition_samples
0.0027065496138698455
(Pdb) p freq_count
30

freq_count is 30 because the code is making 16 non-negative frequencies
(including the 0 and 0.5 bins), so there would be 14 negative ones to
fill out the full length. min_freq will be 0.0027 . I am guessing that
there is a bug showing here, but I think it will be clearer to see the
impact once I step to where the data is used.
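
# Cross-checking that count against numpy's conventions (my check): the
# 16 non-negative bins and the full 30 bins differ by 14 because the
# +0.5 and -0.5 Nyquist bins coincide:
import numpy as np
nonneg = np.fft.rfftfreq(30)   # 16 values: 0, 1/30, ..., 15/30 == 0.5
full = np.fft.fftfreq(30)      # 30 values, negatives in the back half
# len(full) - len(nonneg) == 14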

58                 min_freq = freq_sample_rate / freq_count
59         min_freq /= sample_rate
60  ->     if freq_count % 2 == 0:
61             if max_freq is None:
(Pdb) p sample_rate
1
(Pdb) p min_freq
0.0027065496138698455

I am thinking part of a logical error could relate to the two sample
rates being passed as the same value while actually meaning quite
different things. Making clear definitions of the meanings of the two
sample rates, and reviewing the fftfreq function to enforce those
meanings, could seem like a good next step.
106         offsets = np.arange(time_count)
107         #mat = np.exp(2j * np.pi * np.outer(freqs, offsets))
108  ->     mat = wavelet(np.outer(freqs, offsets))
109         return mat / len(freqs) # scaled to match numpy convention

(Pdb) p freqs
array([0.        , 0.00270655, 0.03822751, 0.07374847, 0.10926943,
0.14479039, 0.18031135, 0.21583231, 0.25135327, 0.28687424,
0.3223952 , 0.35791616, 0.39343712, 0.42895808, 0.46447904,
0.5       ])
(Pdb) p offsets
array([   0,    1,    2, ..., 1597, 1598, 1599])

(Pdb) p np.outer(freqs,offsets)[:,:2]
array([[0.        , 0.        ],
[0.        , 0.00270655],
[0.        , 0.03822751],
[0.        , 0.07374847],
[0.        , 0.10926943],
[0.        , 0.14479039],
[0.        , 0.18031135],
[0.        , 0.21583231],
[0.        , 0.25135327],
[0.        , 0.28687424],
[0.        , 0.3223952 ],
[0.        , 0.35791616],
[0.        , 0.39343712],
[0.        , 0.42895808],
[0.        , 0.46447904],
[0.        , 0.5       ]])

(Pdb) p wavelet(np.outer(freqs,offsets)[:,:2])
array([[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1., -1.],
[-1.,  1.]])
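
# The all-(-1) first column and the single +1 at phase 0.5 above are
# consistent with a square wavelet that is -1 on [0, 0.5) and +1 on
# [0.5, 1) of each cycle (my guess from the printed values, not the
# confirmed definition of wavelet()):
import numpy as np
def square_wavelet(x):
    return np.where((np.asarray(x) % 1) < 0.5, -1.0, 1.0)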
```