Extremely small data points introduced in EDF recordings after filtering

MNE version: 1.8.0
OS: Windows 10

Hello all,
I use mne to process edf recordings with code as below and it creates extremely small datapoints in my data such as:

Before filtering: smallest positive: 1.6479743648449245e-07, smallest negative: -1.6479743648415023e-07, max: 0.09253376058594608
After filtering: smallest positive: 8.470329472543003e-22, smallest negative: -1.6940658945086007e-21, max: 0.49086862614052496

The data are in microvolt units with a scaling factor of 1.0.
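For reference, the statistics above were computed roughly like this (a sketch; my actual script differs slightly, and the variable names are just illustrative):

import numpy as np

data = raw.get_data(picks="eeg")  # (n_channels, n_samples) array of the EEG channels
pos = data[data > 0]
neg = data[data < 0]
print("Smallest positive:", pos.min())
print("Smallest negative:", neg.max())  # negative value with the smallest magnitude
print("Max:", data.max())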
I have tried removing either the band-pass or the notch filter, using an FIR filter instead of the IIR default, and filtering only some EEG channels of interest instead of all of them, all with the same results (see the sketches after the code below).
The code I used is as follows:

import mne

# Load the EDF recording into memory so it can be filtered in place
raw = mne.io.read_raw_edf(file_name, preload=True)
# Band-pass the EEG channels between 0.5 and 60 Hz
raw.filter(l_freq=0.5, h_freq=60, picks="eeg")
# Notch filter at 50 Hz to remove power-line interference
raw.notch_filter(freqs=50, picks="all")

Since I later use machine learning on this dataset, long strips (hundreds of consecutive samples) of extremely small data points interleaved with normal-range values make further normalization, scaling, or machine learning impossible. The data are also noisy and certainly need filtering.
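
For context, this is roughly how I check for those stretches of near-zero samples; the 1e-15 cutoff is arbitrary:

import numpy as np

data = raw.get_data(picks="eeg")  # filtered EEG as a 2D array
# Flag samples that are non-zero but implausibly small in magnitude
tiny = (np.abs(data) > 0) & (np.abs(data) < 1e-15)
print("Near-zero samples per channel:", tiny.sum(axis=1))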

Any help or ideas would be appreciated.