The function store_in_shm writes a numpy array into shared memory, while the second function read_from_shm creates a numpy array backed by the same shared memory block and returns it.
However, running the code in Python 3.8 produces the following segmentation fault:
zsh: segmentation fault python foo.py
Why is there no problem accessing the numpy array inside read_from_shm, but a segmentation fault when accessing the same array again outside the function?
Output:
From read_from_shm(): [0 1 2 3 4 5 6 7 8 9]
zsh: segmentation fault python foo.py
/Users/athena/opt/anaconda3/envs/test/lib/python3.8/multiprocessing/resource_tracker.py:203: UserWarning: resource_tracker: There appear to be 1 leaked shared_memory objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '
foo.py
import numpy as np
from multiprocessing import shared_memory

def store_in_shm(data):
    shm = shared_memory.SharedMemory(name='foo', create=True, size=data.nbytes)
    shmData = np.ndarray(data.shape, dtype=data.dtype, buffer=shm.buf)
    shmData[:] = data[:]
    shm.close()
    return shm

def read_from_shm(shape, dtype):
    shm = shared_memory.SharedMemory(name='foo', create=False)
    shmData = np.ndarray(shape, dtype, buffer=shm.buf)
    print('From read_from_shm():', shmData)
    return shmData

if __name__ == '__main__':
    data = np.arange(10)
    shm = store_in_shm(data)
    shmData = read_from_shm(data.shape, data.dtype)
    print('From __main__:', shmData)  # no seg fault if we comment this line
    shm.unlink()
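The likely cause: the SharedMemory object created inside read_from_shm is a local variable, so it is garbage-collected when the function returns; its finalizer calls close(), which unmaps the memory the returned numpy array still points to. Inside the function the mapping is alive, so the print works; outside it, the array references unmapped memory and the process crashes. Below is a minimal sketch of one way to avoid this, assuming the caller takes responsibility for close()/unlink(): read_from_shm returns the SharedMemory object alongside the array so a reference keeps the mapping alive, and store_in_shm no longer calls close() before returning.

```python
import numpy as np
from multiprocessing import shared_memory

def store_in_shm(data):
    shm = shared_memory.SharedMemory(name='foo', create=True, size=data.nbytes)
    shmData = np.ndarray(data.shape, dtype=data.dtype, buffer=shm.buf)
    shmData[:] = data[:]
    # Do not close() here; the caller closes and unlinks when done.
    return shm

def read_from_shm(shape, dtype):
    shm = shared_memory.SharedMemory(name='foo', create=False)
    shmData = np.ndarray(shape, dtype, buffer=shm.buf)
    # Return shm as well: if it were only a local, garbage collection
    # would close() it and unmap the buffer shmData points to.
    return shm, shmData

if __name__ == '__main__':
    data = np.arange(10)
    shm1 = store_in_shm(data)
    shm2, shmData = read_from_shm(data.shape, data.dtype)
    print('From __main__:', shmData)  # no segfault: shm2 keeps the mapping alive
    shm2.close()
    shm1.close()
    shm1.unlink()
```

An alternative with the same effect is to attach the SharedMemory object to the array (e.g. as an attribute of a small wrapper) so their lifetimes are tied together; the essential point is that some live reference to the SharedMemory object must outlast every array built on its buffer.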