First of all, let me say that I find NiceGUI to be among the best projects currently around, and the feedback from the developers and the community has been fantastic so far.
My question is about loading large datasets stored in .hdf files. Normally I use the h5py module to load the datasets into numpy arrays. If the file is on the local disk this operation is fairly fast, but things get slower when the file is stored on a network share (NFS).
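For context, this is the kind of synchronous read I mean (a minimal sketch; the dataset name 'data' and the file path are just placeholders):

import h5py

def read_dataset(filePath):
    # Blocks until the entire dataset has been copied into memory;
    # over NFS this can take a long time for large files.
    with h5py.File(filePath, 'r') as f:
        return f['data'][:]  # h5py reads the dataset into a numpy array

arr = read_dataset('test.hdf')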
I have been using the following code to read the file in a non-blocking way, but the behaviour is a bit unpredictable: sometimes it blocks (freezing the GUI, which then crashes and reloads once the read completes), sometimes it doesn't.
from asgiref.sync import sync_to_async
from h5py import File as getHDF5
from nicegui import app, ui
import time

# Function that actually loads the data
def _dictFromHDF(filePath, sleepTime):
    outputDict = {}
    with getHDF5(filePath, 'r') as f:
        outputDict['data'] = f['data'][:]
    time.sleep(sleepTime)  # artificial delay to emulate a slower read
    return outputDict

def main(args):
    path = args[0]
    delay = args[1]
    data = _dictFromHDF(path, delay)
    return data

# Called from the event loop that ui.run() starts
async def load_data(delay):
    t0 = time.time()
    path = 'test.hdf'
    args = [path, delay]
    loadedData = await sync_to_async(main)(args)
    ui.notify("Done in {:.2f} s".format(time.time() - t0))
    return loadedData

# UI (just an example)
delay = ui.number("Sleep [s]", value=0)
ui.button("Load data", on_click=lambda: load_data(delay.value))
ui.run()
With a larger artificial sleep delay the script works fine and never blocks the main loop, so the GUI stays responsive. However, when the actual read time of the HDF file is long, things don't work well. From here there are obviously a few questions. Am I doing something completely wrong, adapting the synchronous logic in an improper way? Is this a bug of some sort, and should the code be written a bit differently?
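For comparison, here is a plain-asyncio variant of load_data that, as far as I understand, should behave the same way: run_in_executor with the default thread pool instead of asgiref (whose sync_to_async, as far as I can tell, defaults to thread_sensitive=True). Same placeholder file name as above:

import asyncio

async def load_data_executor(delay):
    t0 = time.time()
    loop = asyncio.get_running_loop()
    # None selects the default ThreadPoolExecutor, so the synchronous
    # read runs in a worker thread instead of the event loop thread
    loadedData = await loop.run_in_executor(None, _dictFromHDF, 'test.hdf', delay)
    ui.notify("Done in {:.2f} s".format(time.time() - t0))
    return loadedData

If there is a more idiomatic way to offload such I/O-bound work in NiceGUI, I would be happy to switch to it.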
Thanks!