Stopping a function from taking 100% of RAM

I am making an app that encrypts files with PyCryptodome. Here is the code that does the encryption:

from Crypto.Cipher import AES
from Crypto.Protocol.KDF import PBKDF2
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

class Cipher:
    def __init__(self, password: str, key: bytes = None):
        # derive a 256-bit AES key from the password; the optional `key`
        # argument is used as the PBKDF2 salt (password_as_key is a helper
        # defined elsewhere in the app)
        self.key = PBKDF2(password, key if key else password_as_key(password), dkLen=32)
        self.cipher = AES.new(self.key, AES.MODE_CBC, iv=get_random_bytes(16))

    def encrypt(self, data: bytes) -> bytes:
        # pad to the block size, then append the IV so decrypt() can find it
        return self.cipher.encrypt(pad(data, AES.block_size)) + self.cipher.iv

    def decrypt(self, data: bytes) -> bytes:
        # the last 16 bytes are the IV that encrypt() appended
        encrypted_data = data[:-16]
        iv = data[-16:]
        self.cipher = AES.new(self.key, AES.MODE_CBC, iv=iv)
        return unpad(self.cipher.decrypt(encrypted_data), AES.block_size)
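
For reference, I call it roughly like this (the path and salt here are just placeholders), which reads the entire file into RAM in one go:

cipher = Cipher("my password", key=b"placeholder-salt")
with open("large_file.bin", "rb") as f:
    # f.read() loads the whole file into memory before encrypt() even runs
    encrypted = cipher.encrypt(f.read())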

While this class works great with most files, files that are too large (I tested with this file: https://github.com/szalony9szymek/large) crash the program with a MemoryError. I want to make sure that even with files that big, the program has some sort of defense so the app doesn't crash (and hopefully shows a loading screen instead, resuming encryption when more memory can be allocated).

How can I implement such a thing in this Cipher class?
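
The only idea I have so far is to stop passing the whole file through encrypt() and instead process it in fixed-size chunks, roughly like the sketch below. This is untested: encrypt_file, decrypt_file, and the 64 KiB chunk size are placeholder names of mine, and unlike my class it writes the IV at the start of the output so decryption can stream too:

from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

CHUNK_SIZE = 64 * 1024  # placeholder size; must be a multiple of AES.block_size

def encrypt_file(key: bytes, in_path: str, out_path: str) -> None:
    iv = get_random_bytes(16)
    cipher = AES.new(key, AES.MODE_CBC, iv=iv)
    with open(in_path, "rb") as src, open(out_path, "wb") as dst:
        dst.write(iv)  # IV first, so decryption can stream as well
        while True:
            chunk = src.read(CHUNK_SIZE)
            if len(chunk) < CHUNK_SIZE:
                # a short (or empty) read means end of file: pad and finish
                dst.write(cipher.encrypt(pad(chunk, AES.block_size)))
                break
            dst.write(cipher.encrypt(chunk))

def decrypt_file(key: bytes, in_path: str, out_path: str) -> None:
    with open(in_path, "rb") as src, open(out_path, "wb") as dst:
        iv = src.read(16)
        cipher = AES.new(key, AES.MODE_CBC, iv=iv)
        chunk = src.read(CHUNK_SIZE)
        while chunk:
            next_chunk = src.read(CHUNK_SIZE)
            plain = cipher.decrypt(chunk)
            if not next_chunk:
                # the final chunk carries the PKCS#7 padding
                plain = unpad(plain, AES.block_size)
            dst.write(plain)
            chunk = next_chunk

If I understand CBC correctly, this should keep memory usage bounded to roughly one chunk regardless of file size, but I'm not sure it's the right approach.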

Some extra data:

- OS: Windows 11 Home
- RAM: 8.00 GB (7.68 GB usable)
- Processor: 12th Gen Intel(R) Core(TM) i5-1235U @ 1.30 GHz
