I have been working on a project that involves multiple files and have been having some issues with imports. I have read about 20 articles and stack overflow questions, but none seems to explain this behavior.
If I have a folder system that looks like this:
\package
    \folder_a
        __init__.py
        b.py
    __init__.py
    a.py
with a simple hello function in b.py. In a.py I have from package.folder_a.b import hello. When I run a.py, I get the error "ModuleNotFoundError: No module named 'package'".
Why does this happen? When a.py is run, isn't package the top level directory?
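(For anyone who wants to see what Python actually puts on the search path, here is a small self-contained sketch. It writes a throwaway script, hypothetically named show_root.py, into a temp directory and prints sys.path[0] from inside it.)

```python
import os
import subprocess
import sys
import tempfile

# Build a throwaway script in a temp directory and run it, to show that
# sys.path[0] is the directory containing the script itself, not any
# parent directory such as a surrounding 'package' folder.
root = tempfile.mkdtemp()
script = os.path.join(root, "show_root.py")  # hypothetical demo script
with open(script, "w") as f:
    f.write("import sys\nprint(sys.path[0])\n")

out = subprocess.run([sys.executable, script],
                     capture_output=True, text=True).stdout.strip()
# 'out' is the directory holding the script, so anything above it
# (like a 'package' folder) is invisible to absolute imports.
print(out)
```

Since sys.path[0] is the script's own directory, package never appears on the search path when a.py is run directly.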
Now, I have found a fix for this by simply changing the code in a.py to from folder_a.b import hello, but this brings another set of issues with it. Let's say that I am working with a slightly more complicated file structure (which I am), such as:
\package
    \folder_a
        __init__.py
        b.py
    \folder_b
        __init__.py
        c.py
    __init__.py
    a.py
Here I want to import hello into c.py. BOTH from package.folder_a.b import hello AND from folder_a.b import hello result in the same error: "ModuleNotFoundError: No module named 'package'" (or 'folder_a', respectively). I presume this is because those folders are not added to the module search path when the file is executed (importing c into a when running a.py allows the import to work). Is there any way to force \package to be the top-level package that does not involve running a script from a file in its own directory? I hope this makes sense. Thanks.
Python puts the directory of the script being executed at the front of the module search path (sys.path) and treats it as the top-level directory. As an example, when executing c.py in folder_b, /package/folder_b is the search root, not /package. To tackle your situation, you can use relative imports. Here's how you can adjust your imports in a.py and c.py:
file a.py -> from .folder_a.b import hello
file c.py -> from ..folder_a.b import hello
One caveat: relative imports only work when the file is executed as part of the package, e.g. python -m package.a or python -m package.folder_b.c from the directory that contains package. Run that way, Python resolves the module paths correctly no matter where the file sits in the package tree; running the file directly (python a.py) will instead fail with "attempted relative import with no known parent package".
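Here is a self-contained sketch that rebuilds your layout in a throwaway temp directory and contrasts the two ways of running a.py (it assumes a.py uses the relative import from above):

```python
import os
import subprocess
import sys
import tempfile

# Recreate the question's layout in a temp directory.
root = tempfile.mkdtemp()
folder_a = os.path.join(root, "package", "folder_a")
os.makedirs(folder_a)
open(os.path.join(root, "package", "__init__.py"), "w").close()
open(os.path.join(folder_a, "__init__.py"), "w").close()
with open(os.path.join(folder_a, "b.py"), "w") as f:
    f.write("def hello():\n    return 'hello'\n")
with open(os.path.join(root, "package", "a.py"), "w") as f:
    f.write("from .folder_a.b import hello\nprint(hello())\n")

# Run a.py as a module from the project root: the relative import works.
as_module = subprocess.run([sys.executable, "-m", "package.a"],
                           cwd=root, capture_output=True, text=True)
print(as_module.stdout.strip())  # -> hello

# Run the same file directly as a script: the relative import fails,
# because a.py has no parent package in that mode.
as_script = subprocess.run(
    [sys.executable, os.path.join(root, "package", "a.py")],
    capture_output=True, text=True)
print(as_script.returncode)  # non-zero
```

Another common workaround, if you cannot use python -m, is to insert the directory that contains package into sys.path at the top of your entry script before any package imports, though -m is generally the cleaner option.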