Why is it bad to manage Python imports like this?


I've personally noticed that when writing new Python files, a big time waster at the beginning is writing out imports for things that I will use in every single Python project I ever make: things like os, numpy, math, time, random, etc.

Whilst I could simply copy/paste the imports into each new file, a much better solution I've found is to have a custom package named "imports" on my PYTHONPATH with an __init__.py that reads import os, import numpy as np, etc. for all my commonly used packages.
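For concreteness, a minimal sketch of what I mean (the exact module list is just an illustration; I put in whatever I always use):

```python
# imports/__init__.py
# Every module imported here becomes an attribute of the "imports"
# package, so `from imports import *` re-exports all of these names.
import os
import math
import time
import random
import numpy as np
```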

Then, in every new Python script I create, I simply type from imports import * and I instantly have access to all the base packages that I frequently use. Another benefit of doing this is that it gives me one central location to manage the packages I'm using. If I want to use a new base package in many files, rather than adding an import to each file, I can just add it to my custom imports package's __init__.py file.
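A script using it then needs only the one line at the top (again, a sketch under the same assumptions as above):

```python
# any_script.py
from imports import *  # pulls in os, np, etc. from imports/__init__.py

print(os.getcwd())         # os works without its own import statement
print(np.arange(5).sum())  # and so does the np alias
```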

But it doesn't end there; I've taken this idea even further. I have a different imports file for specific project types. For example, if I'm doing Machine Learning, I'll ALWAYS want imports like from keras.layers import Dense. So rather than doing that in every single .py file related to my ML projects, I include all these imports in a custom package named import_ml with an __init__.py that imports all these keras layers, of which there are a hundred or so.
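So the ML variant looks something like this (showing only a handful of the hundred-or-so layers I actually list):

```python
# import_ml/__init__.py
# Re-export the keras layers I use constantly, so every ML script
# can do `from import_ml import *` and use them directly.
from keras.layers import Dense, Conv2D, MaxPooling2D, Dropout, Flatten
```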

Again, you might say "why not just import these layers directly in the project file as needed?" The reason is that I want to just type Conv2D and have the function there from the get-go. I don't want to discover I've forgotten to import something, import it, restart the Python shell to pick up the new import, and then repeat this several times because I forgot import statements I always use with keras.

I've noticed another hidden benefit to doing things this way. Because I can simply focus on using functions such as Conv2D rather than worrying about the imports, if my import paths change I don't need to manually go to every single machine-learning-related Python file and change the import namespaces; instead I can just go to the single, central import_ml package's __init__.py and update things there one time only. An example of this that occurred recently: instead of using keras.layers.convolutional.Conv2D for my Conv2D, I wanted to move to tf.keras.layers.Conv2D, which behaves identically given my inputs but lives in a different and more recent package.

If I did things "the Python way" then I'd have to manually go into every single file that uses keras and update the import statements, even though the Conv2D function in the new namespace behaves in exactly the same manner with regards to its inputs and outputs. Doing things my way only required updating the imports one single time in my import_ml package, and it was all done. I could keep typing Conv2D everywhere without having to worry about anything else, since all these other Python files had a simple from import_ml import * at the top.
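In other words, the migration was a one-line change in the central file (sketch, same assumptions as above):

```python
# import_ml/__init__.py
# from keras.layers import Conv2D            # old namespace
from tensorflow.keras.layers import Conv2D   # new tf.keras namespace

# Every file with `from import_ml import *` now picks up the
# tf.keras version of Conv2D with no edits of its own.
```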

I've heard this is quite an unconventional thing to do, but I'm very curious as to why it's a bad way to manage imports. That is: separating central import packages by project type, and doing a one-time star-import from the appropriate one in each file. I understand that this way there may be some imports that I never use in a given file, but the computational overhead of importing them is, for me, trivial in comparison to the ease of use this method provides.

So I'm curious: why is it really a bad thing to do imports this way, and why should I instead manually import packages in every single Python file and accept this as an added and necessary time cost for every project?

user4779 (posted 2019-04-25T05:08:39.570)
