I have developed a set of QGIS processing scripts that rely on background data stored in a PostgreSQL database. During development I had the database on my localhost server and connected with hardcoded credentials in the QgsDataSourceUri connection. Now we have a common server and I want to enable my colleagues to use the scripts as well. I have migrated the database to the new server, and each user who runs a script needs to authenticate with their own credentials. My first attempt to solve this is based on this great answer and works well so far:
from qgis.core import QgsDataSourceUri, QgsVectorLayer
from qgis.gui import QgsCredentials

uri = QgsDataSourceUri()
# assign this information before you query the QgsCredentials data store
uri.setConnection("HOST", "PORT", "DB_NAME", None, None)
connInfo = uri.connectionInfo()
# prompt the user for username and password for this connection
(success, user, passwd) = QgsCredentials.instance().get(connInfo, None, None)
if success:
    uri.setUsername(user)
    uri.setPassword(passwd)
    uri.setDataSource("SCHEMA", "TABLE_NAME", "the_geom")
    layer = QgsVectorLayer(uri.uri(), "Layer", "postgres")
The drawback of this method is that users need to enter their credentials every time they run a script, even when they have already authenticated by entering their master password and already have access to the database through the QGIS Browser.
Is there a way to use the authentication manager in processing scripts, so that users only have to authenticate once per session rather than each time they run a script?
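From what I have read about QgsDataSourceUri, I imagine something like the following, where the connection references a stored authentication configuration instead of raw credentials. This is only a sketch, and I am not sure it is the intended approach: "myauthcfg" is a placeholder for an authentication configuration ID that each user would first have to create in their own QGIS authentication database.

from qgis.core import QgsDataSourceUri, QgsVectorLayer

uri = QgsDataSourceUri()
# "myauthcfg" is a placeholder for the ID of a PostgreSQL authentication
# configuration stored in the user's QGIS authentication database
uri.setConnection("HOST", "PORT", "DB_NAME", None, None,
                  QgsDataSourceUri.SslPrefer, "myauthcfg")
uri.setDataSource("SCHEMA", "TABLE_NAME", "the_geom")
layer = QgsVectorLayer(uri.uri(), "Layer", "postgres")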
What security implications would this have?
I would be very grateful for any advice!