App Connect Enterprise Vault

March 29, 2022

Since App Connect Enterprise v11 Fix Pack 6, a new way of storing data records is provided. A vault can be used to encrypt and decrypt data records using symmetric encryption, a type of encryption where a single secret key is used both to encrypt and to decrypt information. In this case, AES 256-bit encryption is used: plaintext data is concealed using the AES algorithm with a key length of 256 bits, which is the largest and most mathematically complex AES key size, and therefore the hardest to crack.

This makes the vault a much better option than the older way of storing credentials with "mqsisetdbparms" for clients where security is a key concern. mqsivault is also geared more towards standalone integration servers and containerization. Moving forward, this will be the new way of working once "mqsisetdbparms" eventually becomes deprecated.

mqsivault

Before you can store encrypted credentials for an integration node or integration server, you must configure an App Connect Enterprise vault. Each independent integration server and each integration node has its own vault, with its own vault key. An integration node's vault is shared by all the integration servers that it manages: each managed integration server has its own credentials stored in the vault, but all the credentials in the vault are accessed with the same vault key.

You can use the mqsivault command to create or destroy a vault, change or verify a vault key, or retrieve credentials from the vault. The vault stores the credentials (in encrypted form), and the integration node or server uses them to access secured resources.

If a vault has been created in the work directory, the vault key must be specified when starting the integration server. The vault key can be stored in a .mqsivaultrc file, which can be used instead of specifying the vault key as a parameter value on a command.

For more information about this command, see mqsivault command.

mqsicreatebroker

If you create an integration node by running the mqsicreatebroker command, you can create a vault for that integration node at the same time by specifying either the --vault-key or the --vaultrc-location parameter on the command. If either parameter is specified, an App Connect Enterprise vault is created to hold the credentials that the integration node uses to access secured resources.

The following parameters can be used to create the vault directly when creating a new integration node:

--vault-key <vaultKey>

This parameter specifies the vault key to be used for creating the vault. If this parameter is specified on the command, an App Connect Enterprise vault is created.

--vaultrc-location <mqsivaultrc_file_location>

This parameter specifies the location of the .mqsivaultrc file that is used to locate the vault key. If this parameter is specified on the command, an App Connect Enterprise vault is created.

The full command would look as follows:
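The exact command was shown as a screenshot in the original post; a minimal sketch based on the parameters described above (the node name and vault key are placeholders) could look like this:

mqsicreatebroker <Integration_Node_Name> --vault-key <vaultKey>

or, when the vault key is stored in a .mqsivaultrc file:

mqsicreatebroker <Integration_Node_Name> --vaultrc-location <mqsivaultrc_file_location>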

For more information about this command, see mqsicreatebroker command.

mqsicredentials

Use the mqsicredentials command to create, update, retrieve, or delete the security credentials for resources that are used by an integration node or integration server. The credentials are stored in an encrypted form in the App Connect Enterprise vault.

For more information about this command, see mqsicredentials command.

Test Setup

To get familiar with the new commands and to understand how they differ from the old way of storing credentials, a POC was done. The test setup for this POC is as follows:

  1. Create a small flow that requires a secure connection to a certain resource.
    • In my test this will be an HTTP flow that connects to a data source and sends back the result using HTTP Reply.
  2. Deploy the flow to an ACE instance with minimum fix pack version 6.
    • I will be using ACE v12 Fix pack 3 for this test.
  3. Make sure the connection to the secured resource is configured correctly on that server.
    • For my test I will be adding the data source to the odbc.ini file.
  4. Create the ACE Vault using the mqsivault command.
  5. Create the encrypted credential records using the mqsicredentials command.
  6. Decrypt and check if the credentials are stored correctly.
  7. Test if you can create a second set of the same credentials that are already defined using the mqsisetdbparms command.
  8. Test what happens when you want to update the credentials. Is a restart required? Do I need to stop the integration server before executing the command?
  9. Test the behavior of the --all-integration-servers parameter and determine when this would be useful.

Execution

Creating the flow

For this test, I have created a simple flow that uses an HTTP Input node to trigger a DB query to a secured data source. When the data is received, it is transformed into JSON format and sent back to the caller.

Using HTTPS we can trigger the flow by doing a GET request to "https://<host>:<port>/pull/amount/data".
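For example, a test call from the command line could look as follows (the host and port are placeholders, and -k is only needed if the HTTPS listener uses a self-signed certificate):

curl -k "https://<host>:<port>/pull/amount/data"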

Using a Compute node, we perform a query against the data source that is defined in its properties.

For executing the query the following code will be used (the ViewName is defined as a UDP directly at the flow level: ACL_CCT_CustomerLedger_Summary):
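The ESQL itself was shown as a screenshot in the original post. A minimal sketch of what the Compute node could contain is shown below; the module name, the SELECT statement, and the shape of the output tree are assumptions, while the UDP declaration and the use of the data source configured on the node follow the description above.

CREATE COMPUTE MODULE PullData_Compute
    -- ViewName is promoted as a user-defined property (UDP) at flow level
    DECLARE ViewName EXTERNAL CHARACTER 'ACL_CCT_CustomerLedger_Summary';

    CREATE FUNCTION Main() RETURNS BOOLEAN
    BEGIN
        -- Build a JSON array and fill it with the rows returned by PASSTHRU;
        -- the query runs against the data source configured on the Compute node.
        CREATE FIELD OutputRoot.JSON.Data.rows IDENTITY(JSON.Array)rows;
        SET OutputRoot.JSON.Data.rows.Item[] = PASSTHRU('SELECT * FROM ' || ViewName);
        RETURN TRUE;
    END;
END MODULE;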

The next step will be to send back the result of the query in a more readable fashion. 

Lastly, the result is sent back to the caller using the HTTP Reply node.

Testing the flow

After deploying the flow to the integration server that we will be using for this POC, we can check whether the flow provides the expected result.

Because no error handling was added to the flow, a default error is thrown by the node, providing us with a list of reasons why the connection to the data source might have failed. Of course, we haven't created the vault yet, nor have we added the credentials for accessing the data source using mqsisetdbparms. The following error in the response is therefore the most likely cause: BIP2348E: Error detected while attempting to obtain a connection to data source O2015_DB_TST of type ODBC using userid mqsiUser.

With this test, we at least know that the flow can be called and that creating the vault and credentials should resolve this issue.

Creating the Vault

We will be doing our test on the following integration node:

mqsilist <Integration_Node_Name>

Before creating the vault we will have to stop the integration node. If we don't stop the integration node the commands to create the vault cannot be executed. 

mqsistop <Integration_Node_Name>

Verify that the integration node is stopped using mqsilist <Integration_Node_Name>

Create a .mqsivaultrc file that contains the vault key in the home directory of the runtime user. This is also the location where the integration node looks for the file by default.

This can be done using the mqsivault command in combination with the --vaultrc-store-key parameter:

mqsivault <Integration_Node_Name> --vaultrc-store-key --vault-key <vaultkey> --vaultrc-location <mqsivaultrc_file_location>

Create the App Connect Enterprise vault. Because the .mqsivaultrc file was created in the home directory, we don't have to provide its location in the command; it will be found automatically. When using the --vault-key option instead, we would have to provide the vault key every time we execute a vault command.
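The create command itself was shown as a screenshot; based on the parameters used elsewhere in this post, it could look like this (the vault key is picked up from the .mqsivaultrc file):

mqsivault <Integration_Node_Name> --create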

Start the integration node. Again, providing the location of the .mqsivaultrc file is optional in this case because we saved it in the home directory.

mqsistart <Integration_Node_Name> OR (mqsistart <Integration_Node_Name> --vault-key myvaultkey)

To destroy the vault when it is no longer needed, we can use the following command:

mqsivault <Integration_Node_Name> --destroy

Creating the Credentials

Now it is time to create the credentials that we will use in our test. This can be done while the integration server is running and WILL NOT require a restart when creating the credentials for the first time. It will, however, require a restart when updating the credentials.


mqsicredentials <Integration_Node_Name> -e <Integration_Server_Name> --create --credential-name <DSN_NAME> --credential-type odbc --username <USERNAME> --password <PASSWORD>

To check whether the credentials we created are configured correctly, we can report them using the following command:

mqsicredentials <Integration_Node_Name> -e <Integration_Server_Name> --report

As you can see in the output below, the mqsisetdbparms credentials are also shown. This makes migrating to the vault easier, as no credentials with the same DSN can be created in both the vault and mqsisetdbparms.

To check whether the credentials we created also have the correct password configured, we can decrypt and report them using the following command. This can also be useful when credentials were not documented properly. (Retrieving credentials was never possible with mqsisetdbparms, so this is a huge advantage compared to the old way of storing credentials.)

mqsivault <Integration_Node_Name> -e <Integration_Server_Name> --decode credentials/odbc/<DSN_NAME> --vault-key myvaultkey

For this command we do need the vault key for security reasons. 

Deleting the credentials can be done using the following command:

mqsicredentials <Integration_Node_Name> -e <Integration_Server_Name> --delete --credential-name <DSN_NAME> --credential-type odbc

Testing the flow with MQSIVAULT

Now that everything is configured correctly we can test the flow.

Make sure the integration node is running:

Test the flow again using Postman:

As you can see the flow is now working correctly and using the credentials we defined in the vault.

Extra Tests

1. Test if you can create a second set of the same credentials that are already defined using the mqsisetdbparms command. In the earlier report of all stored credentials, we noticed there was already an entry for LINEMAIL01:
BIP15110I: The credential name 'LINEMAIL01' of type 'email' contains user name 'fmb_150323162759' from provider 'setdbparms' and has the following properties defined: 'password'

We will now try to create the same credentials in the vault, using a command along the lines of the sketch below. As expected, this is not possible, and we get the desired result.
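The attempted command was shown as a screenshot; a hedged reconstruction based on the report output above (username and password are placeholders) could be:

mqsicredentials <Integration_Node_Name> -e <Integration_Server_Name> --create --credential-name LINEMAIL01 --credential-type email --username <USERNAME> --password <PASSWORD>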

2. Test what happens when you want to update the credentials. Is a restart required? Do I need to stop the integration server before executing the command?

As you can see, we first need to stop the integration node before executing this command. It is not possible to update credentials on the fly, because a flow might still be using them, which could cause a transaction to fail.
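A hedged sketch of that sequence, assuming an --update parameter analogous to --create (all values are placeholders):

mqsistop <Integration_Node_Name>
mqsicredentials <Integration_Node_Name> -e <Integration_Server_Name> --update --credential-name <DSN_NAME> --credential-type odbc --username <USERNAME> --password <NEW_PASSWORD>
mqsistart <Integration_Node_Name>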

3. Test the behavior of the --all-integration-servers parameter and determine when this would be useful.

Because we only have one integration server per integration node, the command only shows that single integration server rather than a list of multiple integration servers. It is important to keep in mind that this command adds credentials for all integration servers managed by the node. This means that, should we later move to containers or standalone integration servers, it would be harder to know which credentials are used by which integration server. In my opinion the only real benefit is when the same credentials are genuinely used by all integration servers. Using this parameter for credentials that are not needed by all integration servers could also be a security risk.
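For reference, a hedged example of using this parameter to create the same credential on every integration server managed by the node (all values are placeholders):

mqsicredentials <Integration_Node_Name> --all-integration-servers --create --credential-name <DSN_NAME> --credential-type odbc --username <USERNAME> --password <PASSWORD>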

Independent Integration Server

Independent integration servers (not associated with an integration node) are very useful to get up and running quickly with App Connect Enterprise, especially if you are in the development phase of a project or trying out the product for the first time. If you are planning to run App Connect Enterprise in conjunction with a container framework such as Kubernetes or IBM Cloud Private, then it is the responsibility of that framework to ensure that the servers remain running (or are restarted appropriately), so in that situation independent integration servers are the better choice.

Because it would be interesting to also work with an App Connect Enterprise vault in such a setup, it is important to know how this should be configured and how it differs from a setup with an integration node that manages the integration server.

Creating the Vault

We will be doing our test on the following independent integration server:

IntegrationServer --work-dir <Working Directory of Integration Server>

Before creating the vault we have to stop the independent integration server; otherwise the commands to create the vault cannot be executed. This can be done by killing the process of the running integration server or by using the REST admin API.

POST http://localhost:7600/apiv2/shutdown
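For example, using curl against the default admin port shown above:

curl -X POST http://localhost:7600/apiv2/shutdown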

When executing this command the following information will be shown in the console window.

REST API

The REST API documentation can always be retrieved using the following URL:

http://localhost:7600/apidocs

Create the App Connect Enterprise Vault.

mqsivault --work-dir c:\myaceworkdir --vault-key vaultKey --create

Creating the Credentials

Now it is time to create the credentials that we will use in our test. This can be done while the integration server is running and WILL NOT require a restart when creating the credentials for the first time. It will, however, require a restart when updating the credentials.

mqsicredentials --work-dir c:\myaceworkdir --create --credential-name <DSN_NAME> --credential-type odbc --username <USERNAME> --password <PASSWORD>

To check whether the credentials we created are configured correctly, we can report them using the following command:

mqsicredentials --work-dir c:\myaceworkdir --report

As you can see in the output below, the mqsisetdbparms credentials are also shown. This makes migrating to the vault easier, as no credentials with the same DSN can be created in both the vault and mqsisetdbparms.

To check whether the credentials we created also have the correct password configured, we can decrypt and report them using the following command. This can also be useful when credentials were not documented properly. (Retrieving credentials was never possible with mqsisetdbparms, so this is a huge advantage compared to the old way of storing credentials.)

mqsivault --work-dir c:\myaceworkdir --decode credentials/odbc/<DSN_NAME> --vault-key vaultKey

For this command, we do need the vault key for security reasons. 

Deleting the credentials can be done using the following command:

mqsicredentials --work-dir c:\myaceworkdir --delete --credential-name <DSN_NAME> --credential-type odbc
