# MPAI-NNW v1.1 implementation

This code refers to the implementation of MPAI-NNW under MPAI-AIF, as described in https://mpai.community/wp-content/uploads/2023/10/Reference-Software-Neural-Network-Watermarking-V1.pdf. All the code is written in Python (APIs and AIMs).

**Implemented APIs**

1. MPAI_AIFS_GetAndParseArchive: unzips the archive and parses the .json and the AIMs.
2. MPAI_AIFM_AIM_{Start,Pause,Resume,Stop,GetStatus}: operate on an AIM/AIW.
3. MPAI_AIFM_Port_Input_{Write,Read,Reset}: operate on the input ports of the AIMs.

**Controller/User Agent**

1. The Controller is built on the socket library (waiting for requests from _input.py_); a minimal sketch of this exchange is given after the command lists below.
2. The User Agent can trigger and run commands by sending inputs.
3. _config.py_ shares some variables among the different files.

**Folders**

- **all_AIW** stores all the implemented AIWs:
  - NNW_NNW-QAM, NNW_NNW-QAM-Checker for the Multimodal Question Answering watermarking use case
  - NNWImp, NNWRob for the controller_NNW
- **resources** contains external elements for some use cases (uncorrelated images for ADI, context/question of the MQA, ...)
- **Attacks** contains all the specified attacks of MPAI-NNW under the PyTorch framework.

**Specificity to MPAI-NNW**

1. _utils.py_ contains functions related to the dataset/dataloader under the PyTorch framework.
2. _UCHIDA.py_ / _ADI.py_ correspond to the Neural Network Watermarking technologies under evaluation.
3. AIW.zip is composed of the corresponding .json and the AIMs as Python files.
4. The _Attacks_ folder contains all the specified attacks of MPAI-NNW under the PyTorch framework.

## Installation

The code was designed and tested on Ubuntu 20.04 using Anaconda 23.7.2 and Python 3.9. An environment with all the necessary libraries can be created with:

```bash
conda create --name <env_name> --file requirements.txt
```

## Run

**Initialisation**

First, the Controller should be initialised (the `-W ignore` option can be added to the python command to suppress warning messages during execution):

```bash
conda activate <env_name>
python controller.py
Controller Initialized
```

To send commands to the Controller as a User Agent, open a second terminal and run:

```bash
conda activate <env_name>
python input.py
input:
```

**Emulation of the MPAI Store**

Serve a local folder of the computer as a website using:

```bash
python3 -m http.server
```

The following command then simulates downloading the AIW from a website:

```bash
conda activate <env_name>
python input.py
input: wget http://0.0.0.0:8000/[yourpath]/AIW.zip
```

### List of commands for the controller

This command opens a window for the selection of the AIW.zip archive:

```bash
(env) python input.py
input: getparse
```

This command runs the AIW (after it has been parsed):

```bash
(env) python input.py
input: run all
```

Windows created with tkinter will then prompt for the different files.

**List of commands for controller_NNW** (old, for [AIWImp/AIWRob])

This command opens a window for the selection of the AIW.zip archive:

```bash
(env) python input.py
input: getparse
```

This command sets the Computational Cost flag to ON:

```bash
(env) python input.py
input: ComputationalCost True
```

This command runs the Robustness AIW with Modification **1** and Parameters **{"P":0.5}**:

```bash
(env) python input.py
input: run robustness 1 {"P":0.5}
```

This command runs the Imperceptibility AIW with **vgg16** as the watermarked AIM, trained on the **CIFAR10** dataset:

```bash
(env) python input.py
input: run imperceptibility vgg16 cifar10
```
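For illustration only, the sketch below shows how a User Agent could forward one of the commands above to the Controller over a plain TCP socket, which is how the Controller and _input.py_ communicate. The host, port, buffer size and the `send_command` helper are assumptions made for this example; the actual endpoint and message format are those used by _controller.py_ and _input.py_ (shared through _config.py_).

```python
import socket

# Hypothetical endpoint: the real host/port are the ones configured for the
# Controller (shared between controller.py and input.py via config.py).
CONTROLLER_HOST = "127.0.0.1"
CONTROLLER_PORT = 5000


def send_command(command: str) -> str:
    """Send one text command (e.g. 'getparse' or 'run all') to the Controller
    socket and return its textual reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.connect((CONTROLLER_HOST, CONTROLLER_PORT))
        sock.sendall(command.encode("utf-8"))
        reply = sock.recv(4096)  # single blocking read of the Controller's answer
    return reply.decode("utf-8")


if __name__ == "__main__":
    # Equivalent to typing the command at the `input:` prompt of input.py.
    print(send_command('run robustness 1 {"P":0.5}'))
```

In the reference software these commands are normally typed at the `input:` prompt of _input.py_ rather than sent programmatically.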
### Some warnings

1. The AIW must be named AIW.zip and contain the .json and the needed AIMs.
2. The code does not tolerate misspelled commands.

# Licence

Licence information is detailed on the MPAI website.