March 29, 2024

Information

NTT has achieved the highest performance to date on the benchmark launched in the federated learning competition at NeurIPS 2023, one of the largest international conferences on AI.

Nippon Telegraph and Telephone Corporation (Head Office: Chiyoda-ku, Tokyo; President and CEO: Akira Shimada; hereinafter "NTT") has achieved the highest performance to date in "Privacy Preserving Federated Learning Document Visual Question Answering"*1 (hereinafter "PFL benchmarks"), the first competition on federated learning, held at a workshop co-located with NeurIPS 2023, an international conference on challenging issues in the field of AI and machine learning. The PFL benchmarks remain open, and the latest results are publicly available.

Federated learning is a technology that enables AI models to be trained across multiple organizations without aggregating the owners' entire data sets at a single site. It has attracted much attention because it addresses the challenges of handling private or confidential information, which is difficult to take outside the data owner's organization. In the PFL benchmarks, participants are asked to train AI models, under federated learning setups, that automatically manage information in document workflows that tend to contain private or confidential information (e.g., invoices and business manuals). The objective is to train AI models that achieve the best performance on question answering over those documents. In addition, differential privacy*2, the de facto standard for privacy-aware AI model training, is applied, and each method must satisfy a pre-defined privacy level based on differential privacy.
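
As a rough illustration of the workflow described above, the following sketch shows a generic federated-averaging loop on a toy linear-regression task in Python. It is only a minimal, hypothetical example under simplified assumptions: it is not NTT's algorithm, the benchmark code, or a differentially private method, and all data, function names, and parameters are invented for illustration.

    import numpy as np

    def local_update(weights, X, y, lr=0.1, epochs=5):
        """Each organization trains on its own data; only the weights leave the site."""
        w = weights.copy()
        for _ in range(epochs):
            grad = X.T @ (X @ w - y) / len(y)  # gradient of the mean squared error
            w -= lr * grad
        return w

    def federated_averaging(clients, rounds=20, dim=3):
        """The server averages client weights without ever seeing the raw data."""
        global_w = np.zeros(dim)
        for _ in range(rounds):
            local_ws = [local_update(global_w, X, y) for X, y in clients]
            global_w = np.mean(local_ws, axis=0)  # FedAvg aggregation step
        return global_w

    # Three "organizations", each holding private data that never leaves its site.
    rng = np.random.default_rng(0)
    true_w = np.array([1.0, -2.0, 0.5])
    clients = []
    for _ in range(3):
        X = rng.normal(size=(50, 3))
        y = X @ true_w + rng.normal(scale=0.1, size=50)
        clients.append((X, y))

    print(federated_averaging(clients))  # approaches true_w as rounds increase

In a privacy-preserving variant such as the one the benchmark requires, each local update would additionally be clipped and perturbed with calibrated noise so that the overall training satisfies a differential-privacy guarantee.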

NTT has been actively engaged in research and development of federated learning technologies that achieve high accuracy while guaranteeing a pre-defined privacy level based on differential privacy.*3 Researchers Takumi Fukami, Yusuke Yamasaki, Iifan Tyou, and Kenta Niwa worked on the PFL benchmarks and improved NTT's federated learning algorithm to achieve the highest performance to date.

Through the development of federated learning, which is expected to both protect and utilize information, NTT aims to promote the use of data that is distributed and accumulated across organizations, and to put these technologies into practical use as IOWN PETs*4 functions promoted by NTT.

*1Privacy Preserving Federated Learning Document Visual Question Answering: A competition on federated learning held at a workshop co-located with NeurIPS 2023 and continuously updated as the first open benchmark on the safety of federated learning
https://neurips.cc/virtual/2023/competition/66580
https://benchmarks.elsa-ai.eu/?ch=2&com=evaluation&task=2

*2Differential privacy: A quantitative measure of the strength of data privacy protection. It offers a mathematically provable metric for statistical analysis, independent of data characteristics. The U.S. Census Bureau has also applied a disclosure avoidance system based on differential privacy to protect individual data.
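
As a rough illustration of this idea (not the mechanism used in the PFL benchmarks), the classic Laplace mechanism below releases a numeric statistic with epsilon-differential privacy by adding noise whose scale is calibrated to the query's sensitivity and to epsilon; all values and names are hypothetical.

    import numpy as np

    def laplace_mechanism(true_value, epsilon, sensitivity=1.0, rng=None):
        """Release a statistic with epsilon-differential privacy by adding Laplace
        noise of scale sensitivity/epsilon (smaller epsilon = stronger privacy)."""
        rng = rng or np.random.default_rng()
        return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

    rng = np.random.default_rng(0)
    print(laplace_mechanism(1000, epsilon=1.0, rng=rng))  # small noise, weaker privacy
    print(laplace_mechanism(1000, epsilon=0.1, rng=rng))  # larger noise, stronger privacy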

*3I. Tyou, T. Murata, T. Fukami, Y. Takezawa, and K. Niwa, "A localized primal-dual method for centralized/decentralized federated learning robust to data heterogeneity," in IEEE TSIPN, 2023.
https://ieeexplore.ieee.org/abstract/document/10373878

*4IOWN PETs (IOWN Privacy Enhancing Technologies): To realize free and active data cooperation in the IOWN era, NTT aims to create a world without plaintext that enables safe data distribution by technically ensuring that data is used only within the data owner's policy throughout its life cycle, from the moment it is generated until it disappears.
https://www.rd.ntt/e/sil/project/iown-pets/iown-pets.html

Contact for press inquiries regarding this matter

Nippon Telegraph and Telephone Corporation
Service Innovation Research Institute
Corporate Communications Dept.
nttrd-pr@ml.ntt.com

Information is current as of the date of issue of each topic.
Please note that information may become outdated after that point.