Benchmarking in Optical Networks

Call for Papers

Submissions Open: 1 May 2024
Submission Deadline: 1 June 2024

“Most papers present performance results for the new algorithm only for a very small number of problems - rarely more than three. In most cases, one or several of these problems are meaningless synthetic problems [...]. Comparisons to algorithms suggested by other researchers are in many cases not done at all [...].” L. Prechelt, 1994 (1)

Almost 30 years later, we argue that the above statement—made at the time in reference to advances in Artificial Intelligence (AI)—aptly describes much of the research published by the optical networking community. For instance, a review of papers addressing resource allocation in optical networks reveals three main trends. First, a high diversity of topologies, traffic profiles, network configurations, and—to a lesser extent—performance metrics is used in the evaluation of new solutions. Second, performance evaluation is limited to specific cases, often with an arbitrary selection of baseline solutions. Finally, open access to code and data is limited at best.

Today, AI thrives. There is no doubt that benchmarks—such as the popular ImageNet and GLUE (General Language Understanding Evaluation)—have been instrumental in moving the field forward. Good benchmarking practices—including open access to data and code—enable meaningful comparison and fast reproduction of results, leading to better evaluation of new proposals. Indeed, by enabling informed comparisons of different and diverse solutions, such practices are essential for effective progress.

The lack of good benchmarking practices in our community is becoming more problematic as the optical networking field diversifies and encompasses new technologies and ideas. By way of example, the performance of Deep Reinforcement Learning (DRL) solutions is typically compared only to some reasonable—but arbitrarily selected—previous work, making it difficult to determine the real benefits of DRL. Even worse, new DRL papers usually report results in slightly different network scenarios, exacerbating the problem.

This Special Issue in the Journal of Optical Communications and Networking (JOCN) aims to close this gap and contribute to establishing benchmarks and good benchmarking practices for evaluating research published by the optical networking community. The Special Issue will be divided into two parts.

In the first part, we solicit works in the themes of:

  • Reference networks and traffic profiles. Papers recommending a minimum set of reference networks, traffic profiles, and performance metrics that researchers should use when submitting new work. The set should be large enough to represent real and diverse cases, but small enough to avoid excessive computation time when evaluating a new proposal. The networks will cover geographies of different sizes as well as long-haul networks, and should include, as a minimum, the geographic location of nodes and the link lengths. Traffic profiles should include at least one representative traffic matrix; dynamic traffic models of real or expected scenarios are also welcome. Contributions to this theme are by invitation only.
  • Surveys. Papers surveying the proposals found in the literature for any relevant optimisation problem, ranging from well-known problems revisited due to the emergence of Deep Reinforcement Learning (e.g. Routing and Wavelength Assignment) to more recent problems for which surveys have not yet been published (e.g. resource allocation in multiband optical networks).

Submissions to the Special Issue must present original, previously unpublished work. Manuscripts should be prepared according to the usual standards for JOCN and will undergo the normal peer review process. Please review the standard JOCN Publication Charges prior to submission. Authors will upload manuscripts through the online submission system, specifying from the Feature Issue drop-down menu that the manuscript is for the issue on Benchmarking in Optical Networks.

The papers accepted in the first part of this Special Issue will provide the reference networks, traffic profiles, and algorithmic approaches to be benchmarked in the second part, where we will solicit works reporting on the benchmarking of different solutions for relevant resource allocation problems in optical networks. The timeline for part two will be confirmed after publication of the papers in part one of the Special Issue.

Editors

Alejandra Beghelli (Lead Editor), Optical Networks Group, University College London, UK
George Rouskas, North Carolina State University, USA
Paul Wright, British Telecom, UK

(1) L. Prechelt, “PROBEN1—A set of neural network benchmark problems and benchmarking rules”, Technical Report 21/94, Universität Karlsruhe, 1994.