
Feature/rb 143 compression pack network #115

Merged
Saverio976 merged 21 commits into dev from feature/RB-143-compression-pack-network on Oct 21, 2023

Conversation


@romainpanno romainpanno commented Oct 18, 2023

  • Please check if the PR fulfills these requirements:
      • The commit message follows our guidelines
      • Tests for the changes have been added (for bug fixes / features)
      • Docs have been added / updated (for bug fixes / features)
      • Norm of the code has been respected
  • In what state is your pull request?
      • Ready to merge / Waiting for a reviewer to look at my work.
      • Work In Progress (WIP) / My work is not finished but I want daily reviews. (Draft)
      • CI Review / I want to see what the CI thinks of my work.
      • Need feedback / Just to get feedback on what I produced. (May not be merged)
  • What kind of change does this PR introduce? (You can choose multiple)
      • Bug fix
      • Feature request
      • New / Updated documentation
      • Testing CI (Make the pull request in draft mode)
  • What is the current behavior? (link an issue based on the kind of change this PR introduces)

  • What is the new behavior (if this is a feature change)?

  • Other information:

Summary by CodeRabbit

  • New Feature: Integrated the Zstandard (Zstd) compression library into network data transmission. Packets are compressed before sending and decompressed on receipt, reducing the bandwidth used by data transfer.
  • New Feature: Added a new class Zstd to provide static functions for data compression and decompression, improving the efficiency of data handling.
  • Documentation: Updated the "Network Packet Global Structure" section in RFC.md to include the use of Zstd for packet compression and decompression.
  • Chore: Modified the build process to include the Zstd library and optimized the build process for parallel execution, speeding up the overall build time.
  • Chore: Updated the GitHub workflow to include a new step for handling Zstd library in the release process.

@romainpanno romainpanno added the enhancement (New feature or request) label on Oct 18, 2023
@romainpanno romainpanno self-assigned this Oct 18, 2023

coderabbitai bot commented Oct 18, 2023

Walkthrough

The changes introduce the Zstandard (zstd) compression library to the project, modifying the network packet structure to include compressed data. The Zstd class provides static functions for data compression and decompression. The build process is updated to fetch and link the zstd library, and the GitHub workflow is adjusted to include zstd in the release process.

Changes

File(s) / Summary
deps/CMakeLists.txt, deps/zstd/CMakeLists.txt: Added zstd as a new dependency and configured its build process.
src/Nitwork/ANitwork.cpp, src/Nitwork/ANitwork.hpp: Modified the network packet handling to include data compression and decompression using zstd.
src/Nitwork/Zstd.hpp: Introduced a new class Zstd for data compression and decompression using zstd.
docs/network/rfc/RFC.md: Updated the network packet structure documentation to include data compression.
.github/workflows/release.yml: Modified the GitHub workflow to include zstd in the release process.
scripts/compil.sh: Updated the build command to enable parallel building.

"With a hop and a skip, we compress and zip, 🐇💼
Packets now lighter, our network much sprighter! 🚀✨
Zstd joins the fray, to keep lag at bay, 🎉🐢
Celebrate we must, for in CodeRabbit we trust! 🎊🐇"



Tips

Chat with CodeRabbit Bot (@coderabbitai)

  • If you reply to a review comment from CodeRabbit, the bot will automatically respond.
  • To engage with CodeRabbit bot directly around the specific lines of code in the PR, mention @coderabbitai in your review comment
  • Note: Review comments are made on code diffs or files, not on the PR overview.
  • Add @coderabbitai ignore anywhere in the PR description to prevent this PR from being reviewed.

CodeRabbit Commands (invoked as PR comments)

  • @coderabbitai pause to pause the reviews on a PR.
  • @coderabbitai resume to resume the paused reviews.
  • @coderabbitai review to trigger a review. This is useful when automatic reviews are disabled for the repository.

Note: For conversation with the bot, please use the review comments on code diffs or files.


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 3

Details

Commits: files that changed from the base of the PR, between a0692a4 and 7673130.
Files selected for processing (5)
  • deps/CMakeLists.txt (1 hunks)
  • deps/zstd/CMakeLists.txt (1 hunks)
  • src/Nitwork/ANitwork.cpp (3 hunks)
  • src/Nitwork/ANitwork.hpp (3 hunks)
  • src/Nitwork/Zstd.hpp (1 hunks)
Files skipped from review due to trivial changes (1)
  • deps/CMakeLists.txt
Additional comments (Suppressed): 8
deps/zstd/CMakeLists.txt (1)
  • 1-27: The changes in this hunk look good. The Zstandard library is fetched and linked statically to both the client and server projects. The use of FetchContent is a good practice for managing dependencies in CMake. The static linking of the library ensures that the binary can run on systems without the Zstandard library installed. The version of the library is also explicitly specified, which is a good practice to ensure consistent behavior across different build environments.
src/Nitwork/ANitwork.hpp (4)
  • 14-17: The inclusion of the Zstd library is a significant change. Ensure that the library is correctly linked and that the build system is properly configured to include it. Also, verify that the library is compatible with the rest of the codebase and the target platforms.

  • 62-79: The data compression using Zstd is a good addition for optimizing network data transmission. However, it's important to consider the trade-off between CPU usage and network bandwidth. Compression algorithms like Zstd can be CPU-intensive, which might not be ideal for systems with limited CPU resources. Also, ensure that the receiving end of the data transmission is capable of decompressing the data.

  • 65-65: The Zstd::compress function is used here to compress the data before sending. Ensure that this function handles errors properly and is able to compress the data as expected. Also, verify that the function does not cause any memory leaks or other resource management issues.

  • 70-79: The error handling in the async_send_to function is well done. It checks for errors and logs them appropriately. However, it would be beneficial to add more specific error messages to help with debugging. For example, the message "NITWORK: Package not sent" could be more descriptive, indicating why the package was not sent.

src/Nitwork/ANitwork.cpp (3)
  • 8-10: The order of the #include directives has been changed. Ensure that this does not affect the compilation or the functionality of the code. The order of #include directives can sometimes matter, especially when there are dependencies or macros that need to be defined in a certain order.

  • 127-131: The callReceiveHandler function now logs an error message before starting the receive handler. This is a good practice for debugging and understanding the flow of the program. However, ensure that the logging does not introduce significant overhead, especially if this function is called frequently.

  • 136-147: The code now checks the size of the packet after decompression against MAX_PACKET_SIZE and the size of the header. This is a good practice to prevent buffer overflow attacks and to ensure that the packet is valid. However, the error message "header not received" may not be accurate if the packet size is greater than MAX_PACKET_SIZE. Consider using a more descriptive error message.
    Also, the decompression of the received buffer is done after the size check. This is a good practice as it prevents unnecessary decompression of invalid packets.

- callReceiveHandler("header not received");
+ callReceiveHandler("Invalid packet size");
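The size validation discussed above can be sketched in isolation. This is a minimal illustration only: the helper name, the header-size parameter, and the MAX_PACKET_SIZE value are assumptions, not the project's actual code.

```cpp
#include <cstddef>

// Hypothetical buffer limit; the project defines its own MAX_PACKET_SIZE.
constexpr std::size_t MAX_PACKET_SIZE = 1400;

// Validate the size reported for a packet before any buffer is filled with
// that many bytes: reject packets too small to contain a header, and packets
// larger than the fixed receive buffer (which would risk an overflow on copy).
bool isValidPacketSize(std::size_t packetSize, std::size_t headerSize)
{
    return packetSize >= headerSize && packetSize <= MAX_PACKET_SIZE;
}
```

Performing this check before touching the payload is what prevents an attacker-supplied size field from driving an out-of-bounds write.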


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 4

Details

Commits: files that changed from the base of the PR, between 7673130 and 027bad8.
Files selected for processing (2)
  • src/Nitwork/ANitwork.cpp (2 hunks)
  • src/Nitwork/Zstd.hpp (1 hunks)
Additional comments (Suppressed): 1
src/Nitwork/ANitwork.cpp (1)
  • 127-131: The callReceiveHandler function now logs an error message before starting the receive handler. This is a good improvement for error handling and debugging. However, ensure that the Logger::error function is thread-safe, as callReceiveHandler might be called from multiple threads.


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 1

Details

Commits: files that changed from the base of the PR, between 027bad8 and 0ccc214.
Files selected for processing (1)
  • src/Nitwork/Zstd.hpp (1 hunks)
Additional comments (Suppressed): 2
src/Nitwork/Zstd.hpp (2)
  • 20-37: The compress function is well implemented. It checks if the data type is POD (Plain Old Data), calculates the maximum size of the compressed data, compresses the data, checks for errors, and resizes the vector to the actual size of the compressed data. However, it's worth noting that the compression level is set to 1, which is the fastest but least compressed. If you need more compression and can afford the extra CPU usage, consider increasing this value.

  • 55-58: The getFrameContentSize function is straightforward and doesn't seem to have any issues. It correctly uses the ZSTD library function to get the size of the decompressed content.


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 4

Details

Commits: files that changed from the base of the PR, between 0ccc214 and 5b4d88c.
Files selected for processing (1)
  • src/Nitwork/Zstd.hpp (1 hunks)


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 1

Details

Commits: files that changed from the base of the PR, between 5b4d88c and 9bfe1ab.
Files selected for processing (1)
  • src/Nitwork/Zstd.hpp (1 hunks)
Additional comments (Suppressed): 3
src/Nitwork/Zstd.hpp (3)
  • 34-37: The compress function checks if there was an error during compression using ZSTD_isError. If there was an error, it throws a runtime error with a message that includes the error name. This is a good practice as it helps with debugging.

  • 42-50: The decompress function decompresses the input data and checks if there was an error during decompression using ZSTD_isError. If there was an error, it throws a runtime error with a message that includes the error name. This is a good practice as it helps with debugging.

  • 54-60: The getFrameContentSize function gets the size of the decompressed content and checks if there was an error during this operation using ZSTD_isError. If there was an error, it throws a runtime error with a message that includes the error name. This is a good practice as it helps with debugging.


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 3

Details

Commits: files that changed from the base of the PR, between 9bfe1ab and f47b105.
Files selected for processing (1)
  • src/Nitwork/Zstd.hpp (1 hunks)


@Saverio976 Saverio976 left a comment

The RFC needs to be updated.


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 2

Details

Commits: files that changed from the base of the PR, between f47b105 and 22d3e79.
Files selected for processing (2)
  • docs/network/rfc/RFC.md (1 hunks)
  • src/Nitwork/Zstd.hpp (1 hunks)
Additional comments (Suppressed): 2
docs/network/rfc/RFC.md (1)
  • 142-147: The new hunk introduces the use of the zstd library for compressing and decompressing network packets. Ensure that the implementation of this feature does not introduce any performance or compatibility issues. Also, verify that the zstd library is compatible with the platforms that the game is intended to run on.
src/Nitwork/Zstd.hpp (1)
  • 54-56: The check for decompressedSize > MAX_PACKET_SIZE should be done before calling ZSTD_decompress. If the decompressed size is larger than MAX_PACKET_SIZE, it could lead to a buffer overflow. Consider moving this check before the call to ZSTD_decompress.
                std::array<char, MAX_PACKET_SIZE> decompressedArray = {0};
                size_t decompressedSize =
                    ZSTD_decompress(decompressedArray.data(), MAX_PACKET_SIZE, data.data(), size);
-
                if (ZSTD_isError(decompressedSize) != 0U) {
                    throw std::runtime_error(
                        std::string("ZSTD: Error while decompressing: ")
                        + ZSTD_getErrorName(decompressedSize));
                }
+                if (decompressedSize > MAX_PACKET_SIZE) {
+                    throw std::runtime_error("ZSTD: Decompressed size is too big");
+                }
                return decompressedArray;

Overall, the code is well-structured and follows good practices. The use of std::is_standard_layout_v and std::is_trivial_v to check if the data is POD (Plain Old Data) is a good practice to ensure that the data can be safely serialized and deserialized. The error handling is also well done, with clear error messages and checks for errors after each ZSTD function call.
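The POD guard praised above can be sketched as a small standalone template. The function name below is illustrative, not the project's actual API; the point is that std::is_standard_layout_v and std::is_trivial_v reject unsafe types at compile time, before any raw bytes are copied for compression.

```cpp
#include <cstring>
#include <type_traits>
#include <vector>

// Hypothetical serializer mirroring the guard in Zstd::compress: only types
// whose byte representation is safe to copy verbatim are accepted.
template <typename T>
std::vector<char> toBytes(const T &value)
{
    static_assert(std::is_standard_layout_v<T> && std::is_trivial_v<T>,
                  "only POD types can be serialized as raw bytes");
    std::vector<char> out(sizeof(T));
    std::memcpy(out.data(), &value, sizeof(T));
    return out;
}
```

A call with, say, a std::string member would fail to compile rather than silently serializing pointers.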


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 0

Details

Commits: files that changed from the base of the PR, between 22d3e79 and 9dfbece.
Files selected for processing (2)
  • docs/network/rfc/RFC.md (1 hunks)
  • src/Nitwork/Zstd.hpp (1 hunks)
Additional comments (Suppressed): 4
src/Nitwork/Zstd.hpp (3)
  • 22-40: The compress function checks if the input data is of Plain Old Data (POD) type and if its size is within the limit that Zstandard can handle. It then compresses the data and returns it as a vector of characters. If any of these operations fail, it throws a runtime error with an appropriate message. This is a good practice as it ensures that the function fails fast and provides useful information for debugging.

  • 45-60: The decompress function takes a data array and its size, decompresses the data, and returns it as an array of characters. It checks if the decompression was successful and if the decompressed size is within the limit. If any of these checks fail, it throws a runtime error with an appropriate message. This is a good practice as it ensures that the function fails fast and provides useful information for debugging.

  • 63-72: The getFrameContentSize function takes a data array and returns the size of its decompressed content. It checks if the operation was successful and if it fails, it throws a runtime error with an appropriate message. This is a good practice as it ensures that the function fails fast and provides useful information for debugging.
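The fail-fast pattern the three functions share can be expressed generically. In this sketch the isError and describe callbacks stand in for ZSTD_isError and ZSTD_getErrorName; the wrapper name and shape are illustrative assumptions, not the project's code.

```cpp
#include <cstddef>
#include <stdexcept>
#include <string>

// Check a library result code with its error predicate and convert failures
// into a descriptive std::runtime_error at the call site; return it otherwise.
template <typename IsError, typename Describe>
std::size_t checkedResult(std::size_t code, IsError isError, Describe describe,
                          const std::string &context)
{
    if (isError(code)) {
        throw std::runtime_error("ZSTD: Error while " + context + ": " + describe(code));
    }
    return code;
}
```

Centralizing the check this way keeps every compress/decompress call site to one line while preserving the per-operation error message.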

docs/network/rfc/RFC.md (1)
  • 142-147: The changes made to the "Network Packet Global Structure" section of the RFC document accurately reflect the introduction of the Zstandard (zstd) compression library to the project. The new text explains that packets are now compressed with zstd before being sent over the network and decompressed upon receipt. This is a good update to the documentation, ensuring it stays current with the codebase.

However, it might be beneficial to include a brief explanation or a link to more information about the zstd library for those unfamiliar with it. This would provide context and help developers understand why this specific compression library was chosen.

Example:

    The packets are compressed using the zstd library before being sent over
    the network using the UDP protocol. Upon receipt, the packets are
    decompressed using the zstd library
+     Zstandard (zstd) is a fast lossless compression algorithm, providing high compression ratios. It also offers a very wide range of compression/speed trade-off. More information can be found [here](https://facebook.github.io/zstd/).


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 1

Details

Commits: files that changed from the base of the PR, between 9dfbece and ac29cf8.
Files selected for processing (1)
  • .github/workflows/release.yml (5 hunks)
Additional comments (Suppressed): 4
.github/workflows/release.yml (4)
  • 3-5: The new branch fix/Nitwork-tiks-and-enemydeath has been added to the list of branches that trigger the workflow. Ensure that this branch exists and is intended to trigger the workflow.

  • 11-11: A new environment variable ZSTD_TAG has been added. This variable is used to checkout a specific tag of the zstd repository. Ensure that the tag v1.5.5 exists in the zstd repository.

  • 119-119: The zstd-to-tar job has been added as a dependency for the release-windows, release-linux, and release-macos jobs. Ensure that the zstd-to-tar job completes successfully before these jobs are run.

  • 225-259: A new job zstd-to-tar has been added to the workflow. This job clones the zstd repository, checks out a specific tag, creates a tar file, and uploads it to the release or artifact based on the branch. Ensure that the commands in this job are correct and work as expected.


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 0

Details

Commits: files that changed from the base of the PR, between ac29cf8 and aaab9e4.
Files selected for processing (1)
  • .github/workflows/release.yml (5 hunks)
Additional comments (Suppressed): 4
.github/workflows/release.yml (4)
  • 3-5: The branch fix/Nitwork-tiks-and-enemydeath has been added to the list of branches that trigger the workflow. Ensure that this branch is intended to trigger the workflow and that it exists in the repository.

  • 8-11: The environment variable ZSTD_TAG has been added. This variable is used to specify the version of the Zstd library to be used in the project. Ensure that the version specified is compatible with the project.

  • 119-119: The zstd-to-tar job has been added as a dependency for the release-windows, release-linux, and release-macos jobs. This means that these jobs will not start until the zstd-to-tar job has completed successfully. Ensure that this dependency is necessary and that the zstd-to-tar job is guaranteed to complete successfully before these jobs are expected to start.

  • 225-259: A new job zstd-to-tar has been added. This job clones the Zstd repository, checks out the version specified by the ZSTD_TAG environment variable, creates a tar file of the repository, and uploads it to the release if the branch is main. Otherwise, it uploads the tar file as an artifact for later use. This job is necessary for the inclusion of the Zstd library in the project.


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 0

Details

Commits: files that changed from the base of the PR, between aaab9e4 and 920a0d9.
Files selected for processing (1)
  • .github/workflows/release.yml (5 hunks)
Additional comments (Suppressed): 4
.github/workflows/release.yml (4)
  • 3-5: The new branch feature/RB-143-compression-pack-network has been added to the list of branches that trigger the workflow. Ensure that this branch exists and is the correct one for this feature.

  • 11-11: A new environment variable ZSTD_TAG has been added. This is used to specify the version of the Zstandard library to be used. Ensure that the version "v1.5.5" is the correct and intended one.

  • 119-119: The zstd-to-tar job has been added as a dependency for the release-windows, release-linux, and release-macos jobs. This means these jobs will wait for the zstd-to-tar job to complete successfully before they start. Ensure that this is the intended behavior.

  • 225-259: A new job zstd-to-tar has been added. This job clones the Zstandard repository, creates a tar file of the repository, and uploads it to the release if the branch is main. Otherwise, it uploads the tar file as an artifact for later use. This job is necessary for the new feature of using the Zstandard library for network packet compression and decompression. The job seems to be correctly defined and follows the same pattern as the other similar jobs in the workflow.


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 0

Details

Commits: files that changed from the base of the PR, between 920a0d9 and 532ee84.
Files selected for processing (1)
  • .github/workflows/release.yml (1 hunks)
Files skipped from review due to trivial changes (1)
  • .github/workflows/release.yml


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 0

Details

Commits: files that changed from the base of the PR, between 532ee84 and 4bccb1a.
Files selected for processing (1)
  • deps/zstd/CMakeLists.txt (1 hunks)
Additional comments (Suppressed): 1
deps/zstd/CMakeLists.txt (1)
  • 1-28: The CMakeLists.txt file for the zstd library is well-structured and follows best practices for including external libraries in a CMake project. The library is fetched and built statically, which is a good choice for a network library to avoid potential issues with dynamic linking. The library is then linked to both the client and server projects, which is necessary for both to use the compression and decompression functions. The use of FetchContent_Declare and FetchContent_MakeAvailable is a good practice for fetching and making available the content at configure time. The URL for fetching the zstd library points to a tar file in the releases of the R-Bus project, which is a reasonable choice if the project maintains a version of the zstd library that is known to work with the project. However, it would be good to verify that the URL is correct and that the tar file is available at that location.


@coderabbitai coderabbitai bot left a comment

Review Status

Actionable comments generated: 2

Details

Commits: files that changed from the base of the PR, between 4bccb1a and c223425.
Files selected for processing (8)
  • .github/workflows/compil.yml (2 hunks)
  • deps/CMakeLists.txt (1 hunks)
  • deps/boost/CMakeLists.txt (1 hunks)
  • deps/date/CMakeLists.txt (1 hunks)
  • deps/json/CMakeLists.txt (1 hunks)
  • deps/raylib/CMakeLists.txt (1 hunks)
  • deps/zstd/CMakeLists.txt (1 hunks)
  • scripts/compil.sh (1 hunks)
Files skipped from review due to trivial changes (4)
  • .github/workflows/compil.yml
  • deps/CMakeLists.txt
  • deps/date/CMakeLists.txt
  • deps/raylib/CMakeLists.txt
Additional comments (Suppressed): 3
scripts/compil.sh (1)
  • 12-12: The addition of the --parallel flag to the cmake --build command is a good practice, as it enables parallel building, which can significantly speed up the build by using multiple processor cores. However, please ensure that all dependencies in the build are properly declared to avoid race conditions that could lead to inconsistent or failed builds.

- cmake --build build
+ cmake --build build --parallel

deps/json/CMakeLists.txt (1)
  • 7-7: The CMAKE_ARGS option is not a valid option for the FetchContent_Declare function. If you want to set CMake options for the json library, set the corresponding cache variables before calling FetchContent_MakeAvailable. Please verify this change.

deps/zstd/CMakeLists.txt (1)
  • 1-41: The CMakeLists.txt file for zstd looks good. It fetches the zstd library, builds it statically, and links it to both the client and server projects. FetchContent_Declare fetches the zstd library from a URL, and FetchContent_MakeAvailable makes it available at configure time. target_link_libraries links the zstd library to the client and server targets, and target_include_directories adds its include directories to them. set configures the zstd build options, and cmake_minimum_required specifies the minimum required CMake version.
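On the CMAKE_ARGS point above: FetchContent_Declare (unlike ExternalProject_Add) has no CMAKE_ARGS keyword; options of a fetched sub-project are set as ordinary cache variables before FetchContent_MakeAvailable. A hypothetical fragment for the json dependency (the URL/tag is illustrative; JSON_BuildTests is the option nlohmann/json documents for disabling its tests):

```cmake
include(FetchContent)

FetchContent_Declare(
    json
    URL https://github.com/nlohmann/json/archive/refs/tags/v3.11.2.tar.gz
)

# Options of the fetched project are plain cache variables, set *before*
# FetchContent_MakeAvailable -- there is no CMAKE_ARGS keyword here.
set(JSON_BuildTests OFF CACHE INTERNAL "")

FetchContent_MakeAvailable(json)
```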




@Saverio976 Saverio976 merged commit fc3ce4e into dev Oct 21, 2023
@Saverio976 Saverio976 deleted the feature/RB-143-compression-pack-network branch March 4, 2024 14:17