[
  {
    "path": ".gitignore",
    "content": "target/\nbin/\nnode_modules/\ntest-output/\nlicense/\ndependency-reduced-pom.xml\n.settings/\n.project\n.classpath\njavaClientId-tcplocalhost*\njavaSimpleEdgeNode-tcplocalhost*\n*.o\nsparkplug_example\n.DS_Store\n.pydevproject\n*.pyc\n*-tcp*\npom.xml.versionsBackup\n"
  },
  {
    "path": "LICENCE",
    "content": "Eclipse Public License - v 2.0\n\n    THE ACCOMPANYING PROGRAM IS PROVIDED UNDER THE TERMS OF THIS ECLIPSE\n    PUBLIC LICENSE (\"AGREEMENT\"). ANY USE, REPRODUCTION OR DISTRIBUTION\n    OF THE PROGRAM CONSTITUTES RECIPIENT'S ACCEPTANCE OF THIS AGREEMENT.\n\n1. DEFINITIONS\n\n\"Contribution\" means:\n\n  a) in the case of the initial Contributor, the initial content\n     Distributed under this Agreement, and\n\n  b) in the case of each subsequent Contributor:\n     i) changes to the Program, and\n     ii) additions to the Program;\n  where such changes and/or additions to the Program originate from\n  and are Distributed by that particular Contributor. A Contribution\n  \"originates\" from a Contributor if it was added to the Program by\n  such Contributor itself or anyone acting on such Contributor's behalf.\n  Contributions do not include changes or additions to the Program that\n  are not Modified Works.\n\n\"Contributor\" means any person or entity that Distributes the Program.\n\n\"Licensed Patents\" mean patent claims licensable by a Contributor which\nare necessarily infringed by the use or sale of its Contribution alone\nor when combined with the Program.\n\n\"Program\" means the Contributions Distributed in accordance with this\nAgreement.\n\n\"Recipient\" means anyone who receives the Program under this Agreement\nor any Secondary License (as applicable), including Contributors.\n\n\"Derivative Works\" shall mean any work, whether in Source Code or other\nform, that is based on (or derived from) the Program and for which the\neditorial revisions, annotations, elaborations, or other modifications\nrepresent, as a whole, an original work of authorship.\n\n\"Modified Works\" shall mean any work in Source Code or other form that\nresults from an addition to, deletion from, or modification of the\ncontents of the Program, including, for purposes of clarity any new file\nin Source Code form that contains any contents of the Program. 
Modified\nWorks shall not include works that contain only declarations,\ninterfaces, types, classes, structures, or files of the Program solely\nin each case in order to link to, bind by name, or subclass the Program\nor Modified Works thereof.\n\n\"Distribute\" means the acts of a) distributing or b) making available\nin any manner that enables the transfer of a copy.\n\n\"Source Code\" means the form of a Program preferred for making\nmodifications, including but not limited to software source code,\ndocumentation source, and configuration files.\n\n\"Secondary License\" means either the GNU General Public License,\nVersion 2.0, or any later versions of that license, including any\nexceptions or additional permissions as identified by the initial\nContributor.\n\n2. GRANT OF RIGHTS\n\n  a) Subject to the terms of this Agreement, each Contributor hereby\n  grants Recipient a non-exclusive, worldwide, royalty-free copyright\n  license to reproduce, prepare Derivative Works of, publicly display,\n  publicly perform, Distribute and sublicense the Contribution of such\n  Contributor, if any, and such Derivative Works.\n\n  b) Subject to the terms of this Agreement, each Contributor hereby\n  grants Recipient a non-exclusive, worldwide, royalty-free patent\n  license under Licensed Patents to make, use, sell, offer to sell,\n  import and otherwise transfer the Contribution of such Contributor,\n  if any, in Source Code or other form. This patent license shall\n  apply to the combination of the Contribution and the Program if, at\n  the time the Contribution is added by the Contributor, such addition\n  of the Contribution causes such combination to be covered by the\n  Licensed Patents. The patent license shall not apply to any other\n  combinations which include the Contribution. 
No hardware per se is\n  licensed hereunder.\n\n  c) Recipient understands that although each Contributor grants the\n  licenses to its Contributions set forth herein, no assurances are\n  provided by any Contributor that the Program does not infringe the\n  patent or other intellectual property rights of any other entity.\n  Each Contributor disclaims any liability to Recipient for claims\n  brought by any other entity based on infringement of intellectual\n  property rights or otherwise. As a condition to exercising the\n  rights and licenses granted hereunder, each Recipient hereby\n  assumes sole responsibility to secure any other intellectual\n  property rights needed, if any. For example, if a third party\n  patent license is required to allow Recipient to Distribute the\n  Program, it is Recipient's responsibility to acquire that license\n  before distributing the Program.\n\n  d) Each Contributor represents that to its knowledge it has\n  sufficient copyright rights in its Contribution, if any, to grant\n  the copyright license set forth in this Agreement.\n\n  e) Notwithstanding the terms of any Secondary License, no\n  Contributor makes additional grants to any Recipient (other than\n  those set forth in this Agreement) as a result of such Recipient's\n  receipt of the Program under the terms of a Secondary License\n  (if permitted under the terms of Section 3).\n\n3. 
REQUIREMENTS\n\n3.1 If a Contributor Distributes the Program in any form, then:\n\n  a) the Program must also be made available as Source Code, in\n  accordance with section 3.2, and the Contributor must accompany\n  the Program with a statement that the Source Code for the Program\n  is available under this Agreement, and informs Recipients how to\n  obtain it in a reasonable manner on or through a medium customarily\n  used for software exchange; and\n\n  b) the Contributor may Distribute the Program under a license\n  different than this Agreement, provided that such license:\n     i) effectively disclaims on behalf of all other Contributors all\n     warranties and conditions, express and implied, including\n     warranties or conditions of title and non-infringement, and\n     implied warranties or conditions of merchantability and fitness\n     for a particular purpose;\n\n     ii) effectively excludes on behalf of all other Contributors all\n     liability for damages, including direct, indirect, special,\n     incidental and consequential damages, such as lost profits;\n\n     iii) does not attempt to limit or alter the recipients' rights\n     in the Source Code under section 3.2; and\n\n     iv) requires any subsequent distribution of the Program by any\n     party to be under a license that satisfies the requirements\n     of this section 3.\n\n3.2 When the Program is Distributed as Source Code:\n\n  a) it must be made available under this Agreement, or if the\n  Program (i) is combined with other material in a separate file or\n  files made available under a Secondary License, and (ii) the initial\n  Contributor attached to the Source Code the notice described in\n  Exhibit A of this Agreement, then the Program may be made available\n  under the terms of such Secondary Licenses, and\n\n  b) a copy of this Agreement must be included with each copy of\n  the Program.\n\n3.3 Contributors may not remove or alter any copyright, patent,\ntrademark, 
attribution notices, disclaimers of warranty, or limitations\nof liability (\"notices\") contained within the Program from any copy of\nthe Program which they Distribute, provided that Contributors may add\ntheir own appropriate notices.\n\n4. COMMERCIAL DISTRIBUTION\n\nCommercial distributors of software may accept certain responsibilities\nwith respect to end users, business partners and the like. While this\nlicense is intended to facilitate the commercial use of the Program,\nthe Contributor who includes the Program in a commercial product\noffering should do so in a manner which does not create potential\nliability for other Contributors. Therefore, if a Contributor includes\nthe Program in a commercial product offering, such Contributor\n(\"Commercial Contributor\") hereby agrees to defend and indemnify every\nother Contributor (\"Indemnified Contributor\") against any losses,\ndamages and costs (collectively \"Losses\") arising from claims, lawsuits\nand other legal actions brought by a third party against the Indemnified\nContributor to the extent caused by the acts or omissions of such\nCommercial Contributor in connection with its distribution of the Program\nin a commercial product offering. The obligations in this section do not\napply to any claims or Losses relating to any actual or alleged\nintellectual property infringement. In order to qualify, an Indemnified\nContributor must: a) promptly notify the Commercial Contributor in\nwriting of such claim, and b) allow the Commercial Contributor to control,\nand cooperate with the Commercial Contributor in, the defense and any\nrelated settlement negotiations. The Indemnified Contributor may\nparticipate in any such claim at its own expense.\n\nFor example, a Contributor might include the Program in a commercial\nproduct offering, Product X. That Contributor is then a Commercial\nContributor. 
If that Commercial Contributor then makes performance\nclaims, or offers warranties related to Product X, those performance\nclaims and warranties are such Commercial Contributor's responsibility\nalone. Under this section, the Commercial Contributor would have to\ndefend claims against the other Contributors related to those performance\nclaims and warranties, and if a court requires any other Contributor to\npay any damages as a result, the Commercial Contributor must pay\nthose damages.\n\n5. NO WARRANTY\n\nEXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, AND TO THE EXTENT\nPERMITTED BY APPLICABLE LAW, THE PROGRAM IS PROVIDED ON AN \"AS IS\"\nBASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR\nIMPLIED INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF\nTITLE, NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A PARTICULAR\nPURPOSE. Each Recipient is solely responsible for determining the\nappropriateness of using and distributing the Program and assumes all\nrisks associated with its exercise of rights under this Agreement,\nincluding but not limited to the risks and costs of program errors,\ncompliance with applicable laws, damage to or loss of data, programs\nor equipment, and unavailability or interruption of operations.\n\n6. DISCLAIMER OF LIABILITY\n\nEXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, AND TO THE EXTENT\nPERMITTED BY APPLICABLE LAW, NEITHER RECIPIENT NOR ANY CONTRIBUTORS\nSHALL HAVE ANY LIABILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,\nEXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING WITHOUT LIMITATION LOST\nPROFITS), HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN\nCONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)\nARISING IN ANY WAY OUT OF THE USE OR DISTRIBUTION OF THE PROGRAM OR THE\nEXERCISE OF ANY RIGHTS GRANTED HEREUNDER, EVEN IF ADVISED OF THE\nPOSSIBILITY OF SUCH DAMAGES.\n\n7. 
GENERAL\n\nIf any provision of this Agreement is invalid or unenforceable under\napplicable law, it shall not affect the validity or enforceability of\nthe remainder of the terms of this Agreement, and without further\naction by the parties hereto, such provision shall be reformed to the\nminimum extent necessary to make such provision valid and enforceable.\n\nIf Recipient institutes patent litigation against any entity\n(including a cross-claim or counterclaim in a lawsuit) alleging that the\nProgram itself (excluding combinations of the Program with other software\nor hardware) infringes such Recipient's patent(s), then such Recipient's\nrights granted under Section 2(b) shall terminate as of the date such\nlitigation is filed.\n\nAll Recipient's rights under this Agreement shall terminate if it\nfails to comply with any of the material terms or conditions of this\nAgreement and does not cure such failure in a reasonable period of\ntime after becoming aware of such noncompliance. If all Recipient's\nrights under this Agreement terminate, Recipient agrees to cease use\nand distribution of the Program as soon as reasonably practicable.\nHowever, Recipient's obligations under this Agreement and any licenses\ngranted by Recipient relating to the Program shall continue and survive.\n\nEveryone is permitted to copy and distribute copies of this Agreement,\nbut in order to avoid inconsistency the Agreement is copyrighted and\nmay only be modified in the following manner. The Agreement Steward\nreserves the right to publish new versions (including revisions) of\nthis Agreement from time to time. No one other than the Agreement\nSteward has the right to modify this Agreement. The Eclipse Foundation\nis the initial Agreement Steward. The Eclipse Foundation may assign the\nresponsibility to serve as the Agreement Steward to a suitable separate\nentity. Each new version of the Agreement will be given a distinguishing\nversion number. 
The Program (including Contributions) may always be\nDistributed subject to the version of the Agreement under which it was\nreceived. In addition, after a new version of the Agreement is published,\nContributor may elect to Distribute the Program (including its\nContributions) under the new version.\n\nExcept as expressly stated in Sections 2(a) and 2(b) above, Recipient\nreceives no rights or licenses to the intellectual property of any\nContributor under this Agreement, whether expressly, by implication,\nestoppel or otherwise. All rights in the Program not expressly granted\nunder this Agreement are reserved. Nothing in this Agreement is intended\nto be enforceable by any entity that is not a Contributor or Recipient.\nNo third-party beneficiary rights are created under this Agreement.\n\nExhibit A - Form of Secondary Licenses Notice\n\n\"This Source Code may also be made available under the following \nSecondary Licenses when the conditions for such availability set forth \nin the Eclipse Public License, v. 2.0 are satisfied: {name license(s),\nversion(s), and exceptions or additional permissions here}.\"\n\n  Simply including a copy of this Agreement, including this Exhibit A\n  is not sufficient to license the Source Code under Secondary Licenses.\n\n  If it is not possible or desirable to put the notice in a particular\n  file, then You may include the notice in a location (such as a LICENSE\n  file in a relevant directory) where a recipient would be likely to\n  look for such a notice.\n\n  You may add additional accurate notices of copyright ownership.\n"
  },
  {
    "path": "README.md",
    "content": "# Eclipse Tahu\n\nEclipse Tahu provides client libraries and reference implementations in various languages and for various devices\nto show how the device/remote application must connect and disconnect from the MQTT server using the Sparkplug\nspecification explained below.  This includes device lifecycle messages such as the required birth and last will &\ntestament messages that must be sent to ensure the device lifecycle state and data integrity.\n\n# Sparkplug\n\nSparkplug®, Sparkplug Compatible, and the Sparkplug Logo are trademarks of the Eclipse Foundation.\n\nSparkplug is a specification for MQTT-enabled devices and applications to send and receive messages in a stateful way.\nWhile MQTT is stateful by nature, it doesn't ensure that all data on a receiving MQTT application is current or valid.\nSparkplug provides a mechanism for ensuring that remote device or application data is current and valid.\n\nSparkplug A was the original version of the Sparkplug specification and used Eclipse Kura's protobuf definition for\npayload encoding.  However, it was quickly determined that this definition was too limited to handle the metadata that\ntypical Sparkplug payloads require.  As a result, Sparkplug B was developed to add features and capabilities\nthat were not possible in the original Kura payload definition.  These features include:\n* Complex data types using templates\n* Datasets\n* Richer metrics with the ability to add property metadata for each metric\n* Metric alias support to maintain rich metric naming while keeping bandwidth usage to a minimum\n* Historical data\n* File data\n\nSparkplug Specification v3.0.0: https://www.eclipse.org/tahu/spec/sparkplug_spec.pdf\nEclipse Sparkplug Project: https://projects.eclipse.org/projects/iot.sparkplug\nEclipse Sparkplug & TCK GitHub Repository: https://github.com/eclipse-sparkplug/sparkplug\n\n# Contributing\nContributing to the Sparkplug Tahu Project is easy and contributions are welcome. In order to submit a pull request (PR), you must follow these steps; failure to follow them will likely lead to the PR being rejected.\n1. Sign the Eclipse Contributor Agreement (ECA): https://accounts.eclipse.org/user/eca\n2. Make sure the email tied to your GitHub account is the same one you used to sign the ECA.\n3. Submit your PR against the develop branch of the repository. PRs against master will not be accepted: https://github.com/eclipse/sparkplug/tree/develop\n4. Sign off on your commits using the '-s' flag. For example: 'git commit -s -m \"My brief comment\" ChangedFile'\n5. Make sure to include any important context or information associated with the PR in the PR submission. Keep your commit comment brief.\n"
  },
  {
    "path": "RELEASE_NOTES.md",
    "content": "# Eclipse Tahu v1.0.0\n\n* Initial complete Java-based Sparkplug v3.0.0 compatible implementation\n* Java-based library for simple creation of both Sparkplug Edge Nodes and Host Applications\n* Partial example implementations exist for C, C#, JavaScript, Node-RED, and Python\n"
  },
  {
    "path": "about.html",
    "content": "<!DOCTYPE html>\n<html>\n<head>\n    <meta http-equiv=\"Content-Type\" content=\"text/html; charset=ISO-8859-1\"/>\n    <title>About</title>\n</head>\n<body lang=\"EN-US\">\n<h2>About This Content</h2>\n\n<p>January 24, 2014</p>\n<h3>License</h3>\n\n<p>The Eclipse Foundation makes available all content in this plug-in (&ldquo;Content&rdquo;). Unless otherwise\n    indicated below, the Content is provided to you under the terms and conditions of the\n    <a href=\"http://www.eclipse.org/legal/epl-v20.html\">Eclipse Public License Version 2.0 (&ldquo;EPL&rdquo;)</a>\n    and <a href=\"http://www.opensource.org/licenses/apache2.0.php\">Apache License Version 2.0</a>.\n    A copy of the EPL is available at <a href=\"http://www.eclipse.org/legal/epl-v20.html\">http://www.eclipse.org/legal/epl-v20.html</a>\n    and a copy of the Apache License Version 2.0 is available at <a\n        href=\"http://www.opensource.org/licenses/apache2.0.php\">http://www.opensource.org/licenses/apache2.0.php</a>.\n    You may elect to redistribute this code under either of these licenses.\n    For purposes of the EPL, &ldquo;Program&rdquo; will mean the Content.\n</p>\n\n<p>\n    If you did not receive this Content directly from the Eclipse Foundation, the Content is\n    being redistributed by another party (&ldquo;Redistributor&rdquo;) and different terms and conditions may\n    apply to your use of any object code in the Content. Check the Redistributor&rsquo;s license that was\n    provided with the Content. If no such license exists, contact the Redistributor. Unless otherwise\n    indicated below, the terms and conditions of the EPL and Apache License 2.0 still apply to any source code\n    in the Content and such source code may be obtained at <a href=\"http://www.eclipse.org\">http://www.eclipse.org</a>.\n</p>\n\n</body>\n</html>\n"
  },
  {
    "path": "c/core/Makefile",
    "content": "#/********************************************************************************\n# * Copyright (c) 2014-2019 Cirrus Link Solutions and others\n# *\n# * This program and the accompanying materials are made available under the\n# * terms of the Eclipse Public License 2.0 which is available at\n# * http://www.eclipse.org/legal/epl-2.0.\n# *\n# * SPDX-License-Identifier: EPL-2.0\n# *\n# * Contributors:\n# *   Cirrus Link Solutions - initial implementation\n# ********************************************************************************/\n\nCC       :=  gcc\nNAME     :=  tahu\nSNAME    :=  lib/lib$(NAME).a\nDNAME    :=  lib/lib$(NAME).so\nSRC      :=  $(wildcard src/*.c)\nOBJ      :=  $(SRC:.c=.o)\n#CFLAGS   :=  -Wall -g3 -fPIC -Iinclude/\nCFLAGS   :=  -g -g3 -fPIC -Iinclude/\nLDFLAGS  :=  -L.\n#LDLIBS  :=  -l$(...)\n\nTEST     :=  test\nTEST_OBJ := test/test.c\nLD_TEST  := -Llib -L/usr/local/lib -l$(NAME)\n\n.PHONY: all clean test re\n\nall: $(SNAME) $(DNAME) $(TEST)\n\n$(SNAME): $(OBJ)\n\tmkdir -p lib\n\t$(AR) $(ARFLAGS) $@ $^\n\n$(DNAME): LDFLAGS += -shared\n$(DNAME): $(OBJ)\n\tmkdir -p lib\n\t$(CC) $(LDFLAGS) -o $@ $^ $(LDLIBS)\n\n$(TEST): $(SNAME) $(DNAME)\n\t$(CC) $(CFLAGS) -o test/test_static $(TEST_OBJ) $(SNAME) -lmosquitto\n\t$(CC) $(CFLAGS) $(LD_TEST) -o test/test_dynamic $(TEST_OBJ) -lmosquitto\n\nclean:\n\t$(RM) $(OBJ)\n\t$(RM) $(SNAME) $(DNAME)\n\t$(RM) test/test_static test/test_dynamic\n\nre: clean all\n"
  },
  {
    "path": "c/core/include/pb.h",
    "content": "/* Common parts of the nanopb library. Most of these are quite low-level\n * stuff. For the high-level interface, see pb_encode.h and pb_decode.h.\n */\n\n#ifndef PB_H_INCLUDED\n#define PB_H_INCLUDED\n\n/*****************************************************************\n * Nanopb compilation time options. You can change these here by *\n * uncommenting the lines, or on the compiler command line.      *\n *****************************************************************/\n\n/* Enable support for dynamically allocated fields */\n#define PB_ENABLE_MALLOC 1\n\n/* Define this if your CPU / compiler combination does not support\n * unaligned memory access to packed structures. */\n/* #define PB_NO_PACKED_STRUCTS 1 */\n\n/* Increase the number of required fields that are tracked.\n * A compiler warning will tell you if you need this. */\n/* #define PB_MAX_REQUIRED_FIELDS 256 */\n\n/* Add support for tag numbers > 65536 and fields larger than 65536 bytes. */\n#define PB_FIELD_32BIT 1\n\n/* Disable support for error messages in order to save some code space. */\n/* #define PB_NO_ERRMSG 1 */\n\n/* Disable support for custom streams (support only memory buffers). */\n/* #define PB_BUFFER_ONLY 1 */\n\n/* Disable support for 64-bit datatypes, for compilers without int64_t\n   or to save some code space. */\n/* #define PB_WITHOUT_64BIT 1 */\n\n/* Don't encode scalar arrays as packed. This is only to be used when\n * the decoder on the receiving side cannot process packed scalar arrays.\n * One such example is older protobuf.js. */\n/* #define PB_ENCODE_ARRAYS_UNPACKED 1 */\n\n/* Enable conversion of doubles to floats for platforms that do not\n * support 64-bit doubles. Most commonly AVR. */\n/* #define PB_CONVERT_DOUBLE_FLOAT 1 */\n\n/* Check whether incoming strings are valid UTF-8 sequences. Slows down\n * the string processing slightly and slightly increases code size. 
*/\n/* #define PB_VALIDATE_UTF8 1 */\n\n/******************************************************************\n * You usually don't need to change anything below this line.     *\n * Feel free to look around and use the defined macros, though.   *\n ******************************************************************/\n\n\n/* Version of the nanopb library. Just in case you want to check it in\n * your own program. */\n#define NANOPB_VERSION nanopb-0.4.1\n\n/* Include all the system headers needed by nanopb. You will need the\n * definitions of the following:\n * - strlen, memcpy, memset functions\n * - [u]int_least8_t, uint_fast8_t, [u]int_least16_t, [u]int32_t, [u]int64_t\n * - size_t\n * - bool\n *\n * If you don't have the standard header files, you can instead provide\n * a custom header that defines or includes all this. In that case,\n * define PB_SYSTEM_HEADER to the path of this file.\n */\n#ifdef PB_SYSTEM_HEADER\n#include PB_SYSTEM_HEADER\n#else\n#include <stdint.h>\n#include <stddef.h>\n#include <stdbool.h>\n#include <string.h>\n#include <limits.h>\n\n#ifdef PB_ENABLE_MALLOC\n#include <stdlib.h>\n#endif\n#endif\n\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n/* Macro for defining packed structures (compiler dependent).\n * This just reduces memory requirements, but is not required.\n */\n#if defined(PB_NO_PACKED_STRUCTS)\n    /* Disable struct packing */\n#   define PB_PACKED_STRUCT_START\n#   define PB_PACKED_STRUCT_END\n#   define pb_packed\n#elif defined(__GNUC__) || defined(__clang__)\n    /* For GCC and clang */\n#   define PB_PACKED_STRUCT_START\n#   define PB_PACKED_STRUCT_END\n#   define pb_packed __attribute__((packed))\n#elif defined(__ICCARM__) || defined(__CC_ARM)\n    /* For IAR ARM and Keil MDK-ARM compilers */\n#   define PB_PACKED_STRUCT_START _Pragma(\"pack(push, 1)\")\n#   define PB_PACKED_STRUCT_END _Pragma(\"pack(pop)\")\n#   define pb_packed\n#elif defined(_MSC_VER) && (_MSC_VER >= 1500)\n    /* For Microsoft Visual C++ */\n#   define 
PB_PACKED_STRUCT_START __pragma(pack(push, 1))\n#   define PB_PACKED_STRUCT_END __pragma(pack(pop))\n#   define pb_packed\n#else\n    /* Unknown compiler */\n#   define PB_PACKED_STRUCT_START\n#   define PB_PACKED_STRUCT_END\n#   define pb_packed\n#endif\n\n/* Handy macro for suppressing unreferenced-parameter compiler warnings. */\n#ifndef PB_UNUSED\n#define PB_UNUSED(x) (void)(x)\n#endif\n\n/* Harvard-architecture processors may need special attributes for storing\n * field information in program memory. */\n#ifndef PB_PROGMEM\n#ifdef __AVR__\n#include <avr/pgmspace.h>\n#define PB_PROGMEM             PROGMEM\n#define PB_PROGMEM_READU32(x)  pgm_read_dword(&x)\n#else\n#define PB_PROGMEM\n#define PB_PROGMEM_READU32(x)  (x)\n#endif\n#endif\n\n/* Compile-time assertion, used for checking compatible compilation options.\n * If this does not work properly on your compiler, use\n * #define PB_NO_STATIC_ASSERT to disable it.\n *\n * But before doing that, check carefully the error message / place where it\n * comes from to see if the error has a real cause. 
Unfortunately the error\n * message is not always very clear to read, but you can see the reason better\n * in the place where the PB_STATIC_ASSERT macro was called.\n */\n#ifndef PB_NO_STATIC_ASSERT\n#  ifndef PB_STATIC_ASSERT\n#    if defined(__STDC_VERSION__) && __STDC_VERSION__ >= 201112L\n       /* C11 standard _Static_assert mechanism */\n#      define PB_STATIC_ASSERT(COND,MSG) _Static_assert(COND,#MSG);\n#    else\n       /* Classic negative-size-array static assert mechanism */\n#      define PB_STATIC_ASSERT(COND,MSG) typedef char PB_STATIC_ASSERT_MSG(MSG, __LINE__, __COUNTER__)[(COND)?1:-1];\n#      define PB_STATIC_ASSERT_MSG(MSG, LINE, COUNTER) PB_STATIC_ASSERT_MSG_(MSG, LINE, COUNTER)\n#      define PB_STATIC_ASSERT_MSG_(MSG, LINE, COUNTER) pb_static_assertion_##MSG##_##LINE##_##COUNTER\n#    endif\n#  endif\n#else\n   /* Static asserts disabled by PB_NO_STATIC_ASSERT */\n#  define PB_STATIC_ASSERT(COND,MSG)\n#endif\n\n/* Number of required fields to keep track of. */\n#ifndef PB_MAX_REQUIRED_FIELDS\n#define PB_MAX_REQUIRED_FIELDS 64\n#endif\n\n#if PB_MAX_REQUIRED_FIELDS < 64\n#error You should not lower PB_MAX_REQUIRED_FIELDS from the default value (64).\n#endif\n\n#ifdef PB_WITHOUT_64BIT\n#ifdef PB_CONVERT_DOUBLE_FLOAT\n/* Cannot use doubles without 64-bit types */\n#undef PB_CONVERT_DOUBLE_FLOAT\n#endif\n#endif\n\n/* List of possible field types. 
These are used in the autogenerated code.\n * Least-significant 4 bits tell the scalar type\n * Most-significant 4 bits specify repeated/required/packed etc.\n */\n\ntypedef uint_least8_t pb_type_t;\n\n/**** Field data types ****/\n\n/* Numeric types */\n#define PB_LTYPE_BOOL    0x00U /* bool */\n#define PB_LTYPE_VARINT  0x01U /* int32, int64, enum, bool */\n#define PB_LTYPE_UVARINT 0x02U /* uint32, uint64 */\n#define PB_LTYPE_SVARINT 0x03U /* sint32, sint64 */\n#define PB_LTYPE_FIXED32 0x04U /* fixed32, sfixed32, float */\n#define PB_LTYPE_FIXED64 0x05U /* fixed64, sfixed64, double */\n\n/* Marker for last packable field type. */\n#define PB_LTYPE_LAST_PACKABLE 0x05U\n\n/* Byte array with pre-allocated buffer.\n * data_size is the length of the allocated PB_BYTES_ARRAY structure. */\n#define PB_LTYPE_BYTES 0x06U\n\n/* String with pre-allocated buffer.\n * data_size is the maximum length. */\n#define PB_LTYPE_STRING 0x07U\n\n/* Submessage\n * submsg_fields is pointer to field descriptions */\n#define PB_LTYPE_SUBMESSAGE 0x08U\n\n/* Submessage with pre-decoding callback\n * The pre-decoding callback is stored as pb_callback_t right before pSize.\n * submsg_fields is pointer to field descriptions */\n#define PB_LTYPE_SUBMSG_W_CB 0x09U\n\n/* Extension pseudo-field\n * The field contains a pointer to pb_extension_t */\n#define PB_LTYPE_EXTENSION 0x0AU\n\n/* Byte array with inline, pre-allocated buffer.\n * data_size is the length of the inline, allocated buffer.\n * This differs from PB_LTYPE_BYTES by defining the element as\n * pb_byte_t[data_size] rather than pb_bytes_array_t. 
*/\n#define PB_LTYPE_FIXED_LENGTH_BYTES 0x0BU\n\n/* Number of declared LTYPES */\n#define PB_LTYPES_COUNT 0x0CU\n#define PB_LTYPE_MASK 0x0FU\n\n/**** Field repetition rules ****/\n\n#define PB_HTYPE_REQUIRED 0x00U\n#define PB_HTYPE_OPTIONAL 0x10U\n#define PB_HTYPE_SINGULAR 0x10U\n#define PB_HTYPE_REPEATED 0x20U\n#define PB_HTYPE_FIXARRAY 0x20U\n#define PB_HTYPE_ONEOF    0x30U\n#define PB_HTYPE_MASK     0x30U\n\n/**** Field allocation types ****/\n \n#define PB_ATYPE_STATIC   0x00U\n#define PB_ATYPE_POINTER  0x80U\n#define PB_ATYPE_CALLBACK 0x40U\n#define PB_ATYPE_MASK     0xC0U\n\n#define PB_ATYPE(x) ((x) & PB_ATYPE_MASK)\n#define PB_HTYPE(x) ((x) & PB_HTYPE_MASK)\n#define PB_LTYPE(x) ((x) & PB_LTYPE_MASK)\n#define PB_LTYPE_IS_SUBMSG(x) (PB_LTYPE(x) == PB_LTYPE_SUBMESSAGE || \\\n                               PB_LTYPE(x) == PB_LTYPE_SUBMSG_W_CB)\n\n/* Data type used for storing sizes of struct fields\n * and array counts.\n */\n#if defined(PB_FIELD_32BIT)\n    typedef uint32_t pb_size_t;\n    typedef int32_t pb_ssize_t;\n#else\n    typedef uint_least16_t pb_size_t;\n    typedef int_least16_t pb_ssize_t;\n#endif\n#define PB_SIZE_MAX ((pb_size_t)-1)\n\n/* Data type for storing encoded data and other byte streams.\n * This typedef exists to support platforms where uint8_t does not exist.\n * You can regard it as equivalent to uint8_t on other platforms.\n */\ntypedef uint_least8_t pb_byte_t;\n\n/* Forward declaration of struct types */\ntypedef struct pb_istream_s pb_istream_t;\ntypedef struct pb_ostream_s pb_ostream_t;\ntypedef struct pb_field_iter_s pb_field_iter_t;\n\n/* This structure is used in auto-generated constants\n * to specify struct fields.\n */\nPB_PACKED_STRUCT_START\ntypedef struct pb_msgdesc_s pb_msgdesc_t;\nstruct pb_msgdesc_s {\n    pb_size_t field_count;\n    const uint32_t *field_info;\n    const pb_msgdesc_t * const * submsg_info;\n    const pb_byte_t *default_value;\n\n    bool (*field_callback)(pb_istream_t *istream, pb_ostream_t *ostream, 
const pb_field_iter_t *field);\n} pb_packed;\nPB_PACKED_STRUCT_END\n\n/* Iterator for message descriptor */\nstruct pb_field_iter_s {\n    const pb_msgdesc_t *descriptor;  /* Pointer to message descriptor constant */\n    void *message;                   /* Pointer to start of the structure */\n\n    pb_size_t index;                 /* Index of the field */\n    pb_size_t field_info_index;      /* Index to descriptor->field_info array */\n    pb_size_t required_field_index;  /* Index that counts only the required fields */\n    pb_size_t submessage_index;      /* Index that counts only submessages */\n\n    pb_size_t tag;                   /* Tag of current field */\n    pb_size_t data_size;             /* sizeof() of a single item */\n    pb_size_t array_size;            /* Number of array entries */\n    pb_type_t type;                  /* Type of current field */\n\n    void *pField;                    /* Pointer to current field in struct */\n    void *pData;                     /* Pointer to current data contents. Different than pField for arrays and pointers. */\n    void *pSize;                     /* Pointer to count/has field */\n\n    const pb_msgdesc_t *submsg_desc; /* For submessage fields, pointer to field descriptor for the submessage. 
*/\n};\n\n/* For compatibility with legacy code */\ntypedef pb_field_iter_t pb_field_t;\n\n/* Make sure that the standard integer types are of the expected sizes.\n * Otherwise fixed32/fixed64 fields can break.\n *\n * If you get errors here, it probably means that your stdint.h is not\n * correct for your platform.\n */\n#ifndef PB_WITHOUT_64BIT\nPB_STATIC_ASSERT(sizeof(int64_t) == 2 * sizeof(int32_t), INT64_T_WRONG_SIZE)\nPB_STATIC_ASSERT(sizeof(uint64_t) == 2 * sizeof(uint32_t), UINT64_T_WRONG_SIZE)\n#endif\n\n/* This structure is used for 'bytes' arrays.\n * It has the number of bytes in the beginning, and after that an array.\n * Note that actual structs used will have a different length of bytes array.\n */\n#define PB_BYTES_ARRAY_T(n) struct { pb_size_t size; pb_byte_t bytes[n]; }\n#define PB_BYTES_ARRAY_T_ALLOCSIZE(n) ((size_t)n + offsetof(pb_bytes_array_t, bytes))\n\nstruct pb_bytes_array_s {\n    pb_size_t size;\n    pb_byte_t bytes[1];\n};\ntypedef struct pb_bytes_array_s pb_bytes_array_t;\n\n/* This structure is used for giving the callback function.\n * It is stored in the message structure and filled in by the method that\n * calls pb_decode.\n *\n * The decoding callback will be given a limited-length stream.\n * If the wire type was string, the length is the length of the string.\n * If the wire type was a varint/fixed32/fixed64, the length is the length\n * of the actual value.\n * The function may be called multiple times (especially for repeated types,\n * but also otherwise if the message happens to contain the field multiple\n * times).\n *\n * The encoding callback will receive the actual output stream.\n * It should write all the data in one call, including the field tag and\n * wire type. 
It can write multiple fields.\n *\n * The callback can be null if you want to skip a field.\n */\ntypedef struct pb_callback_s pb_callback_t;\nstruct pb_callback_s {\n    /* Callback functions receive a pointer to the arg field.\n     * You can access the value of the field as *arg, and modify it if needed.\n     */\n    union {\n        bool (*decode)(pb_istream_t *stream, const pb_field_t *field, void **arg);\n        bool (*encode)(pb_ostream_t *stream, const pb_field_t *field, void * const *arg);\n    } funcs;\n    \n    /* Free arg for use by callback */\n    void *arg;\n};\n\nextern bool pb_default_field_callback(pb_istream_t *istream, pb_ostream_t *ostream, const pb_field_t *field);\n\n/* Wire types. Library user needs these only in encoder callbacks. */\ntypedef enum {\n    PB_WT_VARINT = 0,\n    PB_WT_64BIT  = 1,\n    PB_WT_STRING = 2,\n    PB_WT_32BIT  = 5\n} pb_wire_type_t;\n\n/* Structure for defining the handling of unknown/extension fields.\n * Usually the pb_extension_type_t structure is automatically generated,\n * while the pb_extension_t structure is created by the user. 
However,\n * if you want to catch all unknown fields, you can also create a custom\n * pb_extension_type_t with your own callback.\n */\ntypedef struct pb_extension_type_s pb_extension_type_t;\ntypedef struct pb_extension_s pb_extension_t;\nstruct pb_extension_type_s {\n    /* Called for each unknown field in the message.\n     * If you handle the field, read off all of its data and return true.\n     * If you do not handle the field, do not read anything and return true.\n     * If you run into an error, return false.\n     * Set to NULL for default handler.\n     */\n    bool (*decode)(pb_istream_t *stream, pb_extension_t *extension,\n                   uint32_t tag, pb_wire_type_t wire_type);\n    \n    /* Called once after all regular fields have been encoded.\n     * If you have something to write, do so and return true.\n     * If you do not have anything to write, just return true.\n     * If you run into an error, return false.\n     * Set to NULL for default handler.\n     */\n    bool (*encode)(pb_ostream_t *stream, const pb_extension_t *extension);\n    \n    /* Free field for use by the callback. */\n    const void *arg;\n};\n\nstruct pb_extension_s {\n    /* Type describing the extension field. Usually you'll initialize\n     * this to a pointer to the automatically generated structure. */\n    const pb_extension_type_t *type;\n    \n    /* Destination for the decoded data. This must match the datatype\n     * of the extension field. */\n    void *dest;\n    \n    /* Pointer to the next extension handler, or NULL.\n     * If this extension does not match a field, the next handler is\n     * automatically called. */\n    pb_extension_t *next;\n\n    /* The decoder sets this to true if the extension was found.\n     * Ignored for encoding. */\n    bool found;\n};\n\n#define pb_extension_init_zero {NULL,NULL,NULL,false}\n\n/* Memory allocation functions to use. You can define pb_realloc and\n * pb_free to custom functions if you want. 
*/\n#ifdef PB_ENABLE_MALLOC\n#   ifndef pb_realloc\n#       define pb_realloc(ptr, size) realloc(ptr, size)\n#   endif\n#   ifndef pb_free\n#       define pb_free(ptr) free(ptr)\n#   endif\n#endif\n\n/* This is used to inform about need to regenerate .pb.h/.pb.c files. */\n#define PB_PROTO_HEADER_VERSION 40\n\n/* These macros are used to declare pb_field_t's in the constant array. */\n/* Size of a structure member, in bytes. */\n#define pb_membersize(st, m) (sizeof ((st*)0)->m)\n/* Number of entries in an array. */\n#define pb_arraysize(st, m) (pb_membersize(st, m) / pb_membersize(st, m[0]))\n/* Delta from start of one member to the start of another member. */\n#define pb_delta(st, m1, m2) ((int)offsetof(st, m1) - (int)offsetof(st, m2))\n\n/* Force expansion of macro value */\n#define PB_EXPAND(x) x\n\n/* Binding of a message field set into a specific structure */\n#define PB_BIND(msgname, structname, width) \\\n    const uint32_t structname ## _field_info[] PB_PROGMEM = \\\n    { \\\n        msgname ## _FIELDLIST(PB_GEN_FIELD_INFO_ ## width, structname) \\\n        0 \\\n    }; \\\n    const pb_msgdesc_t* const structname ## _submsg_info[] = \\\n    { \\\n        msgname ## _FIELDLIST(PB_GEN_SUBMSG_INFO, structname) \\\n        NULL \\\n    }; \\\n    const pb_msgdesc_t structname ## _msg = \\\n    { \\\n       0 msgname ## _FIELDLIST(PB_GEN_FIELD_COUNT, structname), \\\n       structname ## _field_info, \\\n       structname ## _submsg_info, \\\n       msgname ## _DEFAULT, \\\n       msgname ## _CALLBACK, \\\n    }; \\\n    msgname ## _FIELDLIST(PB_GEN_FIELD_INFO_ASSERT_ ## width, structname)\n\n#define PB_GEN_FIELD_COUNT(structname, atype, htype, ltype, fieldname, tag) +1\n\n#define PB_GEN_FIELD_INFO_1(structname, atype, htype, ltype, fieldname, tag) \\\n    PB_GEN_FIELD_INFO(1, structname, atype, htype, ltype, fieldname, tag)\n\n#define PB_GEN_FIELD_INFO_2(structname, atype, htype, ltype, fieldname, tag) \\\n    PB_GEN_FIELD_INFO(2, structname, atype, htype, 
ltype, fieldname, tag)\n\n#define PB_GEN_FIELD_INFO_4(structname, atype, htype, ltype, fieldname, tag) \\\n    PB_GEN_FIELD_INFO(4, structname, atype, htype, ltype, fieldname, tag)\n\n#define PB_GEN_FIELD_INFO_8(structname, atype, htype, ltype, fieldname, tag) \\\n    PB_GEN_FIELD_INFO(8, structname, atype, htype, ltype, fieldname, tag)\n\n#define PB_GEN_FIELD_INFO_AUTO(structname, atype, htype, ltype, fieldname, tag) \\\n    PB_GEN_FIELD_INFO_AUTO2(PB_FIELDINFO_WIDTH_AUTO(atype, htype, ltype), structname, atype, htype, ltype, fieldname, tag)\n\n#define PB_GEN_FIELD_INFO_AUTO2(width, structname, atype, htype, ltype, fieldname, tag) \\\n    PB_GEN_FIELD_INFO(width, structname, atype, htype, ltype, fieldname, tag)\n\n#define PB_GEN_FIELD_INFO(width, structname, atype, htype, ltype, fieldname, tag) \\\n    PB_FIELDINFO_ ## width(tag, PB_ATYPE_ ## atype | PB_HTYPE_ ## htype | PB_LTYPE_MAP_ ## ltype, \\\n                   PB_DATA_OFFSET_ ## atype(htype, structname, fieldname), \\\n                   PB_DATA_SIZE_ ## atype(htype, structname, fieldname), \\\n                   PB_SIZE_OFFSET_ ## atype(htype, structname, fieldname), \\\n                   PB_ARRAY_SIZE_ ## atype(htype, structname, fieldname))\n\n#define PB_GEN_FIELD_INFO_ASSERT_1(structname, atype, htype, ltype, fieldname, tag) \\\n    PB_GEN_FIELD_INFO_ASSERT(1, structname, atype, htype, ltype, fieldname, tag)\n\n#define PB_GEN_FIELD_INFO_ASSERT_2(structname, atype, htype, ltype, fieldname, tag) \\\n    PB_GEN_FIELD_INFO_ASSERT(2, structname, atype, htype, ltype, fieldname, tag)\n\n#define PB_GEN_FIELD_INFO_ASSERT_4(structname, atype, htype, ltype, fieldname, tag) \\\n    PB_GEN_FIELD_INFO_ASSERT(4, structname, atype, htype, ltype, fieldname, tag)\n\n#define PB_GEN_FIELD_INFO_ASSERT_8(structname, atype, htype, ltype, fieldname, tag) \\\n    PB_GEN_FIELD_INFO_ASSERT(8, structname, atype, htype, ltype, fieldname, tag)\n\n#define PB_GEN_FIELD_INFO_ASSERT_AUTO(structname, atype, htype, ltype, fieldname, tag) 
\\\n    PB_GEN_FIELD_INFO_ASSERT_AUTO2(PB_FIELDINFO_WIDTH_AUTO(atype, htype, ltype), structname, atype, htype, ltype, fieldname, tag)\n\n#define PB_GEN_FIELD_INFO_ASSERT_AUTO2(width, structname, atype, htype, ltype, fieldname, tag) \\\n    PB_GEN_FIELD_INFO_ASSERT(width, structname, atype, htype, ltype, fieldname, tag)\n\n#define PB_GEN_FIELD_INFO_ASSERT(width, structname, atype, htype, ltype, fieldname, tag) \\\n    PB_FIELDINFO_ASSERT_ ## width(tag, PB_ATYPE_ ## atype | PB_HTYPE_ ## htype | PB_LTYPE_MAP_ ## ltype, \\\n                   PB_DATA_OFFSET_ ## atype(htype, structname, fieldname), \\\n                   PB_DATA_SIZE_ ## atype(htype, structname, fieldname), \\\n                   PB_SIZE_OFFSET_ ## atype(htype, structname, fieldname), \\\n                   PB_ARRAY_SIZE_ ## atype(htype, structname, fieldname))\n\n#define PB_DATA_OFFSET_STATIC(htype, structname, fieldname) PB_DATA_OFFSET_ ## htype(structname, fieldname)\n#define PB_DATA_OFFSET_POINTER(htype, structname, fieldname) PB_DATA_OFFSET_ ## htype(structname, fieldname)\n#define PB_DATA_OFFSET_CALLBACK(htype, structname, fieldname) PB_DATA_OFFSET_ ## htype(structname, fieldname)\n#define PB_DATA_OFFSET_REQUIRED(structname, fieldname) offsetof(structname, fieldname)\n#define PB_DATA_OFFSET_SINGULAR(structname, fieldname) offsetof(structname, fieldname)\n#define PB_DATA_OFFSET_ONEOF(structname, fieldname) offsetof(structname, PB_ONEOF_NAME(FULL, fieldname))\n#define PB_DATA_OFFSET_OPTIONAL(structname, fieldname) offsetof(structname, fieldname)\n#define PB_DATA_OFFSET_REPEATED(structname, fieldname) offsetof(structname, fieldname)\n#define PB_DATA_OFFSET_FIXARRAY(structname, fieldname) offsetof(structname, fieldname)\n\n#define PB_SIZE_OFFSET_STATIC(htype, structname, fieldname) PB_SIZE_OFFSET_ ## htype(structname, fieldname)\n#define PB_SIZE_OFFSET_POINTER(htype, structname, fieldname) PB_SIZE_OFFSET_PTR_ ## htype(structname, fieldname)\n#define PB_SIZE_OFFSET_CALLBACK(htype, structname, 
fieldname) PB_SIZE_OFFSET_CB_ ## htype(structname, fieldname)\n#define PB_SIZE_OFFSET_REQUIRED(structname, fieldname) 0\n#define PB_SIZE_OFFSET_SINGULAR(structname, fieldname) 0\n#define PB_SIZE_OFFSET_ONEOF(structname, fieldname) PB_SIZE_OFFSET_ONEOF2(structname, PB_ONEOF_NAME(FULL, fieldname), PB_ONEOF_NAME(UNION, fieldname))\n#define PB_SIZE_OFFSET_ONEOF2(structname, fullname, unionname) PB_SIZE_OFFSET_ONEOF3(structname, fullname, unionname)\n#define PB_SIZE_OFFSET_ONEOF3(structname, fullname, unionname) pb_delta(structname, fullname, which_ ## unionname)\n#define PB_SIZE_OFFSET_OPTIONAL(structname, fieldname) pb_delta(structname, fieldname, has_ ## fieldname)\n#define PB_SIZE_OFFSET_REPEATED(structname, fieldname) pb_delta(structname, fieldname, fieldname ## _count)\n#define PB_SIZE_OFFSET_FIXARRAY(structname, fieldname) 0\n#define PB_SIZE_OFFSET_PTR_REQUIRED(structname, fieldname) 0\n#define PB_SIZE_OFFSET_PTR_SINGULAR(structname, fieldname) 0\n#define PB_SIZE_OFFSET_PTR_ONEOF(structname, fieldname) PB_SIZE_OFFSET_ONEOF(structname, fieldname)\n#define PB_SIZE_OFFSET_PTR_OPTIONAL(structname, fieldname) 0\n#define PB_SIZE_OFFSET_PTR_REPEATED(structname, fieldname) PB_SIZE_OFFSET_REPEATED(structname, fieldname)\n#define PB_SIZE_OFFSET_PTR_FIXARRAY(structname, fieldname) 0\n#define PB_SIZE_OFFSET_CB_REQUIRED(structname, fieldname) 0\n#define PB_SIZE_OFFSET_CB_SINGULAR(structname, fieldname) 0\n#define PB_SIZE_OFFSET_CB_ONEOF(structname, fieldname) PB_SIZE_OFFSET_ONEOF(structname, fieldname)\n#define PB_SIZE_OFFSET_CB_OPTIONAL(structname, fieldname) 0\n#define PB_SIZE_OFFSET_CB_REPEATED(structname, fieldname) 0\n#define PB_SIZE_OFFSET_CB_FIXARRAY(structname, fieldname) 0\n\n#define PB_ARRAY_SIZE_STATIC(htype, structname, fieldname) PB_ARRAY_SIZE_ ## htype(structname, fieldname)\n#define PB_ARRAY_SIZE_POINTER(htype, structname, fieldname) PB_ARRAY_SIZE_PTR_ ## htype(structname, fieldname)\n#define PB_ARRAY_SIZE_CALLBACK(htype, structname, fieldname) 1\n#define 
PB_ARRAY_SIZE_REQUIRED(structname, fieldname) 1\n#define PB_ARRAY_SIZE_SINGULAR(structname, fieldname) 1\n#define PB_ARRAY_SIZE_OPTIONAL(structname, fieldname) 1\n#define PB_ARRAY_SIZE_ONEOF(structname, fieldname) 1\n#define PB_ARRAY_SIZE_REPEATED(structname, fieldname) pb_arraysize(structname, fieldname)\n#define PB_ARRAY_SIZE_FIXARRAY(structname, fieldname) pb_arraysize(structname, fieldname)\n#define PB_ARRAY_SIZE_PTR_REQUIRED(structname, fieldname) 1\n#define PB_ARRAY_SIZE_PTR_SINGULAR(structname, fieldname) 1\n#define PB_ARRAY_SIZE_PTR_OPTIONAL(structname, fieldname) 1\n#define PB_ARRAY_SIZE_PTR_ONEOF(structname, fieldname) 1\n#define PB_ARRAY_SIZE_PTR_REPEATED(structname, fieldname) 1\n#define PB_ARRAY_SIZE_PTR_FIXARRAY(structname, fieldname) pb_arraysize(structname, fieldname[0])\n\n#define PB_DATA_SIZE_STATIC(htype, structname, fieldname) PB_DATA_SIZE_ ## htype(structname, fieldname)\n#define PB_DATA_SIZE_POINTER(htype, structname, fieldname) PB_DATA_SIZE_PTR_ ## htype(structname, fieldname)\n#define PB_DATA_SIZE_CALLBACK(htype, structname, fieldname) PB_DATA_SIZE_CB_ ## htype(structname, fieldname)\n#define PB_DATA_SIZE_REQUIRED(structname, fieldname) pb_membersize(structname, fieldname)\n#define PB_DATA_SIZE_SINGULAR(structname, fieldname) pb_membersize(structname, fieldname)\n#define PB_DATA_SIZE_OPTIONAL(structname, fieldname) pb_membersize(structname, fieldname)\n#define PB_DATA_SIZE_ONEOF(structname, fieldname) pb_membersize(structname, PB_ONEOF_NAME(FULL, fieldname))\n#define PB_DATA_SIZE_REPEATED(structname, fieldname) pb_membersize(structname, fieldname[0])\n#define PB_DATA_SIZE_FIXARRAY(structname, fieldname) pb_membersize(structname, fieldname[0])\n#define PB_DATA_SIZE_PTR_REQUIRED(structname, fieldname) pb_membersize(structname, fieldname[0])\n#define PB_DATA_SIZE_PTR_SINGULAR(structname, fieldname) pb_membersize(structname, fieldname[0])\n#define PB_DATA_SIZE_PTR_OPTIONAL(structname, fieldname) pb_membersize(structname, fieldname[0])\n#define 
PB_DATA_SIZE_PTR_ONEOF(structname, fieldname) pb_membersize(structname, PB_ONEOF_NAME(FULL, fieldname)[0])\n#define PB_DATA_SIZE_PTR_REPEATED(structname, fieldname) pb_membersize(structname, fieldname[0])\n#define PB_DATA_SIZE_PTR_FIXARRAY(structname, fieldname) pb_membersize(structname, fieldname[0][0])\n#define PB_DATA_SIZE_CB_REQUIRED(structname, fieldname) pb_membersize(structname, fieldname)\n#define PB_DATA_SIZE_CB_SINGULAR(structname, fieldname) pb_membersize(structname, fieldname)\n#define PB_DATA_SIZE_CB_OPTIONAL(structname, fieldname) pb_membersize(structname, fieldname)\n#define PB_DATA_SIZE_CB_ONEOF(structname, fieldname) pb_membersize(structname, PB_ONEOF_NAME(FULL, fieldname))\n#define PB_DATA_SIZE_CB_REPEATED(structname, fieldname) pb_membersize(structname, fieldname)\n#define PB_DATA_SIZE_CB_FIXARRAY(structname, fieldname) pb_membersize(structname, fieldname)\n\n#define PB_ONEOF_NAME(type, tuple) PB_EXPAND(PB_ONEOF_NAME_ ## type tuple)\n#define PB_ONEOF_NAME_UNION(unionname,membername,fullname) unionname\n#define PB_ONEOF_NAME_MEMBER(unionname,membername,fullname) membername\n#define PB_ONEOF_NAME_FULL(unionname,membername,fullname) fullname\n\n#define PB_GEN_SUBMSG_INFO(structname, atype, htype, ltype, fieldname, tag) \\\n    PB_SUBMSG_INFO_ ## htype(ltype, structname, fieldname)\n\n#define PB_SUBMSG_INFO_REQUIRED(ltype, structname, fieldname) PB_SUBMSG_INFO_ ## ltype(structname ## _ ## fieldname ## _MSGTYPE)\n#define PB_SUBMSG_INFO_SINGULAR(ltype, structname, fieldname) PB_SUBMSG_INFO_ ## ltype(structname ## _ ## fieldname ## _MSGTYPE)\n#define PB_SUBMSG_INFO_OPTIONAL(ltype, structname, fieldname) PB_SUBMSG_INFO_ ## ltype(structname ## _ ## fieldname ## _MSGTYPE)\n#define PB_SUBMSG_INFO_ONEOF(ltype, structname, fieldname) PB_SUBMSG_INFO_ONEOF2(ltype, structname, PB_ONEOF_NAME(UNION, fieldname), PB_ONEOF_NAME(MEMBER, fieldname))\n#define PB_SUBMSG_INFO_ONEOF2(ltype, structname, unionname, membername) PB_SUBMSG_INFO_ONEOF3(ltype, structname, 
unionname, membername)\n#define PB_SUBMSG_INFO_ONEOF3(ltype, structname, unionname, membername) PB_SUBMSG_INFO_ ## ltype(structname ## _ ## unionname ## _ ## membername ## _MSGTYPE)\n#define PB_SUBMSG_INFO_REPEATED(ltype, structname, fieldname) PB_SUBMSG_INFO_ ## ltype(structname ## _ ## fieldname ## _MSGTYPE)\n#define PB_SUBMSG_INFO_FIXARRAY(ltype, structname, fieldname) PB_SUBMSG_INFO_ ## ltype(structname ## _ ## fieldname ## _MSGTYPE)\n#define PB_SUBMSG_INFO_BOOL(t)\n#define PB_SUBMSG_INFO_BYTES(t)\n#define PB_SUBMSG_INFO_DOUBLE(t)\n#define PB_SUBMSG_INFO_ENUM(t)\n#define PB_SUBMSG_INFO_UENUM(t)\n#define PB_SUBMSG_INFO_FIXED32(t)\n#define PB_SUBMSG_INFO_FIXED64(t)\n#define PB_SUBMSG_INFO_FLOAT(t)\n#define PB_SUBMSG_INFO_INT32(t)\n#define PB_SUBMSG_INFO_INT64(t)\n#define PB_SUBMSG_INFO_MESSAGE(t)  PB_SUBMSG_DESCRIPTOR(t)\n#define PB_SUBMSG_INFO_MSG_W_CB(t) PB_SUBMSG_DESCRIPTOR(t)\n#define PB_SUBMSG_INFO_SFIXED32(t)\n#define PB_SUBMSG_INFO_SFIXED64(t)\n#define PB_SUBMSG_INFO_SINT32(t)\n#define PB_SUBMSG_INFO_SINT64(t)\n#define PB_SUBMSG_INFO_STRING(t)\n#define PB_SUBMSG_INFO_UINT32(t)\n#define PB_SUBMSG_INFO_UINT64(t)\n#define PB_SUBMSG_INFO_EXTENSION(t)\n#define PB_SUBMSG_INFO_FIXED_LENGTH_BYTES(t)\n#define PB_SUBMSG_DESCRIPTOR(t)    &(t ## _msg),\n\n/* The field descriptors use a variable width format, with width of either\n * 1, 2, 4 or 8 of 32-bit words. 
The two lowest bytes of the first word always\n * encode the descriptor size, 6 lowest bits of field tag number, and 8 bits\n * of the field type.\n *\n * Descriptor size is encoded as 0 = 1 word, 1 = 2 words, 2 = 4 words, 3 = 8 words.\n *\n * Formats, listed starting with the least significant bit of the first word.\n * 1 word:  [2-bit len] [6-bit tag] [8-bit type] [8-bit data_offset] [4-bit size_offset] [4-bit data_size]\n *\n * 2 words: [2-bit len] [6-bit tag] [8-bit type] [12-bit array_size] [4-bit size_offset]\n *          [16-bit data_offset] [12-bit data_size] [4-bit tag>>6]\n *\n * 4 words: [2-bit len] [6-bit tag] [8-bit type] [16-bit array_size]\n *          [8-bit size_offset] [24-bit tag>>6]\n *          [32-bit data_offset]\n *          [32-bit data_size]\n *\n * 8 words: [2-bit len] [6-bit tag] [8-bit type] [16-bit reserved]\n *          [8-bit size_offset] [24-bit tag>>6]\n *          [32-bit data_offset]\n *          [32-bit data_size]\n *          [32-bit array_size]\n *          [32-bit reserved]\n *          [32-bit reserved]\n *          [32-bit reserved]\n */\n\n#define PB_FIELDINFO_1(tag, type, data_offset, data_size, size_offset, array_size) \\\n    (0 | (((tag) << 2) & 0xFF) | ((type) << 8) | (((uint32_t)(data_offset) & 0xFF) << 16) | \\\n     (((uint32_t)(size_offset) & 0x0F) << 24) | (((uint32_t)(data_size) & 0x0F) << 28)),\n\n#define PB_FIELDINFO_2(tag, type, data_offset, data_size, size_offset, array_size) \\\n    (1 | (((tag) << 2) & 0xFF) | ((type) << 8) | (((uint32_t)(array_size) & 0xFFF) << 16) | (((uint32_t)(size_offset) & 0x0F) << 28)), \\\n    (((uint32_t)(data_offset) & 0xFFFF) | (((uint32_t)(data_size) & 0xFFF) << 16) | (((uint32_t)(tag) & 0x3c0) << 22)),\n\n#define PB_FIELDINFO_4(tag, type, data_offset, data_size, size_offset, array_size) \\\n    (2 | (((tag) << 2) & 0xFF) | ((type) << 8) | (((uint32_t)(array_size) & 0xFFFF) << 16)), \\\n    ((uint32_t)(int_least8_t)(size_offset) | (((uint32_t)(tag) << 2) & 0xFFFFFF00)), \\\n    
(data_offset), (data_size),\n\n#define PB_FIELDINFO_8(tag, type, data_offset, data_size, size_offset, array_size) \\\n    (3 | (((tag) << 2) & 0xFF) | ((type) << 8)), \\\n    ((uint32_t)(int_least8_t)(size_offset) | (((uint32_t)(tag) << 2) & 0xFFFFFF00)), \\\n    (data_offset), (data_size), (array_size), 0, 0, 0,\n\n/* These assertions verify that the field information fits in the allocated space.\n * The generator tries to automatically determine the correct width that can fit all\n * data associated with a message. These asserts will fail only if there has been a\n * problem in the automatic logic - this may be worth reporting as a bug. As a workaround,\n * you can increase the descriptor width by defining PB_FIELDINFO_WIDTH or by setting\n * descriptorsize option in .options file.\n */\n#define PB_FITS(value,bits) ((uint32_t)(value) < ((uint32_t)1<<bits))\n#define PB_FIELDINFO_ASSERT_1(tag, type, data_offset, data_size, size_offset, array_size) \\\n    PB_STATIC_ASSERT(PB_FITS(tag,6) && PB_FITS(data_offset,8) && PB_FITS(size_offset,4) && PB_FITS(data_size,4) && PB_FITS(array_size,1), FIELDINFO_DOES_NOT_FIT_width1_field ## tag)\n\n#define PB_FIELDINFO_ASSERT_2(tag, type, data_offset, data_size, size_offset, array_size) \\\n    PB_STATIC_ASSERT(PB_FITS(tag,10) && PB_FITS(data_offset,16) && PB_FITS(size_offset,4) && PB_FITS(data_size,12) && PB_FITS(array_size,12), FIELDINFO_DOES_NOT_FIT_width2_field ## tag)\n\n#ifndef PB_FIELD_32BIT\n/* Maximum field sizes are still 16-bit if pb_size_t is 16-bit */\n#define PB_FIELDINFO_ASSERT_4(tag, type, data_offset, data_size, size_offset, array_size) \\\n    PB_STATIC_ASSERT(PB_FITS(tag,16) && PB_FITS(data_offset,16) && PB_FITS((int_least8_t)size_offset,8) && PB_FITS(data_size,16) && PB_FITS(array_size,16), FIELDINFO_DOES_NOT_FIT_width4_field ## tag)\n\n#define PB_FIELDINFO_ASSERT_8(tag, type, data_offset, data_size, size_offset, array_size) \\\n    PB_STATIC_ASSERT(PB_FITS(tag,16) && PB_FITS(data_offset,16) && 
PB_FITS((int_least8_t)size_offset,8) && PB_FITS(data_size,16) && PB_FITS(array_size,16), FIELDINFO_DOES_NOT_FIT_width8_field ## tag)\n#else\n/* Up to 32-bit fields supported.\n * Note that the checks are against 31 bits to avoid compiler warnings about shift wider than type in the test.\n * I expect that there is no reasonable use for >2GB messages with nanopb anyway.\n */\n#define PB_FIELDINFO_ASSERT_4(tag, type, data_offset, data_size, size_offset, array_size) \\\n    PB_STATIC_ASSERT(PB_FITS(tag,30) && PB_FITS(data_offset,31) && PB_FITS(size_offset,8) && PB_FITS(data_size,31) && PB_FITS(array_size,16), FIELDINFO_DOES_NOT_FIT_width4_field ## tag)\n\n#define PB_FIELDINFO_ASSERT_8(tag, type, data_offset, data_size, size_offset, array_size) \\\n    PB_STATIC_ASSERT(PB_FITS(tag,30) && PB_FITS(data_offset,31) && PB_FITS(size_offset,8) && PB_FITS(data_size,31) && PB_FITS(array_size,31), FIELDINFO_DOES_NOT_FIT_width8_field ## tag)\n#endif\n\n\n/* Automatic picking of FIELDINFO width:\n * Uses width 1 when possible, otherwise resorts to width 2.\n * This is used when PB_BIND() is called with \"AUTO\" as the argument.\n * The generator will give explicit size argument when it knows that a message\n * structure grows beyond 1-word format limits.\n */\n#define PB_FIELDINFO_WIDTH_AUTO(atype, htype, ltype) PB_FIELDINFO_WIDTH_ ## atype(htype, ltype)\n#define PB_FIELDINFO_WIDTH_STATIC(htype, ltype) PB_FIELDINFO_WIDTH_ ## htype(ltype)\n#define PB_FIELDINFO_WIDTH_POINTER(htype, ltype) PB_FIELDINFO_WIDTH_ ## htype(ltype)\n#define PB_FIELDINFO_WIDTH_CALLBACK(htype, ltype) 2\n#define PB_FIELDINFO_WIDTH_REQUIRED(ltype) PB_FIELDINFO_WIDTH_ ## ltype\n#define PB_FIELDINFO_WIDTH_SINGULAR(ltype) PB_FIELDINFO_WIDTH_ ## ltype\n#define PB_FIELDINFO_WIDTH_OPTIONAL(ltype) PB_FIELDINFO_WIDTH_ ## ltype\n#define PB_FIELDINFO_WIDTH_ONEOF(ltype) PB_FIELDINFO_WIDTH_ ## ltype\n#define PB_FIELDINFO_WIDTH_REPEATED(ltype) 2\n#define PB_FIELDINFO_WIDTH_FIXARRAY(ltype) 2\n#define PB_FIELDINFO_WIDTH_BOOL   
   1\n#define PB_FIELDINFO_WIDTH_BYTES     2\n#define PB_FIELDINFO_WIDTH_DOUBLE    1\n#define PB_FIELDINFO_WIDTH_ENUM      1\n#define PB_FIELDINFO_WIDTH_UENUM     1\n#define PB_FIELDINFO_WIDTH_FIXED32   1\n#define PB_FIELDINFO_WIDTH_FIXED64   1\n#define PB_FIELDINFO_WIDTH_FLOAT     1\n#define PB_FIELDINFO_WIDTH_INT32     1\n#define PB_FIELDINFO_WIDTH_INT64     1\n#define PB_FIELDINFO_WIDTH_MESSAGE   2\n#define PB_FIELDINFO_WIDTH_MSG_W_CB  2\n#define PB_FIELDINFO_WIDTH_SFIXED32  1\n#define PB_FIELDINFO_WIDTH_SFIXED64  1\n#define PB_FIELDINFO_WIDTH_SINT32    1\n#define PB_FIELDINFO_WIDTH_SINT64    1\n#define PB_FIELDINFO_WIDTH_STRING    2\n#define PB_FIELDINFO_WIDTH_UINT32    1\n#define PB_FIELDINFO_WIDTH_UINT64    1\n#define PB_FIELDINFO_WIDTH_EXTENSION 1\n#define PB_FIELDINFO_WIDTH_FIXED_LENGTH_BYTES 2\n\n/* The mapping from protobuf types to LTYPEs is done using these macros. */\n#define PB_LTYPE_MAP_BOOL               PB_LTYPE_BOOL\n#define PB_LTYPE_MAP_BYTES              PB_LTYPE_BYTES\n#define PB_LTYPE_MAP_DOUBLE             PB_LTYPE_FIXED64\n#define PB_LTYPE_MAP_ENUM               PB_LTYPE_VARINT\n#define PB_LTYPE_MAP_UENUM              PB_LTYPE_UVARINT\n#define PB_LTYPE_MAP_FIXED32            PB_LTYPE_FIXED32\n#define PB_LTYPE_MAP_FIXED64            PB_LTYPE_FIXED64\n#define PB_LTYPE_MAP_FLOAT              PB_LTYPE_FIXED32\n#define PB_LTYPE_MAP_INT32              PB_LTYPE_VARINT\n#define PB_LTYPE_MAP_INT64              PB_LTYPE_VARINT\n#define PB_LTYPE_MAP_MESSAGE            PB_LTYPE_SUBMESSAGE\n#define PB_LTYPE_MAP_MSG_W_CB           PB_LTYPE_SUBMSG_W_CB\n#define PB_LTYPE_MAP_SFIXED32           PB_LTYPE_FIXED32\n#define PB_LTYPE_MAP_SFIXED64           PB_LTYPE_FIXED64\n#define PB_LTYPE_MAP_SINT32             PB_LTYPE_SVARINT\n#define PB_LTYPE_MAP_SINT64             PB_LTYPE_SVARINT\n#define PB_LTYPE_MAP_STRING             PB_LTYPE_STRING\n#define PB_LTYPE_MAP_UINT32             PB_LTYPE_UVARINT\n#define PB_LTYPE_MAP_UINT64             
PB_LTYPE_UVARINT\n#define PB_LTYPE_MAP_EXTENSION          PB_LTYPE_EXTENSION\n#define PB_LTYPE_MAP_FIXED_LENGTH_BYTES PB_LTYPE_FIXED_LENGTH_BYTES\n\n/* These macros are used for giving out error messages.\n * They are mostly a debugging aid; the main error information\n * is the true/false return value from functions.\n * Some code space can be saved by disabling the error\n * messages if not used.\n *\n * PB_SET_ERROR() sets the error message if none has been set yet.\n *                msg must be a constant string literal.\n * PB_GET_ERROR() always returns a pointer to a string.\n * PB_RETURN_ERROR() sets the error and returns false from current\n *                   function.\n */\n#ifdef PB_NO_ERRMSG\n#define PB_SET_ERROR(stream, msg) PB_UNUSED(stream)\n#define PB_GET_ERROR(stream) \"(errmsg disabled)\"\n#else\n#define PB_SET_ERROR(stream, msg) (stream->errmsg = (stream)->errmsg ? (stream)->errmsg : (msg))\n#define PB_GET_ERROR(stream) ((stream)->errmsg ? (stream)->errmsg : \"(none)\")\n#endif\n\n#define PB_RETURN_ERROR(stream, msg) return PB_SET_ERROR(stream, msg), false\n\n#ifdef __cplusplus\n} /* extern \"C\" */\n#endif\n\n#ifdef __cplusplus\n#if __cplusplus >= 201103L\n#define PB_CONSTEXPR constexpr\n#else  // __cplusplus >= 201103L\n#define PB_CONSTEXPR\n#endif  // __cplusplus >= 201103L\n\n#if __cplusplus >= 201703L\n#define PB_INLINE_CONSTEXPR inline constexpr\n#else  // __cplusplus >= 201703L\n#define PB_INLINE_CONSTEXPR PB_CONSTEXPR\n#endif  // __cplusplus >= 201703L\n\nnamespace nanopb {\n// Each type will be partially specialized by the generator.\ntemplate <typename GenMessageT> struct MessageDescriptor;\n}  // namespace nanopb\n#endif  /* __cplusplus */\n\n#endif\n\n"
  },
  {
    "path": "c/core/include/pb_common.h",
    "content": "/* pb_common.h: Common support functions for pb_encode.c and pb_decode.c.\n * These functions are rarely needed by applications directly.\n */\n\n#ifndef PB_COMMON_H_INCLUDED\n#define PB_COMMON_H_INCLUDED\n\n#include \"pb.h\"\n\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n/* Initialize the field iterator structure to beginning.\n * Returns false if the message type is empty. */\nbool pb_field_iter_begin(pb_field_iter_t *iter, const pb_msgdesc_t *desc, void *message);\n\n/* Get a field iterator for extension field. */\nbool pb_field_iter_begin_extension(pb_field_iter_t *iter, pb_extension_t *extension);\n\n/* Same as pb_field_iter_begin(), but for const message pointer.\n * Note that the pointers in pb_field_iter_t will be non-const but shouldn't\n * be written to when using these functions. */\nbool pb_field_iter_begin_const(pb_field_iter_t *iter, const pb_msgdesc_t *desc, const void *message);\nbool pb_field_iter_begin_extension_const(pb_field_iter_t *iter, const pb_extension_t *extension);\n\n/* Advance the iterator to the next field.\n * Returns false when the iterator wraps back to the first field. */\nbool pb_field_iter_next(pb_field_iter_t *iter);\n\n/* Advance the iterator until it points at a field with the given tag.\n * Returns false if no such field exists. */\nbool pb_field_iter_find(pb_field_iter_t *iter, uint32_t tag);\n\n#ifdef PB_VALIDATE_UTF8\n/* Validate UTF-8 text string */\nbool pb_validate_utf8(const char *s);\n#endif\n\n#ifdef __cplusplus\n} /* extern \"C\" */\n#endif\n\n#endif\n\n"
  },
  {
    "path": "c/core/include/pb_decode.h",
    "content": "/* pb_decode.h: Functions to decode protocol buffers. Depends on pb_decode.c.\n * The main function is pb_decode. You also need an input stream, and the\n * field descriptions created by nanopb_generator.py.\n */\n\n#ifndef PB_DECODE_H_INCLUDED\n#define PB_DECODE_H_INCLUDED\n\n#include \"pb.h\"\n\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n/* Structure for defining custom input streams. You will need to provide\n * a callback function to read the bytes from your storage, which can be\n * for example a file or a network socket.\n * \n * The callback must conform to these rules:\n *\n * 1) Return false on IO errors. This will cause decoding to abort.\n * 2) You can use state to store your own data (e.g. buffer pointer),\n *    and rely on pb_read to verify that nobody reads past bytes_left.\n * 3) Your callback may be used with substreams, in which case bytes_left\n *    is different from the main stream's. Don't use bytes_left to compute\n *    any pointers.\n */\nstruct pb_istream_s\n{\n#ifdef PB_BUFFER_ONLY\n    /* Callback pointer is not used in buffer-only configuration.\n     * Having an int pointer here allows binary compatibility but\n     * gives an error if someone tries to assign callback function.\n     */\n    int *callback;\n#else\n    bool (*callback)(pb_istream_t *stream, pb_byte_t *buf, size_t count);\n#endif\n\n    void *state; /* Free field for use by callback implementation */\n    size_t bytes_left;\n    \n#ifndef PB_NO_ERRMSG\n    const char *errmsg;\n#endif\n};\n\n#ifndef PB_NO_ERRMSG\n#define PB_ISTREAM_EMPTY {0,0,0,0}\n#else\n#define PB_ISTREAM_EMPTY {0,0,0}\n#endif\n\n/***************************\n * Main decoding functions *\n ***************************/\n \n/* Decode a single protocol buffers message from input stream into a C structure.\n * Returns true on success, false on any failure.\n * The actual struct pointed to by dest must match the description in fields.\n * Callback fields of the destination structure 
must be initialized by caller.\n * All other fields will be initialized by this function.\n *\n * Example usage:\n *    MyMessage msg = {};\n *    uint8_t buffer[64];\n *    pb_istream_t stream;\n *    \n *    // ... read some data into buffer ...\n *\n *    stream = pb_istream_from_buffer(buffer, count);\n *    pb_decode(&stream, MyMessage_fields, &msg);\n */\nbool pb_decode(pb_istream_t *stream, const pb_msgdesc_t *fields, void *dest_struct);\n\n/* Extended version of pb_decode, with several options to control\n * the decoding process:\n *\n * PB_DECODE_NOINIT:         Do not initialize the fields to default values.\n *                           This is slightly faster if you do not need the default\n *                           values and instead initialize the structure to 0 using\n *                           e.g. memset(). This can also be used for merging two\n *                           messages, i.e. combine already existing data with new\n *                           values.\n *\n * PB_DECODE_DELIMITED:      Input message starts with the message size as varint.\n *                           Corresponds to parseDelimitedFrom() in Google's\n *                           protobuf API.\n *\n * PB_DECODE_NULLTERMINATED: Stop reading when field tag is read as 0. This allows\n *                           reading null terminated messages.\n *                           NOTE: Until nanopb-0.4.0, pb_decode() also allowed\n *                           null-termination. 
This behaviour is not supported in\n *                           most other protobuf implementations, so PB_DECODE_DELIMITED\n *                           is a better option for compatibility.\n *\n * Multiple flags can be combined with bitwise or (| operator)\n */\n#define PB_DECODE_NOINIT          0x01U\n#define PB_DECODE_DELIMITED       0x02U\n#define PB_DECODE_NULLTERMINATED  0x04U\nbool pb_decode_ex(pb_istream_t *stream, const pb_msgdesc_t *fields, void *dest_struct, unsigned int flags);\n\n/* Defines for backwards compatibility with code written before nanopb-0.4.0 */\n#define pb_decode_noinit(s,f,d) pb_decode_ex(s,f,d, PB_DECODE_NOINIT)\n#define pb_decode_delimited(s,f,d) pb_decode_ex(s,f,d, PB_DECODE_DELIMITED)\n#define pb_decode_delimited_noinit(s,f,d) pb_decode_ex(s,f,d, PB_DECODE_DELIMITED | PB_DECODE_NOINIT)\n#define pb_decode_nullterminated(s,f,d) pb_decode_ex(s,f,d, PB_DECODE_NULLTERMINATED)\n\n#ifdef PB_ENABLE_MALLOC\n/* Release any allocated pointer fields. If you use dynamic allocation, you should\n * call this for any successfully decoded message when you are done with it. If\n * pb_decode() returns with an error, the message is already released.\n */\nvoid pb_release(const pb_msgdesc_t *fields, void *dest_struct);\n#endif\n\n\n/**************************************\n * Functions for manipulating streams *\n **************************************/\n\n/* Create an input stream for reading from a memory buffer.\n *\n * Alternatively, you can use a custom stream that reads directly from e.g.\n * a file or a network socket.\n */\npb_istream_t pb_istream_from_buffer(const pb_byte_t *buf, size_t bufsize);\n\n/* Function to read from a pb_istream_t. 
You can use this if you need to\n * read some custom header data, or to read data in field callbacks.\n */\nbool pb_read(pb_istream_t *stream, pb_byte_t *buf, size_t count);\n\n\n/************************************************\n * Helper functions for writing field callbacks *\n ************************************************/\n\n/* Decode the tag for the next field in the stream. Gives the wire type and\n * field tag. At end of the message, returns false and sets eof to true. */\nbool pb_decode_tag(pb_istream_t *stream, pb_wire_type_t *wire_type, uint32_t *tag, bool *eof);\n\n/* Skip the field payload data, given the wire type. */\nbool pb_skip_field(pb_istream_t *stream, pb_wire_type_t wire_type);\n\n/* Decode an integer in the varint format. This works for enum, int32,\n * int64, uint32 and uint64 field types. */\n#ifndef PB_WITHOUT_64BIT\nbool pb_decode_varint(pb_istream_t *stream, uint64_t *dest);\n#else\n#define pb_decode_varint pb_decode_varint32\n#endif\n\n/* Decode an integer in the varint format. This works for enum, int32,\n * and uint32 field types. */\nbool pb_decode_varint32(pb_istream_t *stream, uint32_t *dest);\n\n/* Decode a bool value in varint format. */\nbool pb_decode_bool(pb_istream_t *stream, bool *dest);\n\n/* Decode an integer in the zig-zagged svarint format. This works for sint32\n * and sint64. */\n#ifndef PB_WITHOUT_64BIT\nbool pb_decode_svarint(pb_istream_t *stream, int64_t *dest);\n#else\nbool pb_decode_svarint(pb_istream_t *stream, int32_t *dest);\n#endif\n\n/* Decode a fixed32, sfixed32 or float value. You need to pass a pointer to\n * a 4-byte wide C variable. */\nbool pb_decode_fixed32(pb_istream_t *stream, void *dest);\n\n#ifndef PB_WITHOUT_64BIT\n/* Decode a fixed64, sfixed64 or double value. You need to pass a pointer to\n * an 8-byte wide C variable. */\nbool pb_decode_fixed64(pb_istream_t *stream, void *dest);\n#endif\n\n#ifdef PB_CONVERT_DOUBLE_FLOAT\n/* Decode a double value into a float variable. 
*/\nbool pb_decode_double_as_float(pb_istream_t *stream, float *dest);\n#endif\n\n/* Make a limited-length substream for reading a PB_WT_STRING field. */\nbool pb_make_string_substream(pb_istream_t *stream, pb_istream_t *substream);\nbool pb_close_string_substream(pb_istream_t *stream, pb_istream_t *substream);\n\n#ifdef __cplusplus\n} /* extern \"C\" */\n#endif\n\n#endif\n"
  },
  {
    "path": "c/core/include/pb_encode.h",
    "content": "/* pb_encode.h: Functions to encode protocol buffers. Depends on pb_encode.c.\n * The main function is pb_encode. You also need an output stream, and the\n * field descriptions created by nanopb_generator.py.\n */\n\n#ifndef PB_ENCODE_H_INCLUDED\n#define PB_ENCODE_H_INCLUDED\n\n#include \"pb.h\"\n\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n/* Structure for defining custom output streams. You will need to provide\n * a callback function to write the bytes to your storage, which can be\n * for example a file or a network socket.\n *\n * The callback must conform to these rules:\n *\n * 1) Return false on IO errors. This will cause encoding to abort.\n * 2) You can use state to store your own data (e.g. buffer pointer).\n * 3) pb_write will update bytes_written after your callback runs.\n * 4) Substreams will modify max_size and bytes_written. Don't use them\n *    to calculate any pointers.\n */\nstruct pb_ostream_s\n{\n#ifdef PB_BUFFER_ONLY\n    /* Callback pointer is not used in buffer-only configuration.\n     * Having an int pointer here allows binary compatibility but\n     * gives an error if someone tries to assign callback function.\n     * Also, NULL pointer marks a 'sizing stream' that does not\n     * write anything.\n     */\n    int *callback;\n#else\n    bool (*callback)(pb_ostream_t *stream, const pb_byte_t *buf, size_t count);\n#endif\n    void *state;          /* Free field for use by callback implementation. */\n    size_t max_size;      /* Limit number of output bytes written (or use SIZE_MAX). */\n    size_t bytes_written; /* Number of bytes written so far. 
*/\n    \n#ifndef PB_NO_ERRMSG\n    const char *errmsg;\n#endif\n};\n\n/***************************\n * Main encoding functions *\n ***************************/\n\n/* Encode a single protocol buffers message from C structure into a stream.\n * Returns true on success, false on any failure.\n * The actual struct pointed to by src_struct must match the description in fields.\n * All required fields in the struct are assumed to have been filled in.\n *\n * Example usage:\n *    MyMessage msg = {};\n *    uint8_t buffer[64];\n *    pb_ostream_t stream;\n *\n *    msg.field1 = 42;\n *    stream = pb_ostream_from_buffer(buffer, sizeof(buffer));\n *    pb_encode(&stream, MyMessage_fields, &msg);\n */\nbool pb_encode(pb_ostream_t *stream, const pb_msgdesc_t *fields, const void *src_struct);\n\n/* Extended version of pb_encode, with several options to control the\n * encoding process:\n *\n * PB_ENCODE_DELIMITED:      Prepend the length of message as a varint.\n *                           Corresponds to writeDelimitedTo() in Google's\n *                           protobuf API.\n *\n * PB_ENCODE_NULLTERMINATED: Append a null byte to the message for termination.\n *                           NOTE: This behaviour is not supported in most other\n *                           protobuf implementations, so PB_ENCODE_DELIMITED\n *                           is a better option for compatibility.\n */\n#define PB_ENCODE_DELIMITED       0x02U\n#define PB_ENCODE_NULLTERMINATED  0x04U\nbool pb_encode_ex(pb_ostream_t *stream, const pb_msgdesc_t *fields, const void *src_struct, unsigned int flags);\n\n/* Defines for backwards compatibility with code written before nanopb-0.4.0 */\n#define pb_encode_delimited(s,f,d) pb_encode_ex(s,f,d, PB_ENCODE_DELIMITED)\n#define pb_encode_nullterminated(s,f,d) pb_encode_ex(s,f,d, PB_ENCODE_NULLTERMINATED)\n\n/* Encode the message to get the size of the encoded data, but do not store\n * the data. 
*/\nbool pb_get_encoded_size(size_t *size, const pb_msgdesc_t *fields, const void *src_struct);\n\n/**************************************\n * Functions for manipulating streams *\n **************************************/\n\n/* Create an output stream for writing into a memory buffer.\n * The number of bytes written can be found in stream.bytes_written after\n * encoding the message.\n *\n * Alternatively, you can use a custom stream that writes directly to e.g.\n * a file or a network socket.\n */\npb_ostream_t pb_ostream_from_buffer(pb_byte_t *buf, size_t bufsize);\n\n/* Pseudo-stream for measuring the size of a message without actually storing\n * the encoded data.\n * \n * Example usage:\n *    MyMessage msg = {};\n *    pb_ostream_t stream = PB_OSTREAM_SIZING;\n *    pb_encode(&stream, MyMessage_fields, &msg);\n *    printf(\"Message size is %d\\n\", (int)stream.bytes_written);\n */\n#ifndef PB_NO_ERRMSG\n#define PB_OSTREAM_SIZING {0,0,0,0,0}\n#else\n#define PB_OSTREAM_SIZING {0,0,0,0}\n#endif\n\n/* Function to write into a pb_ostream_t stream. You can use this if you need\n * to append or prepend some custom headers to the message.\n */\nbool pb_write(pb_ostream_t *stream, const pb_byte_t *buf, size_t count);\n\n\n/************************************************\n * Helper functions for writing field callbacks *\n ************************************************/\n\n/* Encode field header based on type and field number defined in the field\n * structure. Call this from the callback before writing out field contents. */\nbool pb_encode_tag_for_field(pb_ostream_t *stream, const pb_field_iter_t *field);\n\n/* Encode field header by manually specifying wire type. You need to use this\n * if you want to write out packed arrays from a callback field. */\nbool pb_encode_tag(pb_ostream_t *stream, pb_wire_type_t wiretype, uint32_t field_number);\n\n/* Encode an integer in the varint format.\n * This works for bool, enum, int32, int64, uint32 and uint64 field types. 
*/\n#ifndef PB_WITHOUT_64BIT\nbool pb_encode_varint(pb_ostream_t *stream, uint64_t value);\n#else\nbool pb_encode_varint(pb_ostream_t *stream, uint32_t value);\n#endif\n\n/* Encode an integer in the zig-zagged svarint format.\n * This works for sint32 and sint64. */\n#ifndef PB_WITHOUT_64BIT\nbool pb_encode_svarint(pb_ostream_t *stream, int64_t value);\n#else\nbool pb_encode_svarint(pb_ostream_t *stream, int32_t value);\n#endif\n\n/* Encode a string or bytes type field. For strings, pass strlen(s) as size. */\nbool pb_encode_string(pb_ostream_t *stream, const pb_byte_t *buffer, size_t size);\n\n/* Encode a fixed32, sfixed32 or float value.\n * You need to pass a pointer to a 4-byte wide C variable. */\nbool pb_encode_fixed32(pb_ostream_t *stream, const void *value);\n\n#ifndef PB_WITHOUT_64BIT\n/* Encode a fixed64, sfixed64 or double value.\n * You need to pass a pointer to an 8-byte wide C variable. */\nbool pb_encode_fixed64(pb_ostream_t *stream, const void *value);\n#endif\n\n#ifdef PB_CONVERT_DOUBLE_FLOAT\n/* Encode a float value so that it appears like a double in the encoded\n * message. */\nbool pb_encode_float_as_double(pb_ostream_t *stream, float value);\n#endif\n\n/* Encode a submessage field.\n * You need to pass the pb_field_t array and pointer to struct, just like\n * with pb_encode(). This internally encodes the submessage twice, first to\n * calculate message size and then to actually write it out.\n */\nbool pb_encode_submessage(pb_ostream_t *stream, const pb_msgdesc_t *fields, const void *src_struct);\n\n#ifdef __cplusplus\n} /* extern \"C\" */\n#endif\n\n#endif\n"
  },
  {
    "path": "c/core/include/tahu.h",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2019 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\n#include <tahu.pb.h>\n\n#include <time.h>\n#include <sys/time.h>\n\n#ifdef __MACH__\n#include <mach/clock.h>\n#include <mach/mach.h>\n#endif\n\n#ifndef _SPARKPLUGLIB_H_\n#define _SPARKPLUGLIB_H_\n\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n// Enable/disable debug messages\n#define SPARKPLUG_DEBUG 1\n\n#ifdef SPARKPLUG_DEBUG\n#define DEBUG_PRINT(...) printf(__VA_ARGS__)\n#else\n#define DEBUG_PRINT(...) do {} while (0)\n#endif\n\n// Constants\n#define DATA_SET_DATA_TYPE_UNKNOWN 0\n#define DATA_SET_DATA_TYPE_INT8 1\n#define DATA_SET_DATA_TYPE_INT16 2\n#define DATA_SET_DATA_TYPE_INT32 3\n#define DATA_SET_DATA_TYPE_INT64 4\n#define DATA_SET_DATA_TYPE_UINT8 5\n#define DATA_SET_DATA_TYPE_UINT16 6\n#define DATA_SET_DATA_TYPE_UINT32 7\n#define DATA_SET_DATA_TYPE_UINT64 8\n#define DATA_SET_DATA_TYPE_FLOAT 9\n#define DATA_SET_DATA_TYPE_DOUBLE 10\n#define DATA_SET_DATA_TYPE_BOOLEAN 11\n#define DATA_SET_DATA_TYPE_STRING 12\n#define DATA_SET_DATA_TYPE_DATETIME 13\n#define DATA_SET_DATA_TYPE_TEXT 14\n\n#define METRIC_DATA_TYPE_UNKNOWN 0\n#define METRIC_DATA_TYPE_INT8 1\n#define METRIC_DATA_TYPE_INT16 2\n#define METRIC_DATA_TYPE_INT32 3\n#define METRIC_DATA_TYPE_INT64 4\n#define METRIC_DATA_TYPE_UINT8 5\n#define METRIC_DATA_TYPE_UINT16 6\n#define METRIC_DATA_TYPE_UINT32 7\n#define METRIC_DATA_TYPE_UINT64 8\n#define METRIC_DATA_TYPE_FLOAT 9\n#define METRIC_DATA_TYPE_DOUBLE 10\n#define METRIC_DATA_TYPE_BOOLEAN 11\n#define 
METRIC_DATA_TYPE_STRING 12\n#define METRIC_DATA_TYPE_DATETIME 13\n#define METRIC_DATA_TYPE_TEXT 14\n#define METRIC_DATA_TYPE_UUID 15\n#define METRIC_DATA_TYPE_DATASET 16\n#define METRIC_DATA_TYPE_BYTES 17\n#define METRIC_DATA_TYPE_FILE 18\n#define METRIC_DATA_TYPE_TEMPLATE 19\n\n#define PARAMETER_DATA_TYPE_UNKNOWN 0\n#define PARAMETER_DATA_TYPE_INT8 1\n#define PARAMETER_DATA_TYPE_INT16 2\n#define PARAMETER_DATA_TYPE_INT32 3\n#define PARAMETER_DATA_TYPE_INT64 4\n#define PARAMETER_DATA_TYPE_UINT8 5\n#define PARAMETER_DATA_TYPE_UINT16 6\n#define PARAMETER_DATA_TYPE_UINT32 7\n#define PARAMETER_DATA_TYPE_UINT64 8\n#define PARAMETER_DATA_TYPE_FLOAT 9\n#define PARAMETER_DATA_TYPE_DOUBLE 10\n#define PARAMETER_DATA_TYPE_BOOLEAN 11\n#define PARAMETER_DATA_TYPE_STRING 12\n#define PARAMETER_DATA_TYPE_DATETIME 13\n#define PARAMETER_DATA_TYPE_TEXT 14\n\n#define PROPERTY_DATA_TYPE_UNKNOWN 0\n#define PROPERTY_DATA_TYPE_INT8 1\n#define PROPERTY_DATA_TYPE_INT16 2\n#define PROPERTY_DATA_TYPE_INT32 3\n#define PROPERTY_DATA_TYPE_INT64 4\n#define PROPERTY_DATA_TYPE_UINT8 5\n#define PROPERTY_DATA_TYPE_UINT16 6\n#define PROPERTY_DATA_TYPE_UINT32 7\n#define PROPERTY_DATA_TYPE_UINT64 8\n#define PROPERTY_DATA_TYPE_FLOAT 9\n#define PROPERTY_DATA_TYPE_DOUBLE 10\n#define PROPERTY_DATA_TYPE_BOOLEAN 11\n#define PROPERTY_DATA_TYPE_STRING 12\n#define PROPERTY_DATA_TYPE_DATETIME 13\n#define PROPERTY_DATA_TYPE_TEXT 14\n\n/**\n * Attach Metadata to an existing Metric.\n *\n * <p>Caution: The metadata structure is duplicated via shallow copy, and\n * it is expected that any pointers within it are safe to pass to free().\n * This will happen if pb_release() is called on this structure or any\n * structure referencing it, for example via a call to free_payload().\n *\n * @param metric   Pointer to destination metric that metadata will be added to\n * @param metadata Pointer to a source metadata structure that will be copied onto metric\n *\n * @return Returns >= 0 on success, or negative on failure\n 
*/\nint add_metadata_to_metric(org_eclipse_tahu_protobuf_Payload_Metric *metric,\n                           org_eclipse_tahu_protobuf_Payload_MetaData *metadata);\n\n/**\n * Attach a Metric to an existing Payload.\n *\n * <p>Caution: The metric structure is duplicated via shallow\n * copy, and it is expected that any pointers within it are safe\n * to pass to free(). This will happen if pb_release() is called\n * on this structure or any structure referencing it, for\n * example via a call to free_payload().\n *\n * @param payload Pointer to the destination payload that metric will be added to\n * @param metric  Pointer to source metric structure that will be copied onto payload\n *\n * @return Returns >= 0 on success, or negative on failure\n */\nint add_metric_to_payload(org_eclipse_tahu_protobuf_Payload *payload,\n                          org_eclipse_tahu_protobuf_Payload_Metric *metric);\n\n/**\n * Helper function to properly cast and push a value into the propertyvalue data structure.\n *\n * <p>Mostly useful when directly building property structures.\n *\n * (No pointers passed into this function are retained by the target structure)\n *\n * @param propertyvalue\n *                 Pointer to propertyvalue structure to receive the value\n * @param datatype Datatype of the value being received (e.g. 
PROPERTY_DATA_TYPE_INT8)\n * @param value    Pointer to the value to use (cannot be NULL)\n * @param size     Size of the memory pointed to by value\n *\n * @return Returns >= 0 on success, or negative on failure\n */\nint set_propertyvalue(org_eclipse_tahu_protobuf_Payload_PropertyValue *propertyvalue,\n                      uint32_t datatype,\n                      const void *value,\n                      size_t size);\n\n/**\n * Add a simple Property to an existing PropertySet\n *\n * (No pointers passed into this function are retained by the target structure)\n *\n * @param propertyset\n *               Pointer to destination PropertySet that property will be added to\n * @param key    Pointer to null-terminated string giving name of new property\n * @param type   Datatype of new property value (e.g. PROPERTY_DATA_TYPE_INT8)\n * @param value  Pointer to value to use for new property, or NULL if reported property value should be NULL.\n * @param size_of_value\n *               Size of data pointed to by value\n *\n * @return Returns >= 0 on success, or negative on failure\n */\nint add_property_to_set(org_eclipse_tahu_protobuf_Payload_PropertySet *propertyset,\n                        const char *key,\n                        uint32_t type,\n                        const void *value,\n                        size_t size_of_value);\n\n/**\n * Add a PropertySet to an existing Metric\n *\n * <p>Caution: The propertyset structure is duplicated via shallow\n * copy, and it is expected that any pointers within it are safe\n * to pass to free(). 
This will happen if pb_release() is called\n * on this structure or any structure referencing it, for\n * example via a call to free_payload().\n *\n * @param metric     Pointer to the destination metric that propertyset will be added to\n * @param properties Pointer to source propertyset structure that will be copied onto metric\n *\n * @return Returns >= 0 on success, or negative on failure\n */\nint add_propertyset_to_metric(org_eclipse_tahu_protobuf_Payload_Metric *metric,\n                              org_eclipse_tahu_protobuf_Payload_PropertySet *properties);\n\n/**\n * Helper function to properly cast and push a value into the\n * metric data structure.\n *\n * <p>Mostly useful when directly building metric structures.\n *\n * <p>Caution: When using datatype METRIC_DATA_TYPE_DATASET or\n * METRIC_DATA_TYPE_TEMPLATE, the structure passed in via value\n * is duplicated using a shallow copy, and it is expected that\n * any pointers within it are safe to pass to free(). This will\n * happen if pb_release() is called on the metric or any\n * structure referencing it, for example via a call to\n * free_payload().\n *\n * When using other datatype values, no pointers are retained by the metric.\n *\n * @param metric\n *                 Pointer to metric structure to receive the\n *                 value\n * @param datatype Datatype of the value being received (e.g. 
METRIC_DATA_TYPE_INT8)\n * @param value    Pointer to the value to use (cannot be NULL)\n * @param size     Size of the memory pointed to by value\n *\n * @return Returns >= 0 on success, or negative on failure\n */\nint set_metric_value(org_eclipse_tahu_protobuf_Payload_Metric *metric, uint32_t datatype, const void *value, size_t size);\n\n/**\n * Add a simple Metric to an existing Payload\n *\n * <p>Caution: When using datatype METRIC_DATA_TYPE_DATASET or\n * METRIC_DATA_TYPE_TEMPLATE, the structure passed in via value\n * is duplicated using a shallow copy, and it is expected that\n * any pointers within it are safe to pass to free(). This will\n * happen if pb_release() is called on the metric or any\n * structure referencing it, for example via a call to\n * free_payload().\n *\n * When using other datatype values, no pointers are retained by the metric.\n *\n * CAUTION: The underlying library will allocate memory as\n * needed when building the structure.  On success, it will be\n * necessary to call free_payload() on the structure to release\n * those allocations.\n *\n * @param payload   Pointer to the destination payload that metric will be added to\n * @param name      Pointer to null-terminated string giving name of metric; may be NULL if not using name field on this metric\n * @param has_alias Boolean indicating if the alias number should be included on the metric\n * @param alias     Alias number to use if has_alias is true\n * @param datatype  Datatype of the value (e.g. 
METRIC_DATA_TYPE_BOOLEAN)\n * @param is_historical\n *                  Boolean indicating if the is_historical flag should be set on this metric\n * @param is_transient\n *                  Boolean indicating if the is_transient flag should be set on this metric\n * @param value     Pointer to value to use for metric; may be NULL to set the is_null flag and not include a value\n * @param size_of_value\n *                  Size of data pointed to by value\n *\n * @return Returns >= 0 on success, or negative on failure\n */\nint add_simple_metric(org_eclipse_tahu_protobuf_Payload *payload,\n                      const char *name,\n                      bool has_alias,\n                      uint64_t alias,\n                      uint64_t datatype,\n                      bool is_historical,\n                      bool is_transient,\n                      const void *value,\n                      size_t size_of_value);\n\n/**\n * Encode a Payload into an array of bytes\n *\n * @param out_buffer Pointer to destination buffer to receive\n *                   the encoded payload, or NULL if you just\n *                   want to calculate the size of the encoded\n *                   payload\n * @param buffer_length\n *                   Size of the destination buffer in bytes\n * @param payload    Pointer to the source payload structure\n *\n * @return Returns the size of the encoded payload in bytes on\n *         success, or -1 on failure\n */\nssize_t encode_payload(uint8_t *out_buffer,\n                       size_t buffer_length,\n                       const org_eclipse_tahu_protobuf_Payload *payload);\n\n/**\n * Build a payload structure from an encoded buffer\n *\n * <p>CAUTION: The underlying library will allocate memory as\n * needed when building the structure.  
On success, it will be\n * necessary to call free_payload() on the structure to release\n * those allocations when done using it.\n *\n * @param payload   Pointer to the destination structure to receive the payload;\n *                  WARNING: any memory allocations referenced\n *                  by the payload structure before it is passed\n *                  into this function will be lost.  They\n *                  should be explicitly freed first if\n *                  necessary.\n * @param in_buffer Pointer to the buffer holding the encoded payload\n * @param buffer_length\n *                  Size of the incoming buffer\n *\n * @return Returns negative on failure, or number of bytes\n *         unused from buffer_length on success\n */\nssize_t decode_payload(org_eclipse_tahu_protobuf_Payload *payload,\n                       const uint8_t *in_buffer,\n                       size_t buffer_length);\n\n/**\n * Free memory from an existing Payload\n *\n * <p>This walks through the payload structure and any sub-structures it references, and frees all pointers as dynamic allocations.\n *\n * <p>This does NOT release the payload structure itself.  
It is up to the calling application to do that if necessary.\n *\n * @param payload Pointer to the Payload structure to release.\n *\n * @return Returns >= 0 on success, or negative on failure\n */\nint free_payload(org_eclipse_tahu_protobuf_Payload *payload);\n\n/**\n * Get the current timestamp in milliseconds (format used inside SparkPlug payloads)\n *\n * @return The current timestamp in milliseconds since Jan 1, 1970 UTC.\n */\nuint64_t get_current_timestamp(void);\n\n/**\n * Reset the sequence number to 0.\n *\n * This should be used just before starting a new NBIRTH message.\n */\nvoid reset_sparkplug_sequence(void);\n\n/**\n * Get the next empty Payload.\n *\n * <p>This does the initial payload setup including the timestamp and sequence number.\n *\n * @param payload Pointer to the destination payload structure to setup;\n *                WARNING: any memory allocations referenced\n *                by the payload structure before it is passed\n *                into this function will be lost.  They\n *                should be explicitly freed first if\n *                necessary.\n *\n * @return Returns >= 0 on success, or negative on failure\n */\nint get_next_payload(org_eclipse_tahu_protobuf_Payload *payload);\n\n/**\n * Initialize a Dataset with the values passed in\n *\n * <p>Caution: The row value structures are duplicated via\n * shallow copy, and it is expected that any pointers within\n * them are safe to pass to free(). This will happen if\n * pb_release() is called on this structure or any structure\n * referencing it, for example via a call to free_payload().\n *\n * @param dataset   Pointer to dataset to initialize\n *                  WARNING: any memory allocations referenced\n *                  by the dataset structure before it is passed\n *                  into this function will be lost.  
They\n *                  should be explicitly freed first if\n *                  necessary.\n * @param num_of_rows\n *                  Number of rows in the dataset\n * @param num_of_columns\n *                  Number of columns in the dataset\n * @param datatypes Array of datatypes, one per column (e.g. DATA_SET_DATA_TYPE_INT8)\n * @param column_keys\n *                  Array of pointers to null-terminated strings\n *                  giving names for each column (these strings\n *                  are copied into new allocations)\n * @param row_data  Array of row value structures\n *\n * @return Returns >= 0 on success, or negative on failure\n */\nint init_dataset(org_eclipse_tahu_protobuf_Payload_DataSet *dataset,\n                 uint64_t num_of_rows,\n                 uint64_t num_of_columns,\n                 const uint32_t datatypes[],\n                 const char *column_keys[],\n                 const org_eclipse_tahu_protobuf_Payload_DataSet_Row row_data[]);\n\n/**\n * Initialize a Metric with the values of the arguments passed in\n *\n * <p>Caution: When using datatype METRIC_DATA_TYPE_DATASET or\n * METRIC_DATA_TYPE_TEMPLATE, the structure passed in via value\n * is duplicated using a shallow copy, and it is expected that\n * any pointers within it are safe to pass to free(). This will\n * happen if pb_release() is called on the metric or any\n * structure referencing it, for example via a call to\n * free_payload().\n *\n * <p>When using other datatype values, no pointers are retained by the metric.\n *\n * <p>CAUTION: The underlying library will allocate memory as\n * needed when building the structure.  
On success, it will be\n * necessary to call free_payload() on the structure to release\n * those allocations.\n *\n * @param metric    Pointer to the metric data structure to initialize;\n *                  WARNING: any memory allocations referenced\n *                  by the metric structure before it is passed\n *                  into this function will be lost.  They\n *                  should be explicitly freed first if\n *                  necessary.\n * @param name      Pointer to null-terminated string giving name of metric; may be NULL if not using name field on this metric\n * @param has_alias Boolean indicating if the alias number should be included on the metric\n * @param alias     Alias number to use if has_alias is true\n * @param datatype  Datatype of the value (e.g. METRIC_DATA_TYPE_BOOLEAN)\n * @param is_historical\n *                  Boolean indicating if the is_historical flag should be set on this metric\n * @param is_transient\n *                  Boolean indicating if the is_transient flag should be set on this metric\n * @param value     Pointer to value to use for metric; may be NULL to set the is_null flag and not include a value\n * @param size_of_value\n *                  Size of data pointed to by value\n *\n * @return Returns >= 0 on success, or negative on failure\n */\nint init_metric(org_eclipse_tahu_protobuf_Payload_Metric *metric,\n                const char *name,\n                bool has_alias,\n                uint64_t alias,\n                uint64_t datatype,\n                bool is_historical,\n                bool is_transient,\n                const void *value,\n                size_t size_of_value);\n\n/**\n * Display a full Sparkplug Payload\n *\n * @param payload Pointer to the payload structure to display\n */\nvoid print_payload(org_eclipse_tahu_protobuf_Payload *payload);\n\n#ifdef __cplusplus\n} /* extern \"C\" */\n#endif\n\n#endif\n\n"
  },
  {
    "path": "c/core/include/tahu.pb.h",
    "content": "/* Automatically generated nanopb header */\n/* Generated by nanopb-0.4.1 */\n\n#ifndef PB_ORG_ECLIPSE_TAHU_PROTOBUF_TAHU_PB_H_INCLUDED\n#define PB_ORG_ECLIPSE_TAHU_PROTOBUF_TAHU_PB_H_INCLUDED\n#include <pb.h>\n\n#if PB_PROTO_HEADER_VERSION != 40\n#error Regenerate this file with the current version of nanopb generator.\n#endif\n\n#ifdef __cplusplus\nextern \"C\" {\n#endif\n\n/* Struct definitions */\ntypedef struct _org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension {\n    pb_extension_t *extensions;\n} org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_DataSet_Row {\n    pb_size_t elements_count;\n    struct _org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue *elements;\n    pb_extension_t *extensions;\n} org_eclipse_tahu_protobuf_Payload_DataSet_Row;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension {\n    pb_extension_t *extensions;\n} org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_PropertySet {\n    pb_size_t keys_count;\n    char **keys;\n    pb_size_t values_count;\n    struct _org_eclipse_tahu_protobuf_Payload_PropertyValue *values;\n    pb_extension_t *extensions;\n} org_eclipse_tahu_protobuf_Payload_PropertySet;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_PropertySetList {\n    pb_size_t propertyset_count;\n    struct _org_eclipse_tahu_protobuf_Payload_PropertySet *propertyset;\n    pb_extension_t *extensions;\n} org_eclipse_tahu_protobuf_Payload_PropertySetList;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension {\n    pb_extension_t *extensions;\n} org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension {\n    pb_extension_t *extensions;\n} 
org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload {\n    bool has_timestamp;\n    uint64_t timestamp;\n    pb_size_t metrics_count;\n    struct _org_eclipse_tahu_protobuf_Payload_Metric *metrics;\n    bool has_seq;\n    uint64_t seq;\n    char *uuid;\n    pb_bytes_array_t *body;\n    pb_extension_t *extensions;\n} org_eclipse_tahu_protobuf_Payload;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_DataSet {\n    bool has_num_of_columns;\n    uint64_t num_of_columns;\n    pb_size_t columns_count;\n    char **columns;\n    pb_size_t types_count;\n    uint32_t *types;\n    pb_size_t rows_count;\n    struct _org_eclipse_tahu_protobuf_Payload_DataSet_Row *rows;\n    pb_extension_t *extensions;\n} org_eclipse_tahu_protobuf_Payload_DataSet;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue {\n    pb_size_t which_value;\n    union {\n        uint32_t int_value;\n        uint64_t long_value;\n        float float_value;\n        double double_value;\n        bool boolean_value;\n        char *string_value;\n        org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension extension_value;\n    } value;\n} org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_MetaData {\n    bool has_is_multi_part;\n    bool is_multi_part;\n    char *content_type;\n    bool has_size;\n    uint64_t size;\n    bool has_seq;\n    uint64_t seq;\n    char *file_name;\n    char *file_type;\n    char *md5;\n    char *description;\n    pb_extension_t *extensions;\n} org_eclipse_tahu_protobuf_Payload_MetaData;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_PropertyValue {\n    bool has_type;\n    uint32_t type;\n    bool has_is_null;\n    bool is_null;\n    pb_size_t which_value;\n    union {\n        uint32_t int_value;\n        uint64_t long_value;\n        float float_value;\n        double double_value;\n        
bool boolean_value;\n        char *string_value;\n        org_eclipse_tahu_protobuf_Payload_PropertySet propertyset_value;\n        org_eclipse_tahu_protobuf_Payload_PropertySetList propertysets_value;\n        org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension extension_value;\n    } value;\n} org_eclipse_tahu_protobuf_Payload_PropertyValue;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_Template {\n    char *version;\n    pb_size_t metrics_count;\n    struct _org_eclipse_tahu_protobuf_Payload_Metric *metrics;\n    pb_size_t parameters_count;\n    struct _org_eclipse_tahu_protobuf_Payload_Template_Parameter *parameters;\n    char *template_ref;\n    bool has_is_definition;\n    bool is_definition;\n    pb_extension_t *extensions;\n} org_eclipse_tahu_protobuf_Payload_Template;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_Template_Parameter {\n    char *name;\n    bool has_type;\n    uint32_t type;\n    pb_size_t which_value;\n    union {\n        uint32_t int_value;\n        uint64_t long_value;\n        float float_value;\n        double double_value;\n        bool boolean_value;\n        char *string_value;\n        org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension extension_value;\n    } value;\n} org_eclipse_tahu_protobuf_Payload_Template_Parameter;\n\ntypedef struct _org_eclipse_tahu_protobuf_Payload_Metric {\n    char *name;\n    bool has_alias;\n    uint64_t alias;\n    bool has_timestamp;\n    uint64_t timestamp;\n    bool has_datatype;\n    uint32_t datatype;\n    bool has_is_historical;\n    bool is_historical;\n    bool has_is_transient;\n    bool is_transient;\n    bool has_is_null;\n    bool is_null;\n    bool has_metadata;\n    org_eclipse_tahu_protobuf_Payload_MetaData metadata;\n    bool has_properties;\n    org_eclipse_tahu_protobuf_Payload_PropertySet properties;\n    pb_size_t which_value;\n    union {\n        uint32_t int_value;\n        uint64_t long_value;\n        float 
float_value;\n        double double_value;\n        bool boolean_value;\n        char *string_value;\n        pb_bytes_array_t *bytes_value;\n        org_eclipse_tahu_protobuf_Payload_DataSet dataset_value;\n        org_eclipse_tahu_protobuf_Payload_Template template_value;\n        org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension extension_value;\n    } value;\n} org_eclipse_tahu_protobuf_Payload_Metric;\n\n\n/* Initializer values for message structs */\n#define org_eclipse_tahu_protobuf_Payload_init_default {false, 0, 0, NULL, false, 0, NULL, NULL, NULL}\n#define org_eclipse_tahu_protobuf_Payload_Template_init_default {NULL, 0, NULL, 0, NULL, NULL, false, 0, NULL}\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_init_default {NULL, false, 0, 0, {0}}\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_init_default {NULL}\n#define org_eclipse_tahu_protobuf_Payload_DataSet_init_default {false, 0, 0, NULL, 0, NULL, 0, NULL, NULL}\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_init_default {0, {0}}\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_init_default {NULL}\n#define org_eclipse_tahu_protobuf_Payload_DataSet_Row_init_default {0, NULL, NULL}\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_init_default {false, 0, false, 0, 0, {0}}\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_init_default {NULL}\n#define org_eclipse_tahu_protobuf_Payload_PropertySet_init_default {0, NULL, 0, NULL, NULL}\n#define org_eclipse_tahu_protobuf_Payload_PropertySetList_init_default {0, NULL, NULL}\n#define org_eclipse_tahu_protobuf_Payload_MetaData_init_default {false, 0, NULL, false, 0, false, 0, NULL, NULL, NULL, NULL, NULL}\n#define org_eclipse_tahu_protobuf_Payload_Metric_init_default {NULL, false, 0, false, 0, false, 0, false, 0, false, 0, false, 0, false, org_eclipse_tahu_protobuf_Payload_MetaData_init_default, false, 
org_eclipse_tahu_protobuf_Payload_PropertySet_init_default, 0, {0}}\n#define org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_init_default {NULL}\n#define org_eclipse_tahu_protobuf_Payload_init_zero {false, 0, 0, NULL, false, 0, NULL, NULL, NULL}\n#define org_eclipse_tahu_protobuf_Payload_Template_init_zero {NULL, 0, NULL, 0, NULL, NULL, false, 0, NULL}\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_init_zero {NULL, false, 0, 0, {0}}\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_init_zero {NULL}\n#define org_eclipse_tahu_protobuf_Payload_DataSet_init_zero {false, 0, 0, NULL, 0, NULL, 0, NULL, NULL}\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_init_zero {0, {0}}\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_init_zero {NULL}\n#define org_eclipse_tahu_protobuf_Payload_DataSet_Row_init_zero {0, NULL, NULL}\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_init_zero {false, 0, false, 0, 0, {0}}\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_init_zero {NULL}\n#define org_eclipse_tahu_protobuf_Payload_PropertySet_init_zero {0, NULL, 0, NULL, NULL}\n#define org_eclipse_tahu_protobuf_Payload_PropertySetList_init_zero {0, NULL, NULL}\n#define org_eclipse_tahu_protobuf_Payload_MetaData_init_zero {false, 0, NULL, false, 0, false, 0, NULL, NULL, NULL, NULL, NULL}\n#define org_eclipse_tahu_protobuf_Payload_Metric_init_zero {NULL, false, 0, false, 0, false, 0, false, 0, false, 0, false, 0, false, org_eclipse_tahu_protobuf_Payload_MetaData_init_zero, false, org_eclipse_tahu_protobuf_Payload_PropertySet_init_zero, 0, {0}}\n#define org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_init_zero {NULL}\n\n/* Field tags (for use in manual encoding/decoding) */\n#define org_eclipse_tahu_protobuf_Payload_DataSet_Row_elements_tag 1\n#define org_eclipse_tahu_protobuf_Payload_PropertySet_keys_tag 1\n#define 
org_eclipse_tahu_protobuf_Payload_PropertySet_values_tag 2\n#define org_eclipse_tahu_protobuf_Payload_PropertySetList_propertyset_tag 1\n#define org_eclipse_tahu_protobuf_Payload_timestamp_tag 1\n#define org_eclipse_tahu_protobuf_Payload_metrics_tag 2\n#define org_eclipse_tahu_protobuf_Payload_seq_tag 3\n#define org_eclipse_tahu_protobuf_Payload_uuid_tag 4\n#define org_eclipse_tahu_protobuf_Payload_body_tag 5\n#define org_eclipse_tahu_protobuf_Payload_DataSet_num_of_columns_tag 1\n#define org_eclipse_tahu_protobuf_Payload_DataSet_columns_tag 2\n#define org_eclipse_tahu_protobuf_Payload_DataSet_types_tag 3\n#define org_eclipse_tahu_protobuf_Payload_DataSet_rows_tag 4\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag 1\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_long_value_tag 2\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_float_value_tag 3\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_double_value_tag 4\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_boolean_value_tag 5\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_string_value_tag 6\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_extension_value_tag 7\n#define org_eclipse_tahu_protobuf_Payload_MetaData_is_multi_part_tag 1\n#define org_eclipse_tahu_protobuf_Payload_MetaData_content_type_tag 2\n#define org_eclipse_tahu_protobuf_Payload_MetaData_size_tag 3\n#define org_eclipse_tahu_protobuf_Payload_MetaData_seq_tag 4\n#define org_eclipse_tahu_protobuf_Payload_MetaData_file_name_tag 5\n#define org_eclipse_tahu_protobuf_Payload_MetaData_file_type_tag 6\n#define org_eclipse_tahu_protobuf_Payload_MetaData_md5_tag 7\n#define org_eclipse_tahu_protobuf_Payload_MetaData_description_tag 8\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_int_value_tag 3\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_long_value_tag 4\n#define 
org_eclipse_tahu_protobuf_Payload_PropertyValue_float_value_tag 5\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_double_value_tag 6\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_boolean_value_tag 7\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_string_value_tag 8\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_propertyset_value_tag 9\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_propertysets_value_tag 10\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_extension_value_tag 11\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_type_tag 1\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_is_null_tag 2\n#define org_eclipse_tahu_protobuf_Payload_Template_version_tag 1\n#define org_eclipse_tahu_protobuf_Payload_Template_metrics_tag 2\n#define org_eclipse_tahu_protobuf_Payload_Template_parameters_tag 3\n#define org_eclipse_tahu_protobuf_Payload_Template_template_ref_tag 4\n#define org_eclipse_tahu_protobuf_Payload_Template_is_definition_tag 5\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_int_value_tag 3\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_long_value_tag 4\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_float_value_tag 5\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_double_value_tag 6\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_boolean_value_tag 7\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_string_value_tag 8\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_extension_value_tag 9\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_name_tag 1\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_type_tag 2\n#define org_eclipse_tahu_protobuf_Payload_Metric_int_value_tag 10\n#define org_eclipse_tahu_protobuf_Payload_Metric_long_value_tag 11\n#define org_eclipse_tahu_protobuf_Payload_Metric_float_value_tag 12\n#define 
org_eclipse_tahu_protobuf_Payload_Metric_double_value_tag 13\n#define org_eclipse_tahu_protobuf_Payload_Metric_boolean_value_tag 14\n#define org_eclipse_tahu_protobuf_Payload_Metric_string_value_tag 15\n#define org_eclipse_tahu_protobuf_Payload_Metric_bytes_value_tag 16\n#define org_eclipse_tahu_protobuf_Payload_Metric_dataset_value_tag 17\n#define org_eclipse_tahu_protobuf_Payload_Metric_template_value_tag 18\n#define org_eclipse_tahu_protobuf_Payload_Metric_extension_value_tag 19\n#define org_eclipse_tahu_protobuf_Payload_Metric_name_tag 1\n#define org_eclipse_tahu_protobuf_Payload_Metric_alias_tag 2\n#define org_eclipse_tahu_protobuf_Payload_Metric_timestamp_tag 3\n#define org_eclipse_tahu_protobuf_Payload_Metric_datatype_tag 4\n#define org_eclipse_tahu_protobuf_Payload_Metric_is_historical_tag 5\n#define org_eclipse_tahu_protobuf_Payload_Metric_is_transient_tag 6\n#define org_eclipse_tahu_protobuf_Payload_Metric_is_null_tag 7\n#define org_eclipse_tahu_protobuf_Payload_Metric_metadata_tag 8\n#define org_eclipse_tahu_protobuf_Payload_Metric_properties_tag 9\n\n/* Struct field encoding specification for nanopb */\n#define org_eclipse_tahu_protobuf_Payload_FIELDLIST(X, a) \\\nX(a, STATIC,   OPTIONAL, UINT64,   timestamp,         1) \\\nX(a, POINTER,  REPEATED, MESSAGE,  metrics,           2) \\\nX(a, STATIC,   OPTIONAL, UINT64,   seq,               3) \\\nX(a, POINTER,  OPTIONAL, STRING,   uuid,              4) \\\nX(a, POINTER,  OPTIONAL, BYTES,    body,              5) \\\nX(a, CALLBACK, OPTIONAL, EXTENSION, extensions,        6)\n#define org_eclipse_tahu_protobuf_Payload_CALLBACK pb_default_field_callback\n#define org_eclipse_tahu_protobuf_Payload_DEFAULT NULL\n#define org_eclipse_tahu_protobuf_Payload_metrics_MSGTYPE org_eclipse_tahu_protobuf_Payload_Metric\n\n#define org_eclipse_tahu_protobuf_Payload_Template_FIELDLIST(X, a) \\\nX(a, POINTER,  OPTIONAL, STRING,   version,           1) \\\nX(a, POINTER,  REPEATED, MESSAGE,  metrics,           2) \\\nX(a, 
POINTER,  REPEATED, MESSAGE,  parameters,        3) \\\nX(a, POINTER,  OPTIONAL, STRING,   template_ref,      4) \\\nX(a, STATIC,   OPTIONAL, BOOL,     is_definition,     5) \\\nX(a, CALLBACK, OPTIONAL, EXTENSION, extensions,        6)\n#define org_eclipse_tahu_protobuf_Payload_Template_CALLBACK pb_default_field_callback\n#define org_eclipse_tahu_protobuf_Payload_Template_DEFAULT NULL\n#define org_eclipse_tahu_protobuf_Payload_Template_metrics_MSGTYPE org_eclipse_tahu_protobuf_Payload_Metric\n#define org_eclipse_tahu_protobuf_Payload_Template_parameters_MSGTYPE org_eclipse_tahu_protobuf_Payload_Template_Parameter\n\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_FIELDLIST(X, a) \\\nX(a, POINTER,  OPTIONAL, STRING,   name,              1) \\\nX(a, STATIC,   OPTIONAL, UINT32,   type,              2) \\\nX(a, STATIC,   ONEOF,    UINT32,   (value,int_value,value.int_value),   3) \\\nX(a, STATIC,   ONEOF,    UINT64,   (value,long_value,value.long_value),   4) \\\nX(a, STATIC,   ONEOF,    FLOAT,    (value,float_value,value.float_value),   5) \\\nX(a, STATIC,   ONEOF,    DOUBLE,   (value,double_value,value.double_value),   6) \\\nX(a, STATIC,   ONEOF,    BOOL,     (value,boolean_value,value.boolean_value),   7) \\\nX(a, POINTER,  ONEOF,    STRING,   (value,string_value,value.string_value),   8) \\\nX(a, STATIC,   ONEOF,    MESSAGE,  (value,extension_value,value.extension_value),   9)\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_CALLBACK NULL\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_DEFAULT NULL\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_value_extension_value_MSGTYPE org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension\n\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_FIELDLIST(X, a) \\\nX(a, CALLBACK, OPTIONAL, EXTENSION, extensions,        1)\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_CALLBACK 
pb_default_field_callback\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_DEFAULT NULL\n\n#define org_eclipse_tahu_protobuf_Payload_DataSet_FIELDLIST(X, a) \\\nX(a, STATIC,   OPTIONAL, UINT64,   num_of_columns,    1) \\\nX(a, POINTER,  REPEATED, STRING,   columns,           2) \\\nX(a, POINTER,  REPEATED, UINT32,   types,             3) \\\nX(a, POINTER,  REPEATED, MESSAGE,  rows,              4) \\\nX(a, CALLBACK, OPTIONAL, EXTENSION, extensions,        5)\n#define org_eclipse_tahu_protobuf_Payload_DataSet_CALLBACK pb_default_field_callback\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DEFAULT NULL\n#define org_eclipse_tahu_protobuf_Payload_DataSet_rows_MSGTYPE org_eclipse_tahu_protobuf_Payload_DataSet_Row\n\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_FIELDLIST(X, a) \\\nX(a, STATIC,   ONEOF,    UINT32,   (value,int_value,value.int_value),   1) \\\nX(a, STATIC,   ONEOF,    UINT64,   (value,long_value,value.long_value),   2) \\\nX(a, STATIC,   ONEOF,    FLOAT,    (value,float_value,value.float_value),   3) \\\nX(a, STATIC,   ONEOF,    DOUBLE,   (value,double_value,value.double_value),   4) \\\nX(a, STATIC,   ONEOF,    BOOL,     (value,boolean_value,value.boolean_value),   5) \\\nX(a, POINTER,  ONEOF,    STRING,   (value,string_value,value.string_value),   6) \\\nX(a, STATIC,   ONEOF,    MESSAGE,  (value,extension_value,value.extension_value),   7)\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_CALLBACK NULL\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DEFAULT NULL\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_value_extension_value_MSGTYPE org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension\n\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_FIELDLIST(X, a) \\\nX(a, CALLBACK, OPTIONAL, EXTENSION, extensions,        1)\n#define 
org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_CALLBACK pb_default_field_callback\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_DEFAULT NULL\n\n#define org_eclipse_tahu_protobuf_Payload_DataSet_Row_FIELDLIST(X, a) \\\nX(a, POINTER,  REPEATED, MESSAGE,  elements,          1) \\\nX(a, CALLBACK, OPTIONAL, EXTENSION, extensions,        2)\n#define org_eclipse_tahu_protobuf_Payload_DataSet_Row_CALLBACK pb_default_field_callback\n#define org_eclipse_tahu_protobuf_Payload_DataSet_Row_DEFAULT NULL\n#define org_eclipse_tahu_protobuf_Payload_DataSet_Row_elements_MSGTYPE org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue\n\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_FIELDLIST(X, a) \\\nX(a, STATIC,   OPTIONAL, UINT32,   type,              1) \\\nX(a, STATIC,   OPTIONAL, BOOL,     is_null,           2) \\\nX(a, STATIC,   ONEOF,    UINT32,   (value,int_value,value.int_value),   3) \\\nX(a, STATIC,   ONEOF,    UINT64,   (value,long_value,value.long_value),   4) \\\nX(a, STATIC,   ONEOF,    FLOAT,    (value,float_value,value.float_value),   5) \\\nX(a, STATIC,   ONEOF,    DOUBLE,   (value,double_value,value.double_value),   6) \\\nX(a, STATIC,   ONEOF,    BOOL,     (value,boolean_value,value.boolean_value),   7) \\\nX(a, POINTER,  ONEOF,    STRING,   (value,string_value,value.string_value),   8) \\\nX(a, STATIC,   ONEOF,    MESSAGE,  (value,propertyset_value,value.propertyset_value),   9) \\\nX(a, STATIC,   ONEOF,    MESSAGE,  (value,propertysets_value,value.propertysets_value),  10) \\\nX(a, STATIC,   ONEOF,    MESSAGE,  (value,extension_value,value.extension_value),  11)\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_CALLBACK NULL\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_DEFAULT NULL\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_value_propertyset_value_MSGTYPE org_eclipse_tahu_protobuf_Payload_PropertySet\n#define 
org_eclipse_tahu_protobuf_Payload_PropertyValue_value_propertysets_value_MSGTYPE org_eclipse_tahu_protobuf_Payload_PropertySetList\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_value_extension_value_MSGTYPE org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension\n\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_FIELDLIST(X, a) \\\nX(a, CALLBACK, OPTIONAL, EXTENSION, extensions,        1)\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_CALLBACK pb_default_field_callback\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_DEFAULT NULL\n\n#define org_eclipse_tahu_protobuf_Payload_PropertySet_FIELDLIST(X, a) \\\nX(a, POINTER,  REPEATED, STRING,   keys,              1) \\\nX(a, POINTER,  REPEATED, MESSAGE,  values,            2) \\\nX(a, CALLBACK, OPTIONAL, EXTENSION, extensions,        3)\n#define org_eclipse_tahu_protobuf_Payload_PropertySet_CALLBACK pb_default_field_callback\n#define org_eclipse_tahu_protobuf_Payload_PropertySet_DEFAULT NULL\n#define org_eclipse_tahu_protobuf_Payload_PropertySet_values_MSGTYPE org_eclipse_tahu_protobuf_Payload_PropertyValue\n\n#define org_eclipse_tahu_protobuf_Payload_PropertySetList_FIELDLIST(X, a) \\\nX(a, POINTER,  REPEATED, MESSAGE,  propertyset,       1) \\\nX(a, CALLBACK, OPTIONAL, EXTENSION, extensions,        2)\n#define org_eclipse_tahu_protobuf_Payload_PropertySetList_CALLBACK pb_default_field_callback\n#define org_eclipse_tahu_protobuf_Payload_PropertySetList_DEFAULT NULL\n#define org_eclipse_tahu_protobuf_Payload_PropertySetList_propertyset_MSGTYPE org_eclipse_tahu_protobuf_Payload_PropertySet\n\n#define org_eclipse_tahu_protobuf_Payload_MetaData_FIELDLIST(X, a) \\\nX(a, STATIC,   OPTIONAL, BOOL,     is_multi_part,     1) \\\nX(a, POINTER,  OPTIONAL, STRING,   content_type,      2) \\\nX(a, STATIC,   OPTIONAL, UINT64,   size,              3) \\\nX(a, STATIC,   OPTIONAL, UINT64,   seq,               4) 
\\\nX(a, POINTER,  OPTIONAL, STRING,   file_name,         5) \\\nX(a, POINTER,  OPTIONAL, STRING,   file_type,         6) \\\nX(a, POINTER,  OPTIONAL, STRING,   md5,               7) \\\nX(a, POINTER,  OPTIONAL, STRING,   description,       8) \\\nX(a, CALLBACK, OPTIONAL, EXTENSION, extensions,        9)\n#define org_eclipse_tahu_protobuf_Payload_MetaData_CALLBACK pb_default_field_callback\n#define org_eclipse_tahu_protobuf_Payload_MetaData_DEFAULT NULL\n\n#define org_eclipse_tahu_protobuf_Payload_Metric_FIELDLIST(X, a) \\\nX(a, POINTER,  OPTIONAL, STRING,   name,              1) \\\nX(a, STATIC,   OPTIONAL, UINT64,   alias,             2) \\\nX(a, STATIC,   OPTIONAL, UINT64,   timestamp,         3) \\\nX(a, STATIC,   OPTIONAL, UINT32,   datatype,          4) \\\nX(a, STATIC,   OPTIONAL, BOOL,     is_historical,     5) \\\nX(a, STATIC,   OPTIONAL, BOOL,     is_transient,      6) \\\nX(a, STATIC,   OPTIONAL, BOOL,     is_null,           7) \\\nX(a, STATIC,   OPTIONAL, MESSAGE,  metadata,          8) \\\nX(a, STATIC,   OPTIONAL, MESSAGE,  properties,        9) \\\nX(a, STATIC,   ONEOF,    UINT32,   (value,int_value,value.int_value),  10) \\\nX(a, STATIC,   ONEOF,    UINT64,   (value,long_value,value.long_value),  11) \\\nX(a, STATIC,   ONEOF,    FLOAT,    (value,float_value,value.float_value),  12) \\\nX(a, STATIC,   ONEOF,    DOUBLE,   (value,double_value,value.double_value),  13) \\\nX(a, STATIC,   ONEOF,    BOOL,     (value,boolean_value,value.boolean_value),  14) \\\nX(a, POINTER,  ONEOF,    STRING,   (value,string_value,value.string_value),  15) \\\nX(a, POINTER,  ONEOF,    BYTES,    (value,bytes_value,value.bytes_value),  16) \\\nX(a, STATIC,   ONEOF,    MESSAGE,  (value,dataset_value,value.dataset_value),  17) \\\nX(a, STATIC,   ONEOF,    MESSAGE,  (value,template_value,value.template_value),  18) \\\nX(a, STATIC,   ONEOF,    MESSAGE,  (value,extension_value,value.extension_value),  19)\n#define org_eclipse_tahu_protobuf_Payload_Metric_CALLBACK NULL\n#define 
org_eclipse_tahu_protobuf_Payload_Metric_DEFAULT NULL\n#define org_eclipse_tahu_protobuf_Payload_Metric_metadata_MSGTYPE org_eclipse_tahu_protobuf_Payload_MetaData\n#define org_eclipse_tahu_protobuf_Payload_Metric_properties_MSGTYPE org_eclipse_tahu_protobuf_Payload_PropertySet\n#define org_eclipse_tahu_protobuf_Payload_Metric_value_dataset_value_MSGTYPE org_eclipse_tahu_protobuf_Payload_DataSet\n#define org_eclipse_tahu_protobuf_Payload_Metric_value_template_value_MSGTYPE org_eclipse_tahu_protobuf_Payload_Template\n#define org_eclipse_tahu_protobuf_Payload_Metric_value_extension_value_MSGTYPE org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension\n\n#define org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_FIELDLIST(X, a) \\\nX(a, CALLBACK, OPTIONAL, EXTENSION, extensions,        1)\n#define org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_CALLBACK pb_default_field_callback\n#define org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_DEFAULT NULL\n\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_Template_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_Template_Parameter_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_DataSet_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_DataSet_Row_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_PropertyValue_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_PropertySet_msg;\nextern const pb_msgdesc_t 
org_eclipse_tahu_protobuf_Payload_PropertySetList_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_MetaData_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_Metric_msg;\nextern const pb_msgdesc_t org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_msg;\n\n/* Defines for backwards compatibility with code written before nanopb-0.4.0 */\n#define org_eclipse_tahu_protobuf_Payload_fields &org_eclipse_tahu_protobuf_Payload_msg\n#define org_eclipse_tahu_protobuf_Payload_Template_fields &org_eclipse_tahu_protobuf_Payload_Template_msg\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_fields &org_eclipse_tahu_protobuf_Payload_Template_Parameter_msg\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_fields &org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_msg\n#define org_eclipse_tahu_protobuf_Payload_DataSet_fields &org_eclipse_tahu_protobuf_Payload_DataSet_msg\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_fields &org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_msg\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_fields &org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_msg\n#define org_eclipse_tahu_protobuf_Payload_DataSet_Row_fields &org_eclipse_tahu_protobuf_Payload_DataSet_Row_msg\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_fields &org_eclipse_tahu_protobuf_Payload_PropertyValue_msg\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_fields &org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_msg\n#define org_eclipse_tahu_protobuf_Payload_PropertySet_fields &org_eclipse_tahu_protobuf_Payload_PropertySet_msg\n#define org_eclipse_tahu_protobuf_Payload_PropertySetList_fields &org_eclipse_tahu_protobuf_Payload_PropertySetList_msg\n#define org_eclipse_tahu_protobuf_Payload_MetaData_fields 
&org_eclipse_tahu_protobuf_Payload_MetaData_msg\n#define org_eclipse_tahu_protobuf_Payload_Metric_fields &org_eclipse_tahu_protobuf_Payload_Metric_msg\n#define org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_fields &org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_msg\n\n/* Maximum encoded size of messages (where known) */\n/* org_eclipse_tahu_protobuf_Payload_size depends on runtime parameters */\n/* org_eclipse_tahu_protobuf_Payload_Template_size depends on runtime parameters */\n/* org_eclipse_tahu_protobuf_Payload_Template_Parameter_size depends on runtime parameters */\n#define org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_size 0\n/* org_eclipse_tahu_protobuf_Payload_DataSet_size depends on runtime parameters */\n/* org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_size depends on runtime parameters */\n#define org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_size 0\n/* org_eclipse_tahu_protobuf_Payload_DataSet_Row_size depends on runtime parameters */\n/* org_eclipse_tahu_protobuf_Payload_PropertyValue_size depends on runtime parameters */\n#define org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_size 0\n/* org_eclipse_tahu_protobuf_Payload_PropertySet_size depends on runtime parameters */\n/* org_eclipse_tahu_protobuf_Payload_PropertySetList_size depends on runtime parameters */\n/* org_eclipse_tahu_protobuf_Payload_MetaData_size depends on runtime parameters */\n/* org_eclipse_tahu_protobuf_Payload_Metric_size depends on runtime parameters */\n#define org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_size 0\n\n#ifdef __cplusplus\n} /* extern \"C\" */\n#endif\n\n#endif\n"
  },
  {
    "path": "c/core/readme.txt",
    "content": "#/********************************************************************************\n# * Copyright (c) 2014-2019 Cirrus Link Solutions and others\n# *\n# * This program and the accompanying materials are made available under the\n# * terms of the Eclipse Public License 2.0 which is available at\n# * http://www.eclipse.org/legal/epl-2.0.\n# *\n# * SPDX-License-Identifier: EPL-2.0\n# *\n# * Contributors:\n# *   Cirrus Link Solutions - initial implementation\n# ********************************************************************************/\n\n# To generate the base protobuf tahu NanoPB C library (using Protoc v2.6.1 and Nanopb v0.3.5)\nprotoc --proto_path=../../ -otahu.pb ../../sparkplug_b/sparkplug_b.proto \n~/nanopb/nanopb-0.3.5-linux-x86/generator/nanopb_generator.py -f tahu.options tahu.pb\nmv tahu.pb src/\nmv tahu.pb.c src/\nmv tahu.pb.h include/\n"
  },
  {
    "path": "c/core/src/pb_common.c",
    "content": "/* pb_common.c: Common support functions for pb_encode.c and pb_decode.c.\n *\n * 2014 Petteri Aimonen <jpa@kapsi.fi>\n */\n\n#include \"pb_common.h\"\n\nstatic bool load_descriptor_values(pb_field_iter_t *iter)\n{\n    uint32_t word0;\n    uint32_t data_offset;\n    uint_least8_t format;\n    int_least8_t size_offset;\n\n    if (iter->index >= iter->descriptor->field_count)\n        return false;\n\n    word0 = PB_PROGMEM_READU32(iter->descriptor->field_info[iter->field_info_index]);\n    format = word0 & 3;\n    iter->tag = (pb_size_t)((word0 >> 2) & 0x3F);\n    iter->type = (pb_type_t)((word0 >> 8) & 0xFF);\n\n    if (format == 0)\n    {\n        /* 1-word format */\n        iter->array_size = 1;\n        size_offset = (int_least8_t)((word0 >> 24) & 0x0F);\n        data_offset = (word0 >> 16) & 0xFF;\n        iter->data_size = (pb_size_t)((word0 >> 28) & 0x0F);\n    }\n    else if (format == 1)\n    {\n        /* 2-word format */\n        uint32_t word1 = PB_PROGMEM_READU32(iter->descriptor->field_info[iter->field_info_index + 1]);\n\n        iter->array_size = (pb_size_t)((word0 >> 16) & 0x0FFF);\n        iter->tag = (pb_size_t)(iter->tag | ((word1 >> 28) << 6));\n        size_offset = (int_least8_t)((word0 >> 28) & 0x0F);\n        data_offset = word1 & 0xFFFF;\n        iter->data_size = (pb_size_t)((word1 >> 16) & 0x0FFF);\n    }\n    else if (format == 2)\n    {\n        /* 4-word format */\n        uint32_t word1 = PB_PROGMEM_READU32(iter->descriptor->field_info[iter->field_info_index + 1]);\n        uint32_t word2 = PB_PROGMEM_READU32(iter->descriptor->field_info[iter->field_info_index + 2]);\n        uint32_t word3 = PB_PROGMEM_READU32(iter->descriptor->field_info[iter->field_info_index + 3]);\n\n        iter->array_size = (pb_size_t)(word0 >> 16);\n        iter->tag = (pb_size_t)(iter->tag | ((word1 >> 8) << 6));\n        size_offset = (int_least8_t)(word1 & 0xFF);\n        data_offset = word2;\n        iter->data_size = 
(pb_size_t)word3;\n    }\n    else\n    {\n        /* 8-word format */\n        uint32_t word1 = PB_PROGMEM_READU32(iter->descriptor->field_info[iter->field_info_index + 1]);\n        uint32_t word2 = PB_PROGMEM_READU32(iter->descriptor->field_info[iter->field_info_index + 2]);\n        uint32_t word3 = PB_PROGMEM_READU32(iter->descriptor->field_info[iter->field_info_index + 3]);\n        uint32_t word4 = PB_PROGMEM_READU32(iter->descriptor->field_info[iter->field_info_index + 4]);\n\n        iter->array_size = (pb_size_t)word4;\n        iter->tag = (pb_size_t)(iter->tag | ((word1 >> 8) << 6));\n        size_offset = (int_least8_t)(word1 & 0xFF);\n        data_offset = word2;\n        iter->data_size = (pb_size_t)word3;\n    }\n\n    iter->pField = (char*)iter->message + data_offset;\n\n    if (size_offset)\n    {\n        iter->pSize = (char*)iter->pField - size_offset;\n    }\n    else if (PB_HTYPE(iter->type) == PB_HTYPE_REPEATED &&\n             (PB_ATYPE(iter->type) == PB_ATYPE_STATIC ||\n              PB_ATYPE(iter->type) == PB_ATYPE_POINTER))\n    {\n        /* Fixed count array */\n        iter->pSize = &iter->array_size;\n    }\n    else\n    {\n        iter->pSize = NULL;\n    }\n\n    if (PB_ATYPE(iter->type) == PB_ATYPE_POINTER && iter->pField != NULL)\n    {\n        iter->pData = *(void**)iter->pField;\n    }\n    else\n    {\n        iter->pData = iter->pField;\n    }\n\n    if (PB_LTYPE_IS_SUBMSG(iter->type))\n    {\n        iter->submsg_desc = iter->descriptor->submsg_info[iter->submessage_index];\n    }\n    else\n    {\n        iter->submsg_desc = NULL;\n    }\n\n    return true;\n}\n\nstatic void advance_iterator(pb_field_iter_t *iter)\n{\n    iter->index++;\n\n    if (iter->index >= iter->descriptor->field_count)\n    {\n        /* Restart */\n        iter->index = 0;\n        iter->field_info_index = 0;\n        iter->submessage_index = 0;\n        iter->required_field_index = 0;\n    }\n    else\n    {\n        /* Increment indexes based on 
previous field type.\n         * All field info formats have the following fields:\n         * - lowest 2 bits tell the amount of words in the descriptor (2^n words)\n         * - bits 2..7 give the lowest bits of tag number.\n         * - bits 8..15 give the field type.\n         */\n        uint32_t prev_descriptor = PB_PROGMEM_READU32(iter->descriptor->field_info[iter->field_info_index]);\n        pb_type_t prev_type = (prev_descriptor >> 8) & 0xFF;\n        pb_size_t descriptor_len = (pb_size_t)(1 << (prev_descriptor & 3));\n\n        iter->field_info_index = (pb_size_t)(iter->field_info_index + descriptor_len);\n\n        if (PB_HTYPE(prev_type) == PB_HTYPE_REQUIRED)\n        {\n            iter->required_field_index++;\n        }\n\n        if (PB_LTYPE_IS_SUBMSG(prev_type))\n        {\n            iter->submessage_index++;\n        }\n    }\n}\n\nbool pb_field_iter_begin(pb_field_iter_t *iter, const pb_msgdesc_t *desc, void *message)\n{\n    memset(iter, 0, sizeof(*iter));\n\n    iter->descriptor = desc;\n    iter->message = message;\n\n    return load_descriptor_values(iter);\n}\n\nbool pb_field_iter_begin_extension(pb_field_iter_t *iter, pb_extension_t *extension)\n{\n    const pb_msgdesc_t *msg = (const pb_msgdesc_t*)extension->type->arg;\n    bool status;\n\n    uint32_t word0 = PB_PROGMEM_READU32(msg->field_info[0]);\n    if (PB_ATYPE(word0 >> 8) == PB_ATYPE_POINTER)\n    {\n        /* For pointer extensions, the pointer is stored directly\n         * in the extension structure. This avoids having an extra\n         * indirection. 
*/\n        status = pb_field_iter_begin(iter, msg, &extension->dest);\n    }\n    else\n    {\n        status = pb_field_iter_begin(iter, msg, extension->dest);\n    }\n\n    iter->pSize = &extension->found;\n    return status;\n}\n\nbool pb_field_iter_next(pb_field_iter_t *iter)\n{\n    advance_iterator(iter);\n    (void)load_descriptor_values(iter);\n    return iter->index != 0;\n}\n\nbool pb_field_iter_find(pb_field_iter_t *iter, uint32_t tag)\n{\n    if (iter->tag == tag)\n    {\n        return true; /* Nothing to do, correct field already. */\n    }\n    else\n    {\n        pb_size_t start = iter->index;\n        uint32_t fieldinfo;\n\n        do\n        {\n            /* Advance iterator but don't load values yet */\n            advance_iterator(iter);\n\n            /* Do fast check for tag number match */\n            fieldinfo = PB_PROGMEM_READU32(iter->descriptor->field_info[iter->field_info_index]);\n\n            if (((fieldinfo >> 2) & 0x3F) == (tag & 0x3F))\n            {\n                /* Good candidate, check further */\n                (void)load_descriptor_values(iter);\n\n                if (iter->tag == tag &&\n                    PB_LTYPE(iter->type) != PB_LTYPE_EXTENSION)\n                {\n                    /* Found it */\n                    return true;\n                }\n            }\n        } while (iter->index != start);\n\n        /* Searched all the way back to start, and found nothing. */\n        (void)load_descriptor_values(iter);\n        return false;\n    }\n}\n\nstatic void *pb_const_cast(const void *p)\n{\n    /* Note: this casts away const, in order to use the common field iterator\n     * logic for both encoding and decoding. The cast is done using union\n     * to avoid spurious compiler warnings. 
*/\n    union {\n        void *p1;\n        const void *p2;\n    } t;\n    t.p2 = p;\n    return t.p1;\n}\n\nbool pb_field_iter_begin_const(pb_field_iter_t *iter, const pb_msgdesc_t *desc, const void *message)\n{\n    return pb_field_iter_begin(iter, desc, pb_const_cast(message));\n}\n\nbool pb_field_iter_begin_extension_const(pb_field_iter_t *iter, const pb_extension_t *extension)\n{\n    return pb_field_iter_begin_extension(iter, (pb_extension_t*)pb_const_cast(extension));\n}\n\nbool pb_default_field_callback(pb_istream_t *istream, pb_ostream_t *ostream, const pb_field_t *field)\n{\n    if (field->data_size == sizeof(pb_callback_t))\n    {\n        pb_callback_t *pCallback = (pb_callback_t*)field->pData;\n\n        if (pCallback != NULL)\n        {\n            if (istream != NULL && pCallback->funcs.decode != NULL)\n            {\n                return pCallback->funcs.decode(istream, field, &pCallback->arg);\n            }\n\n            if (ostream != NULL && pCallback->funcs.encode != NULL)\n            {\n                return pCallback->funcs.encode(ostream, field, &pCallback->arg);\n            }\n        }\n    }\n\n    return true; /* Success, but didn't do anything */\n\n}\n\n#ifdef PB_VALIDATE_UTF8\n\n/* This function checks whether a string is valid UTF-8 text.\n *\n * Algorithm is adapted from https://www.cl.cam.ac.uk/~mgk25/ucs/utf8_check.c\n * Original copyright: Markus Kuhn <http://www.cl.cam.ac.uk/~mgk25/> 2005-03-30\n * Licensed under \"Short code license\", which allows use under MIT license or\n * any compatible with it.\n */\n\nbool pb_validate_utf8(const char *str)\n{\n    const pb_byte_t *s = (const pb_byte_t*)str;\n    while (*s)\n    {\n        if (*s < 0x80)\n        {\n            /* 0xxxxxxx */\n            s++;\n        }\n        else if ((s[0] & 0xe0) == 0xc0)\n        {\n            /* 110XXXXx 10xxxxxx */\n            if ((s[1] & 0xc0) != 0x80 ||\n                (s[0] & 0xfe) == 0xc0)                        /* overlong? 
*/\n                return false;\n            else\n                s += 2;\n        }\n        else if ((s[0] & 0xf0) == 0xe0)\n        {\n            /* 1110XXXX 10Xxxxxx 10xxxxxx */\n            if ((s[1] & 0xc0) != 0x80 ||\n                (s[2] & 0xc0) != 0x80 ||\n                (s[0] == 0xe0 && (s[1] & 0xe0) == 0x80) ||    /* overlong? */\n                (s[0] == 0xed && (s[1] & 0xe0) == 0xa0) ||    /* surrogate? */\n                (s[0] == 0xef && s[1] == 0xbf &&\n                (s[2] & 0xfe) == 0xbe))                 /* U+FFFE or U+FFFF? */\n                return false;\n            else\n                s += 3;\n        }\n        else if ((s[0] & 0xf8) == 0xf0)\n        {\n            /* 11110XXX 10XXxxxx 10xxxxxx 10xxxxxx */\n            if ((s[1] & 0xc0) != 0x80 ||\n                (s[2] & 0xc0) != 0x80 ||\n                (s[3] & 0xc0) != 0x80 ||\n                (s[0] == 0xf0 && (s[1] & 0xf0) == 0x80) ||    /* overlong? */\n                (s[0] == 0xf4 && s[1] > 0x8f) || s[0] > 0xf4) /* > U+10FFFF? */\n                return false;\n            else\n                s += 4;\n        }\n        else\n        {\n            return false;\n        }\n    }\n\n    return true;\n}\n\n#endif\n\n"
  },
  {
    "path": "c/core/src/pb_decode.c",
    "content": "/* pb_decode.c -- decode a protobuf using minimal resources\n *\n * 2011 Petteri Aimonen <jpa@kapsi.fi>\n */\n\n/* Use the GCC warn_unused_result attribute to check that all return values\n * are propagated correctly. On other compilers and gcc before 3.4.0 just\n * ignore the annotation.\n */\n#if !defined(__GNUC__) || ( __GNUC__ < 3) || (__GNUC__ == 3 && __GNUC_MINOR__ < 4)\n    #define checkreturn\n#else\n    #define checkreturn __attribute__((warn_unused_result))\n#endif\n\n#include \"pb.h\"\n#include \"pb_decode.h\"\n#include \"pb_common.h\"\n\n/**************************************\n * Declarations internal to this file *\n **************************************/\n\nstatic bool checkreturn buf_read(pb_istream_t *stream, pb_byte_t *buf, size_t count);\nstatic bool checkreturn pb_decode_varint32_eof(pb_istream_t *stream, uint32_t *dest, bool *eof);\nstatic bool checkreturn read_raw_value(pb_istream_t *stream, pb_wire_type_t wire_type, pb_byte_t *buf, size_t *size);\nstatic bool checkreturn check_wire_type(pb_wire_type_t wire_type, pb_field_iter_t *field);\nstatic bool checkreturn decode_basic_field(pb_istream_t *stream, pb_field_iter_t *field);\nstatic bool checkreturn decode_static_field(pb_istream_t *stream, pb_wire_type_t wire_type, pb_field_iter_t *field);\nstatic bool checkreturn decode_pointer_field(pb_istream_t *stream, pb_wire_type_t wire_type, pb_field_iter_t *field);\nstatic bool checkreturn decode_callback_field(pb_istream_t *stream, pb_wire_type_t wire_type, pb_field_iter_t *field);\nstatic bool checkreturn decode_field(pb_istream_t *stream, pb_wire_type_t wire_type, pb_field_iter_t *field);\nstatic bool checkreturn default_extension_decoder(pb_istream_t *stream, pb_extension_t *extension, uint32_t tag, pb_wire_type_t wire_type);\nstatic bool checkreturn decode_extension(pb_istream_t *stream, uint32_t tag, pb_wire_type_t wire_type, pb_field_iter_t *iter);\nstatic bool checkreturn find_extension_field(pb_field_iter_t *iter);\nstatic 
bool pb_message_set_to_defaults(pb_field_iter_t *iter);\nstatic bool checkreturn pb_dec_bool(pb_istream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn pb_dec_varint(pb_istream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn pb_dec_fixed(pb_istream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn pb_dec_bytes(pb_istream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn pb_dec_string(pb_istream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn pb_dec_submessage(pb_istream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn pb_dec_fixed_length_bytes(pb_istream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn pb_skip_varint(pb_istream_t *stream);\nstatic bool checkreturn pb_skip_string(pb_istream_t *stream);\n\n#ifdef PB_ENABLE_MALLOC\nstatic bool checkreturn allocate_field(pb_istream_t *stream, void *pData, size_t data_size, size_t array_size);\nstatic void initialize_pointer_field(void *pItem, pb_field_iter_t *field);\nstatic bool checkreturn pb_release_union_field(pb_istream_t *stream, pb_field_iter_t *field);\nstatic void pb_release_single_field(pb_field_iter_t *field);\n#endif\n\n#ifdef PB_WITHOUT_64BIT\n#define pb_int64_t int32_t\n#define pb_uint64_t uint32_t\n#else\n#define pb_int64_t int64_t\n#define pb_uint64_t uint64_t\n#endif\n\ntypedef struct {\n    uint32_t bitfield[(PB_MAX_REQUIRED_FIELDS + 31) / 32];\n} pb_fields_seen_t;\n\n/*******************************\n * pb_istream_t implementation *\n *******************************/\n\nstatic bool checkreturn buf_read(pb_istream_t *stream, pb_byte_t *buf, size_t count)\n{\n    size_t i;\n    const pb_byte_t *source = (const pb_byte_t*)stream->state;\n    stream->state = (pb_byte_t*)stream->state + count;\n    \n    if (buf != NULL)\n    {\n        for (i = 0; i < count; i++)\n            buf[i] = source[i];\n    }\n    \n    return true;\n}\n\nbool checkreturn pb_read(pb_istream_t *stream, 
pb_byte_t *buf, size_t count)\n{\n    if (count == 0)\n        return true;\n\n#ifndef PB_BUFFER_ONLY\n    if (buf == NULL && stream->callback != buf_read)\n    {\n        /* Skip input bytes */\n        pb_byte_t tmp[16];\n        while (count > 16)\n        {\n            if (!pb_read(stream, tmp, 16))\n                return false;\n\n            count -= 16;\n        }\n\n        return pb_read(stream, tmp, count);\n    }\n#endif\n\n    if (stream->bytes_left < count)\n        PB_RETURN_ERROR(stream, \"end-of-stream\");\n    \n#ifndef PB_BUFFER_ONLY\n    if (!stream->callback(stream, buf, count))\n        PB_RETURN_ERROR(stream, \"io error\");\n#else\n    if (!buf_read(stream, buf, count))\n        return false;\n#endif\n    \n    stream->bytes_left -= count;\n    return true;\n}\n\n/* Read a single byte from input stream. buf may not be NULL.\n * This is an optimization for the varint decoding. */\nstatic bool checkreturn pb_readbyte(pb_istream_t *stream, pb_byte_t *buf)\n{\n    if (stream->bytes_left == 0)\n        PB_RETURN_ERROR(stream, \"end-of-stream\");\n\n#ifndef PB_BUFFER_ONLY\n    if (!stream->callback(stream, buf, 1))\n        PB_RETURN_ERROR(stream, \"io error\");\n#else\n    *buf = *(const pb_byte_t*)stream->state;\n    stream->state = (pb_byte_t*)stream->state + 1;\n#endif\n\n    stream->bytes_left--;\n    \n    return true;    \n}\n\npb_istream_t pb_istream_from_buffer(const pb_byte_t *buf, size_t bufsize)\n{\n    pb_istream_t stream;\n    /* Cast away the const from buf without a compiler error.  
We are\n     * careful to use it only in a const manner in the callbacks.\n     */\n    union {\n        void *state;\n        const void *c_state;\n    } state;\n#ifdef PB_BUFFER_ONLY\n    stream.callback = NULL;\n#else\n    stream.callback = &buf_read;\n#endif\n    state.c_state = buf;\n    stream.state = state.state;\n    stream.bytes_left = bufsize;\n#ifndef PB_NO_ERRMSG\n    stream.errmsg = NULL;\n#endif\n    return stream;\n}\n\n/********************\n * Helper functions *\n ********************/\n\nstatic bool checkreturn pb_decode_varint32_eof(pb_istream_t *stream, uint32_t *dest, bool *eof)\n{\n    pb_byte_t byte;\n    uint32_t result;\n    \n    if (!pb_readbyte(stream, &byte))\n    {\n        if (stream->bytes_left == 0)\n        {\n            if (eof)\n            {\n                *eof = true;\n            }\n        }\n\n        return false;\n    }\n    \n    if ((byte & 0x80) == 0)\n    {\n        /* Quick case, 1 byte value */\n        result = byte;\n    }\n    else\n    {\n        /* Multibyte case */\n        uint_fast8_t bitpos = 7;\n        result = byte & 0x7F;\n        \n        do\n        {\n            if (!pb_readbyte(stream, &byte))\n                return false;\n            \n            if (bitpos >= 32)\n            {\n                /* Note: The varint could have trailing 0x80 bytes, or 0xFF for negative. */\n                pb_byte_t sign_extension = (bitpos < 63) ? 
0xFF : 0x01;\n                \n                if ((byte & 0x7F) != 0x00 && ((result >> 31) == 0 || byte != sign_extension))\n                {\n                    PB_RETURN_ERROR(stream, \"varint overflow\");\n                }\n            }\n            else\n            {\n                result |= (uint32_t)(byte & 0x7F) << bitpos;\n            }\n            bitpos = (uint_fast8_t)(bitpos + 7);\n        } while (byte & 0x80);\n        \n        if (bitpos == 35 && (byte & 0x70) != 0)\n        {\n            /* The last byte was at bitpos=28, so only bottom 4 bits fit. */\n            PB_RETURN_ERROR(stream, \"varint overflow\");\n        }\n   }\n   \n   *dest = result;\n   return true;\n}\n\nbool checkreturn pb_decode_varint32(pb_istream_t *stream, uint32_t *dest)\n{\n    return pb_decode_varint32_eof(stream, dest, NULL);\n}\n\n#ifndef PB_WITHOUT_64BIT\nbool checkreturn pb_decode_varint(pb_istream_t *stream, uint64_t *dest)\n{\n    pb_byte_t byte;\n    uint_fast8_t bitpos = 0;\n    uint64_t result = 0;\n    \n    do\n    {\n        if (bitpos >= 64)\n            PB_RETURN_ERROR(stream, \"varint overflow\");\n        \n        if (!pb_readbyte(stream, &byte))\n            return false;\n\n        result |= (uint64_t)(byte & 0x7F) << bitpos;\n        bitpos = (uint_fast8_t)(bitpos + 7);\n    } while (byte & 0x80);\n    \n    *dest = result;\n    return true;\n}\n#endif\n\nbool checkreturn pb_skip_varint(pb_istream_t *stream)\n{\n    pb_byte_t byte;\n    do\n    {\n        if (!pb_read(stream, &byte, 1))\n            return false;\n    } while (byte & 0x80);\n    return true;\n}\n\nbool checkreturn pb_skip_string(pb_istream_t *stream)\n{\n    uint32_t length;\n    if (!pb_decode_varint32(stream, &length))\n        return false;\n    \n    if ((size_t)length != length)\n    {\n        PB_RETURN_ERROR(stream, \"size too large\");\n    }\n\n    return pb_read(stream, NULL, (size_t)length);\n}\n\nbool checkreturn pb_decode_tag(pb_istream_t *stream, pb_wire_type_t 
*wire_type, uint32_t *tag, bool *eof)\n{\n    uint32_t temp;\n    *eof = false;\n    *wire_type = (pb_wire_type_t) 0;\n    *tag = 0;\n    \n    if (!pb_decode_varint32_eof(stream, &temp, eof))\n    {\n        return false;\n    }\n    \n    *tag = temp >> 3;\n    *wire_type = (pb_wire_type_t)(temp & 7);\n    return true;\n}\n\nbool checkreturn pb_skip_field(pb_istream_t *stream, pb_wire_type_t wire_type)\n{\n    switch (wire_type)\n    {\n        case PB_WT_VARINT: return pb_skip_varint(stream);\n        case PB_WT_64BIT: return pb_read(stream, NULL, 8);\n        case PB_WT_STRING: return pb_skip_string(stream);\n        case PB_WT_32BIT: return pb_read(stream, NULL, 4);\n        default: PB_RETURN_ERROR(stream, \"invalid wire_type\");\n    }\n}\n\n/* Read a raw value to buffer, for the purpose of passing it to callback as\n * a substream. Size is maximum size on call, and actual size on return.\n */\nstatic bool checkreturn read_raw_value(pb_istream_t *stream, pb_wire_type_t wire_type, pb_byte_t *buf, size_t *size)\n{\n    size_t max_size = *size;\n    switch (wire_type)\n    {\n        case PB_WT_VARINT:\n            *size = 0;\n            do\n            {\n                (*size)++;\n                if (*size > max_size)\n                    PB_RETURN_ERROR(stream, \"varint overflow\");\n\n                if (!pb_read(stream, buf, 1))\n                    return false;\n            } while (*buf++ & 0x80);\n            return true;\n            \n        case PB_WT_64BIT:\n            *size = 8;\n            return pb_read(stream, buf, 8);\n        \n        case PB_WT_32BIT:\n            *size = 4;\n            return pb_read(stream, buf, 4);\n        \n        case PB_WT_STRING:\n            /* Calling read_raw_value with a PB_WT_STRING is an error.\n             * Explicitly handle this case and fallthrough to default to avoid\n             * compiler warnings.\n             */\n\n        default: PB_RETURN_ERROR(stream, \"invalid wire_type\");\n    
}\n}\n\n/* Decode string length from stream and return a substream with limited length.\n * Remember to close the substream using pb_close_string_substream().\n */\nbool checkreturn pb_make_string_substream(pb_istream_t *stream, pb_istream_t *substream)\n{\n    uint32_t size;\n    if (!pb_decode_varint32(stream, &size))\n        return false;\n    \n    *substream = *stream;\n    if (substream->bytes_left < size)\n        PB_RETURN_ERROR(stream, \"parent stream too short\");\n    \n    substream->bytes_left = (size_t)size;\n    stream->bytes_left -= (size_t)size;\n    return true;\n}\n\nbool checkreturn pb_close_string_substream(pb_istream_t *stream, pb_istream_t *substream)\n{\n    if (substream->bytes_left) {\n        if (!pb_read(substream, NULL, substream->bytes_left))\n            return false;\n    }\n\n    stream->state = substream->state;\n\n#ifndef PB_NO_ERRMSG\n    stream->errmsg = substream->errmsg;\n#endif\n    return true;\n}\n\n/*************************\n * Decode a single field *\n *************************/\n\nstatic bool checkreturn check_wire_type(pb_wire_type_t wire_type, pb_field_iter_t *field)\n{\n    switch (PB_LTYPE(field->type))\n    {\n        case PB_LTYPE_BOOL:\n        case PB_LTYPE_VARINT:\n        case PB_LTYPE_UVARINT:\n        case PB_LTYPE_SVARINT:\n            return wire_type == PB_WT_VARINT;\n\n        case PB_LTYPE_FIXED32:\n            return wire_type == PB_WT_32BIT;\n\n        case PB_LTYPE_FIXED64:\n            return wire_type == PB_WT_64BIT;\n\n        case PB_LTYPE_BYTES:\n        case PB_LTYPE_STRING:\n        case PB_LTYPE_SUBMESSAGE:\n        case PB_LTYPE_SUBMSG_W_CB:\n        case PB_LTYPE_FIXED_LENGTH_BYTES:\n            return wire_type == PB_WT_STRING;\n\n        default:\n            return false;\n    }\n}\n\nstatic bool checkreturn decode_basic_field(pb_istream_t *stream, pb_field_iter_t *field)\n{\n    switch (PB_LTYPE(field->type))\n    {\n        case PB_LTYPE_BOOL:\n            return pb_dec_bool(stream, 
field);\n\n        case PB_LTYPE_VARINT:\n        case PB_LTYPE_UVARINT:\n        case PB_LTYPE_SVARINT:\n            return pb_dec_varint(stream, field);\n\n        case PB_LTYPE_FIXED32:\n        case PB_LTYPE_FIXED64:\n            return pb_dec_fixed(stream, field);\n\n        case PB_LTYPE_BYTES:\n            return pb_dec_bytes(stream, field);\n\n        case PB_LTYPE_STRING:\n            return pb_dec_string(stream, field);\n\n        case PB_LTYPE_SUBMESSAGE:\n        case PB_LTYPE_SUBMSG_W_CB:\n            return pb_dec_submessage(stream, field);\n\n        case PB_LTYPE_FIXED_LENGTH_BYTES:\n            return pb_dec_fixed_length_bytes(stream, field);\n\n        default:\n            PB_RETURN_ERROR(stream, \"invalid field type\");\n    }\n}\n\nstatic bool checkreturn decode_static_field(pb_istream_t *stream, pb_wire_type_t wire_type, pb_field_iter_t *field)\n{\n    switch (PB_HTYPE(field->type))\n    {\n        case PB_HTYPE_REQUIRED:\n            if (!check_wire_type(wire_type, field))\n                PB_RETURN_ERROR(stream, \"wrong wire type\");\n\n            return decode_basic_field(stream, field);\n            \n        case PB_HTYPE_OPTIONAL:\n            if (!check_wire_type(wire_type, field))\n                PB_RETURN_ERROR(stream, \"wrong wire type\");\n\n            if (field->pSize != NULL)\n                *(bool*)field->pSize = true;\n            return decode_basic_field(stream, field);\n    \n        case PB_HTYPE_REPEATED:\n            if (wire_type == PB_WT_STRING\n                && PB_LTYPE(field->type) <= PB_LTYPE_LAST_PACKABLE)\n            {\n                /* Packed array */\n                bool status = true;\n                pb_istream_t substream;\n                pb_size_t *size = (pb_size_t*)field->pSize;\n                field->pData = (char*)field->pField + field->data_size * (*size);\n\n                if (!pb_make_string_substream(stream, &substream))\n                    return false;\n\n                while 
(substream.bytes_left > 0 && *size < field->array_size)\n                {\n                    if (!decode_basic_field(&substream, field))\n                    {\n                        status = false;\n                        break;\n                    }\n                    (*size)++;\n                    field->pData = (char*)field->pData + field->data_size;\n                }\n\n                if (substream.bytes_left != 0)\n                    PB_RETURN_ERROR(stream, \"array overflow\");\n                if (!pb_close_string_substream(stream, &substream))\n                    return false;\n\n                return status;\n            }\n            else\n            {\n                /* Repeated field */\n                pb_size_t *size = (pb_size_t*)field->pSize;\n                field->pData = (char*)field->pField + field->data_size * (*size);\n\n                if (!check_wire_type(wire_type, field))\n                    PB_RETURN_ERROR(stream, \"wrong wire type\");\n\n                if ((*size)++ >= field->array_size)\n                    PB_RETURN_ERROR(stream, \"array overflow\");\n\n                return decode_basic_field(stream, field);\n            }\n\n        case PB_HTYPE_ONEOF:\n            *(pb_size_t*)field->pSize = field->tag;\n            if (PB_LTYPE_IS_SUBMSG(field->type))\n            {\n                /* We memset to zero so that any callbacks are set to NULL.\n                 * This is because the callbacks might otherwise have values\n                 * from some other union field.\n                 * If callbacks are needed inside oneof field, use .proto\n                 * option submsg_callback to have a separate callback function\n                 * that can set the fields before submessage is decoded.\n                 * pb_dec_submessage() will set any default values. 
*/\n                memset(field->pData, 0, (size_t)field->data_size);\n            }\n\n            if (!check_wire_type(wire_type, field))\n                PB_RETURN_ERROR(stream, \"wrong wire type\");\n\n            return decode_basic_field(stream, field);\n\n        default:\n            PB_RETURN_ERROR(stream, \"invalid field type\");\n    }\n}\n\n#ifdef PB_ENABLE_MALLOC\n/* Allocate storage for the field and store the pointer at iter->pData.\n * array_size is the number of entries to reserve in an array.\n * Zero size is not allowed, use pb_free() for releasing.\n */\nstatic bool checkreturn allocate_field(pb_istream_t *stream, void *pData, size_t data_size, size_t array_size)\n{    \n    void *ptr = *(void**)pData;\n    \n    if (data_size == 0 || array_size == 0)\n        PB_RETURN_ERROR(stream, \"invalid size\");\n    \n#ifdef __AVR__\n    /* Workaround for AVR libc bug 53284: http://savannah.nongnu.org/bugs/?53284\n     * Realloc to size of 1 byte can cause corruption of the malloc structures.\n     */\n    if (data_size == 1 && array_size == 1)\n    {\n        data_size = 2;\n    }\n#endif\n\n    /* Check for multiplication overflows.\n     * This code avoids the costly division if the sizes are small enough.\n     * Multiplication is safe as long as only half of bits are set\n     * in either multiplicand.\n     */\n    {\n        const size_t check_limit = (size_t)1 << (sizeof(size_t) * 4);\n        if (data_size >= check_limit || array_size >= check_limit)\n        {\n            const size_t size_max = (size_t)-1;\n            if (size_max / array_size < data_size)\n            {\n                PB_RETURN_ERROR(stream, \"size too large\");\n            }\n        }\n    }\n    \n    /* Allocate new or expand previous allocation */\n    /* Note: on failure the old pointer will remain in the structure,\n     * the message must be freed by caller also on error return. 
*/\n    ptr = pb_realloc(ptr, array_size * data_size);\n    if (ptr == NULL)\n        PB_RETURN_ERROR(stream, \"realloc failed\");\n    \n    *(void**)pData = ptr;\n    return true;\n}\n\n/* Clear a newly allocated item in case it contains a pointer, or is a submessage. */\nstatic void initialize_pointer_field(void *pItem, pb_field_iter_t *field)\n{\n    if (PB_LTYPE(field->type) == PB_LTYPE_STRING ||\n        PB_LTYPE(field->type) == PB_LTYPE_BYTES)\n    {\n        *(void**)pItem = NULL;\n    }\n    else if (PB_LTYPE_IS_SUBMSG(field->type))\n    {\n        /* We memset to zero so that any callbacks are set to NULL.\n         * Then set any default values. */\n        pb_field_iter_t submsg_iter;\n        memset(pItem, 0, field->data_size);\n\n        if (pb_field_iter_begin(&submsg_iter, field->submsg_desc, pItem))\n        {\n            (void)pb_message_set_to_defaults(&submsg_iter);\n        }\n    }\n}\n#endif\n\nstatic bool checkreturn decode_pointer_field(pb_istream_t *stream, pb_wire_type_t wire_type, pb_field_iter_t *field)\n{\n#ifndef PB_ENABLE_MALLOC\n    PB_UNUSED(wire_type);\n    PB_UNUSED(field);\n    PB_RETURN_ERROR(stream, \"no malloc support\");\n#else\n    switch (PB_HTYPE(field->type))\n    {\n        case PB_HTYPE_REQUIRED:\n        case PB_HTYPE_OPTIONAL:\n        case PB_HTYPE_ONEOF:\n            if (!check_wire_type(wire_type, field))\n                PB_RETURN_ERROR(stream, \"wrong wire type\");\n\n            if (PB_LTYPE_IS_SUBMSG(field->type) && *(void**)field->pField != NULL)\n            {\n                /* Duplicate field, have to release the old allocation first. */\n                /* FIXME: Does this work correctly for oneofs? 
*/\n                pb_release_single_field(field);\n            }\n        \n            if (PB_HTYPE(field->type) == PB_HTYPE_ONEOF)\n            {\n                *(pb_size_t*)field->pSize = field->tag;\n            }\n\n            if (PB_LTYPE(field->type) == PB_LTYPE_STRING ||\n                PB_LTYPE(field->type) == PB_LTYPE_BYTES)\n            {\n                /* pb_dec_string and pb_dec_bytes handle allocation themselves */\n                field->pData = field->pField;\n                return decode_basic_field(stream, field);\n            }\n            else\n            {\n                if (!allocate_field(stream, field->pField, field->data_size, 1))\n                    return false;\n                \n                field->pData = *(void**)field->pField;\n                initialize_pointer_field(field->pData, field);\n                return decode_basic_field(stream, field);\n            }\n    \n        case PB_HTYPE_REPEATED:\n            if (wire_type == PB_WT_STRING\n                && PB_LTYPE(field->type) <= PB_LTYPE_LAST_PACKABLE)\n            {\n                /* Packed array, multiple items come in at once. */\n                bool status = true;\n                pb_size_t *size = (pb_size_t*)field->pSize;\n                size_t allocated_size = *size;\n                pb_istream_t substream;\n                \n                if (!pb_make_string_substream(stream, &substream))\n                    return false;\n                \n                while (substream.bytes_left)\n                {\n                    if (*size == PB_SIZE_MAX)\n                    {\n#ifndef PB_NO_ERRMSG\n                        stream->errmsg = \"too many array entries\";\n#endif\n                        status = false;\n                        break;\n                    }\n\n                    if ((size_t)*size + 1 > allocated_size)\n                    {\n                        /* Allocate more storage. 
This tries to guess the\n                         * number of remaining entries. Round the division\n                         * upwards. */\n                        size_t remain = (substream.bytes_left - 1) / field->data_size + 1;\n                        if (remain < PB_SIZE_MAX - allocated_size)\n                            allocated_size += remain;\n                        else\n                            allocated_size += 1;\n                        \n                        if (!allocate_field(&substream, field->pField, field->data_size, allocated_size))\n                        {\n                            status = false;\n                            break;\n                        }\n                    }\n\n                    /* Decode the array entry */\n                    field->pData = *(char**)field->pField + field->data_size * (*size);\n                    initialize_pointer_field(field->pData, field);\n                    if (!decode_basic_field(&substream, field))\n                    {\n                        status = false;\n                        break;\n                    }\n                    \n                    (*size)++;\n                }\n                if (!pb_close_string_substream(stream, &substream))\n                    return false;\n                \n                return status;\n            }\n            else\n            {\n                /* Normal repeated field, i.e. only one item at a time. 
*/\n                pb_size_t *size = (pb_size_t*)field->pSize;\n\n                if (*size == PB_SIZE_MAX)\n                    PB_RETURN_ERROR(stream, \"too many array entries\");\n                \n                if (!check_wire_type(wire_type, field))\n                    PB_RETURN_ERROR(stream, \"wrong wire type\");\n\n                if (!allocate_field(stream, field->pField, field->data_size, (size_t)(*size + 1)))\n                    return false;\n            \n                field->pData = *(char**)field->pField + field->data_size * (*size);\n                (*size)++;\n                initialize_pointer_field(field->pData, field);\n                return decode_basic_field(stream, field);\n            }\n\n        default:\n            PB_RETURN_ERROR(stream, \"invalid field type\");\n    }\n#endif\n}\n\nstatic bool checkreturn decode_callback_field(pb_istream_t *stream, pb_wire_type_t wire_type, pb_field_iter_t *field)\n{\n    if (!field->descriptor->field_callback)\n        return pb_skip_field(stream, wire_type);\n\n    if (wire_type == PB_WT_STRING)\n    {\n        pb_istream_t substream;\n        size_t prev_bytes_left;\n        \n        if (!pb_make_string_substream(stream, &substream))\n            return false;\n        \n        do\n        {\n            prev_bytes_left = substream.bytes_left;\n            if (!field->descriptor->field_callback(&substream, NULL, field))\n                PB_RETURN_ERROR(stream, \"callback failed\");\n        } while (substream.bytes_left > 0 && substream.bytes_left < prev_bytes_left);\n        \n        if (!pb_close_string_substream(stream, &substream))\n            return false;\n\n        return true;\n    }\n    else\n    {\n        /* Copy the single scalar value to the stack.\n         * This is required so that we can limit the stream length,\n         * which in turn allows using the same callback for packed\n         * and non-packed fields. 
*/\n        pb_istream_t substream;\n        pb_byte_t buffer[10];\n        size_t size = sizeof(buffer);\n        \n        if (!read_raw_value(stream, wire_type, buffer, &size))\n            return false;\n        substream = pb_istream_from_buffer(buffer, size);\n        \n        return field->descriptor->field_callback(&substream, NULL, field);\n    }\n}\n\nstatic bool checkreturn decode_field(pb_istream_t *stream, pb_wire_type_t wire_type, pb_field_iter_t *field)\n{\n#ifdef PB_ENABLE_MALLOC\n    /* When decoding an oneof field, check if there is old data that must be\n     * released first. */\n    if (PB_HTYPE(field->type) == PB_HTYPE_ONEOF)\n    {\n        if (!pb_release_union_field(stream, field))\n            return false;\n    }\n#endif\n\n    switch (PB_ATYPE(field->type))\n    {\n        case PB_ATYPE_STATIC:\n            return decode_static_field(stream, wire_type, field);\n        \n        case PB_ATYPE_POINTER:\n            return decode_pointer_field(stream, wire_type, field);\n        \n        case PB_ATYPE_CALLBACK:\n            return decode_callback_field(stream, wire_type, field);\n        \n        default:\n            PB_RETURN_ERROR(stream, \"invalid field type\");\n    }\n}\n\n/* Default handler for extension fields. Expects to have a pb_msgdesc_t\n * pointer in the extension->type->arg field, pointing to a message with\n * only one field in it.  */\nstatic bool checkreturn default_extension_decoder(pb_istream_t *stream,\n    pb_extension_t *extension, uint32_t tag, pb_wire_type_t wire_type)\n{\n    pb_field_iter_t iter;\n\n    if (!pb_field_iter_begin_extension(&iter, extension))\n        PB_RETURN_ERROR(stream, \"invalid extension\");\n\n    if (iter.tag != tag)\n        return true;\n\n    extension->found = true;\n    return decode_field(stream, wire_type, &iter);\n}\n\n/* Try to decode an unknown field as an extension field. Tries each extension\n * decoder in turn, until one of them handles the field or loop ends. 
*/\nstatic bool checkreturn decode_extension(pb_istream_t *stream,\n    uint32_t tag, pb_wire_type_t wire_type, pb_field_iter_t *iter)\n{\n    pb_extension_t *extension = *(pb_extension_t* const *)iter->pData;\n    size_t pos = stream->bytes_left;\n    \n    while (extension != NULL && pos == stream->bytes_left)\n    {\n        bool status;\n        if (extension->type->decode)\n            status = extension->type->decode(stream, extension, tag, wire_type);\n        else\n            status = default_extension_decoder(stream, extension, tag, wire_type);\n\n        if (!status)\n            return false;\n        \n        extension = extension->next;\n    }\n    \n    return true;\n}\n\n/* Step through the iterator until an extension field is found or until all\n * entries have been checked. There can be only one extension field per\n * message. Returns false if no extension field is found. */\nstatic bool checkreturn find_extension_field(pb_field_iter_t *iter)\n{\n    pb_size_t start = iter->index;\n\n    do {\n        if (PB_LTYPE(iter->type) == PB_LTYPE_EXTENSION)\n            return true;\n        (void)pb_field_iter_next(iter);\n    } while (iter->index != start);\n    \n    return false;\n}\n\n/* Initialize message fields to default values, recursively */\nstatic bool pb_field_set_to_default(pb_field_iter_t *field)\n{\n    pb_type_t type;\n    type = field->type;\n\n    if (PB_LTYPE(type) == PB_LTYPE_EXTENSION)\n    {\n        pb_extension_t *ext = *(pb_extension_t* const *)field->pData;\n        while (ext != NULL)\n        {\n            pb_field_iter_t ext_iter;\n            if (pb_field_iter_begin_extension(&ext_iter, ext))\n            {\n                ext->found = false;\n                if (!pb_message_set_to_defaults(&ext_iter))\n                    return false;\n            }\n            ext = ext->next;\n        }\n    }\n    else if (PB_ATYPE(type) == PB_ATYPE_STATIC)\n    {\n        bool init_data = true;\n        if (PB_HTYPE(type) == 
PB_HTYPE_OPTIONAL && field->pSize != NULL)\n        {\n            /* Set has_field to false. Still initialize the optional field\n             * itself also. */\n            *(bool*)field->pSize = false;\n        }\n        else if (PB_HTYPE(type) == PB_HTYPE_REPEATED ||\n                 PB_HTYPE(type) == PB_HTYPE_ONEOF)\n        {\n            /* REPEATED: Set array count to 0, no need to initialize contents.\n               ONEOF: Set which_field to 0. */\n            *(pb_size_t*)field->pSize = 0;\n            init_data = false;\n        }\n\n        if (init_data)\n        {\n            if (PB_LTYPE_IS_SUBMSG(field->type))\n            {\n                /* Initialize submessage to defaults */\n                pb_field_iter_t submsg_iter;\n                if (pb_field_iter_begin(&submsg_iter, field->submsg_desc, field->pData))\n                {\n                    if (!pb_message_set_to_defaults(&submsg_iter))\n                        return false;\n                }\n            }\n            else\n            {\n                /* Initialize to zeros */\n                memset(field->pData, 0, (size_t)field->data_size);\n            }\n        }\n    }\n    else if (PB_ATYPE(type) == PB_ATYPE_POINTER)\n    {\n        /* Initialize the pointer to NULL. */\n        *(void**)field->pField = NULL;\n\n        /* Initialize array count to 0. 
*/\n        if (PB_HTYPE(type) == PB_HTYPE_REPEATED ||\n            PB_HTYPE(type) == PB_HTYPE_ONEOF)\n        {\n            *(pb_size_t*)field->pSize = 0;\n        }\n    }\n    else if (PB_ATYPE(type) == PB_ATYPE_CALLBACK)\n    {\n        /* Don't overwrite callback */\n    }\n\n    return true;\n}\n\nstatic bool pb_message_set_to_defaults(pb_field_iter_t *iter)\n{\n    pb_istream_t defstream = PB_ISTREAM_EMPTY;\n    uint32_t tag = 0;\n    pb_wire_type_t wire_type = PB_WT_VARINT;\n    bool eof;\n\n    if (iter->descriptor->default_value)\n    {\n        defstream = pb_istream_from_buffer(iter->descriptor->default_value, (size_t)-1);\n        if (!pb_decode_tag(&defstream, &wire_type, &tag, &eof))\n            return false;\n    }\n\n    do\n    {\n        if (!pb_field_set_to_default(iter))\n            return false;\n\n        if (tag != 0 && iter->tag == tag)\n        {\n            /* We have a default value for this field in the defstream */\n            if (!decode_field(&defstream, wire_type, iter))\n                return false;\n            if (!pb_decode_tag(&defstream, &wire_type, &tag, &eof))\n                return false;\n\n            if (iter->pSize)\n                *(bool*)iter->pSize = false;\n        }\n    } while (pb_field_iter_next(iter));\n\n    return true;\n}\n\n/*********************\n * Decode all fields *\n *********************/\n\nstatic bool checkreturn pb_decode_inner(pb_istream_t *stream, const pb_msgdesc_t *fields, void *dest_struct, unsigned int flags)\n{\n    uint32_t extension_range_start = 0;\n\n    /* 'fixed_count_field' and 'fixed_count_size' track position of a repeated fixed\n     * count field. 
This can only handle _one_ repeated fixed count field that\n     * is unpacked and unordered among other (non repeated fixed count) fields.\n     */\n    pb_size_t fixed_count_field = PB_SIZE_MAX;\n    pb_size_t fixed_count_size = 0;\n    pb_size_t fixed_count_total_size = 0;\n\n    pb_fields_seen_t fields_seen = {{0, 0}};\n    const uint32_t allbits = ~(uint32_t)0;\n    pb_field_iter_t iter;\n\n    if (pb_field_iter_begin(&iter, fields, dest_struct))\n    {\n        if ((flags & PB_DECODE_NOINIT) == 0)\n        {\n            if (!pb_message_set_to_defaults(&iter))\n                PB_RETURN_ERROR(stream, \"failed to set defaults\");\n        }\n    }\n\n    while (stream->bytes_left)\n    {\n        uint32_t tag;\n        pb_wire_type_t wire_type;\n        bool eof;\n\n        if (!pb_decode_tag(stream, &wire_type, &tag, &eof))\n        {\n            if (eof)\n                break;\n            else\n                return false;\n        }\n\n        if (tag == 0)\n        {\n          if (flags & PB_DECODE_NULLTERMINATED)\n          {\n            break;\n          }\n          else\n          {\n            PB_RETURN_ERROR(stream, \"zero tag\");\n          }\n        }\n\n        if (!pb_field_iter_find(&iter, tag) || PB_LTYPE(iter.type) == PB_LTYPE_EXTENSION)\n        {\n            /* No match found, check if it matches an extension. 
*/\n            if (tag >= extension_range_start)\n            {\n                if (!find_extension_field(&iter))\n                    extension_range_start = (uint32_t)-1;\n                else\n                    extension_range_start = iter.tag;\n\n                if (tag >= extension_range_start)\n                {\n                    size_t pos = stream->bytes_left;\n\n                    if (!decode_extension(stream, tag, wire_type, &iter))\n                        return false;\n\n                    if (pos != stream->bytes_left)\n                    {\n                        /* The field was handled */\n                        continue;\n                    }\n                }\n            }\n\n            /* No match found, skip data */\n            if (!pb_skip_field(stream, wire_type))\n                return false;\n            continue;\n        }\n\n        /* If a repeated fixed count field was found, get size from\n         * 'fixed_count_field' as there is no counter contained in the struct.\n         */\n        if (PB_HTYPE(iter.type) == PB_HTYPE_REPEATED && iter.pSize == &iter.array_size)\n        {\n            if (fixed_count_field != iter.index) {\n                /* If the new fixed count field does not match the previous one,\n                 * check that the previous one is NULL or that it finished\n                 * receiving all the expected data.\n                 */\n                if (fixed_count_field != PB_SIZE_MAX &&\n                    fixed_count_size != fixed_count_total_size)\n                {\n                    PB_RETURN_ERROR(stream, \"wrong size for fixed count field\");\n                }\n\n                fixed_count_field = iter.index;\n                fixed_count_size = 0;\n                fixed_count_total_size = iter.array_size;\n            }\n\n            iter.pSize = &fixed_count_size;\n        }\n\n        if (PB_HTYPE(iter.type) == PB_HTYPE_REQUIRED\n            && iter.required_field_index < 
PB_MAX_REQUIRED_FIELDS)\n        {\n            uint32_t tmp = ((uint32_t)1 << (iter.required_field_index & 31));\n            fields_seen.bitfield[iter.required_field_index >> 5] |= tmp;\n        }\n\n        if (!decode_field(stream, wire_type, &iter))\n            return false;\n    }\n\n    /* Check that all elements of the last decoded fixed count field were present. */\n    if (fixed_count_field != PB_SIZE_MAX &&\n        fixed_count_size != fixed_count_total_size)\n    {\n        PB_RETURN_ERROR(stream, \"wrong size for fixed count field\");\n    }\n\n    /* Check that all required fields were present. */\n    {\n        /* First figure out the number of required fields by\n         * seeking to the end of the field array. Usually we\n         * are already close to end after decoding.\n         */\n        pb_size_t req_field_count;\n        pb_type_t last_type;\n        pb_size_t i;\n        do {\n            req_field_count = iter.required_field_index;\n            last_type = iter.type;\n        } while (pb_field_iter_next(&iter));\n\n        /* Fixup if last field was also required. 
*/\n        if (PB_HTYPE(last_type) == PB_HTYPE_REQUIRED && iter.tag != 0)\n            req_field_count++;\n\n        if (req_field_count > PB_MAX_REQUIRED_FIELDS)\n            req_field_count = PB_MAX_REQUIRED_FIELDS;\n\n        if (req_field_count > 0)\n        {\n            /* Check the whole words */\n            for (i = 0; i < (req_field_count >> 5); i++)\n            {\n                if (fields_seen.bitfield[i] != allbits)\n                    PB_RETURN_ERROR(stream, \"missing required field\");\n            }\n\n            /* Check the remaining bits (if any) */\n            if ((req_field_count & 31) != 0)\n            {\n                if (fields_seen.bitfield[req_field_count >> 5] !=\n                    (allbits >> (uint_least8_t)(32 - (req_field_count & 31))))\n                {\n                    PB_RETURN_ERROR(stream, \"missing required field\");\n                }\n            }\n        }\n    }\n\n    return true;\n}\n\nbool checkreturn pb_decode_ex(pb_istream_t *stream, const pb_msgdesc_t *fields, void *dest_struct, unsigned int flags)\n{\n    bool status;\n\n    if ((flags & PB_DECODE_DELIMITED) == 0)\n    {\n      status = pb_decode_inner(stream, fields, dest_struct, flags);\n    }\n    else\n    {\n      pb_istream_t substream;\n      if (!pb_make_string_substream(stream, &substream))\n        return false;\n\n      status = pb_decode_inner(&substream, fields, dest_struct, flags);\n\n      if (!pb_close_string_substream(stream, &substream))\n        return false;\n    }\n    \n#ifdef PB_ENABLE_MALLOC\n    if (!status)\n        pb_release(fields, dest_struct);\n#endif\n    \n    return status;\n}\n\nbool checkreturn pb_decode(pb_istream_t *stream, const pb_msgdesc_t *fields, void *dest_struct)\n{\n    bool status;\n\n    status = pb_decode_inner(stream, fields, dest_struct, 0);\n\n#ifdef PB_ENABLE_MALLOC\n    if (!status)\n        pb_release(fields, dest_struct);\n#endif\n\n    return status;\n}\n\n#ifdef PB_ENABLE_MALLOC\n/* Given an 
oneof field, if there has already been a field inside this oneof,\n * release it before overwriting with a different one. */\nstatic bool pb_release_union_field(pb_istream_t *stream, pb_field_iter_t *field)\n{\n    pb_field_iter_t old_field = *field;\n    pb_size_t old_tag = *(pb_size_t*)field->pSize; /* Previous which_ value */\n    pb_size_t new_tag = field->tag; /* New which_ value */\n\n    if (old_tag == 0)\n        return true; /* Ok, no old data in union */\n\n    if (old_tag == new_tag)\n        return true; /* Ok, old data is of same type => merge */\n\n    /* Release old data. The find can fail if the message struct contains\n     * invalid data. */\n    if (!pb_field_iter_find(&old_field, old_tag))\n        PB_RETURN_ERROR(stream, \"invalid union tag\");\n\n    pb_release_single_field(&old_field);\n\n    return true;\n}\n\nstatic void pb_release_single_field(pb_field_iter_t *field)\n{\n    pb_type_t type;\n    type = field->type;\n\n    if (PB_HTYPE(type) == PB_HTYPE_ONEOF)\n    {\n        if (*(pb_size_t*)field->pSize != field->tag)\n            return; /* This is not the current field in the union */\n    }\n\n    /* Release anything contained inside an extension or submsg.\n     * This has to be done even if the submsg itself is statically\n     * allocated. 
*/\n    if (PB_LTYPE(type) == PB_LTYPE_EXTENSION)\n    {\n        /* Release fields from all extensions in the linked list */\n        pb_extension_t *ext = *(pb_extension_t**)field->pData;\n        while (ext != NULL)\n        {\n            pb_field_iter_t ext_iter;\n            if (pb_field_iter_begin_extension(&ext_iter, ext))\n            {\n                pb_release_single_field(&ext_iter);\n            }\n            ext = ext->next;\n        }\n    }\n    else if (PB_LTYPE_IS_SUBMSG(type) && PB_ATYPE(type) != PB_ATYPE_CALLBACK)\n    {\n        /* Release fields in submessage or submsg array */\n        pb_size_t count = 1;\n        \n        if (PB_ATYPE(type) == PB_ATYPE_POINTER)\n        {\n            field->pData = *(void**)field->pField;\n        }\n        else\n        {\n            field->pData = field->pField;\n        }\n        \n        if (PB_HTYPE(type) == PB_HTYPE_REPEATED)\n        {\n            count = *(pb_size_t*)field->pSize;\n\n            if (PB_ATYPE(type) == PB_ATYPE_STATIC && count > field->array_size)\n            {\n                /* Protect against corrupted _count fields */\n                count = field->array_size;\n            }\n        }\n        \n        if (field->pData)\n        {\n            while (count--)\n            {\n                pb_release(field->submsg_desc, field->pData);\n                field->pData = (char*)field->pData + field->data_size;\n            }\n        }\n    }\n    \n    if (PB_ATYPE(type) == PB_ATYPE_POINTER)\n    {\n        if (PB_HTYPE(type) == PB_HTYPE_REPEATED &&\n            (PB_LTYPE(type) == PB_LTYPE_STRING ||\n             PB_LTYPE(type) == PB_LTYPE_BYTES))\n        {\n            /* Release entries in repeated string or bytes array */\n            void **pItem = *(void***)field->pField;\n            pb_size_t count = *(pb_size_t*)field->pSize;\n            while (count--)\n            {\n                pb_free(*pItem);\n                *pItem++ = NULL;\n            }\n        
}\n        \n        if (PB_HTYPE(type) == PB_HTYPE_REPEATED)\n        {\n            /* We are going to release the array, so set the size to 0 */\n            *(pb_size_t*)field->pSize = 0;\n        }\n        \n        /* Release main pointer */\n        pb_free(*(void**)field->pField);\n        *(void**)field->pField = NULL;\n    }\n}\n\nvoid pb_release(const pb_msgdesc_t *fields, void *dest_struct)\n{\n    pb_field_iter_t iter;\n    \n    if (!dest_struct)\n        return; /* Ignore NULL pointers, similar to free() */\n\n    if (!pb_field_iter_begin(&iter, fields, dest_struct))\n        return; /* Empty message type */\n    \n    do\n    {\n        pb_release_single_field(&iter);\n    } while (pb_field_iter_next(&iter));\n}\n#endif\n\n/* Field decoders */\n\nbool pb_decode_bool(pb_istream_t *stream, bool *dest)\n{\n    uint32_t value;\n    if (!pb_decode_varint32(stream, &value))\n        return false;\n\n    *(bool*)dest = (value != 0);\n    return true;\n}\n\nbool pb_decode_svarint(pb_istream_t *stream, pb_int64_t *dest)\n{\n    pb_uint64_t value;\n    if (!pb_decode_varint(stream, &value))\n        return false;\n    \n    if (value & 1)\n        *dest = (pb_int64_t)(~(value >> 1));\n    else\n        *dest = (pb_int64_t)(value >> 1);\n    \n    return true;\n}\n\nbool pb_decode_fixed32(pb_istream_t *stream, void *dest)\n{\n    union {\n        uint32_t fixed32;\n        pb_byte_t bytes[4];\n    } u;\n\n    if (!pb_read(stream, u.bytes, 4))\n        return false;\n\n#if defined(__BYTE_ORDER) && __BYTE_ORDER == __LITTLE_ENDIAN && CHAR_BIT == 8\n    /* fast path - if we know that we're on little endian, assign directly */\n    *(uint32_t*)dest = u.fixed32;\n#else\n    *(uint32_t*)dest = ((uint32_t)u.bytes[0] << 0) |\n                       ((uint32_t)u.bytes[1] << 8) |\n                       ((uint32_t)u.bytes[2] << 16) |\n                       ((uint32_t)u.bytes[3] << 24);\n#endif\n    return true;\n}\n\n#ifndef PB_WITHOUT_64BIT\nbool 
pb_decode_fixed64(pb_istream_t *stream, void *dest)\n{\n    union {\n        uint64_t fixed64;\n        pb_byte_t bytes[8];\n    } u;\n\n    if (!pb_read(stream, u.bytes, 8))\n        return false;\n\n#if defined(__BYTE_ORDER) && __BYTE_ORDER == __LITTLE_ENDIAN && CHAR_BIT == 8\n    /* fast path - if we know that we're on little endian, assign directly */\n    *(uint64_t*)dest = u.fixed64;\n#else\n    *(uint64_t*)dest = ((uint64_t)u.bytes[0] << 0) |\n                       ((uint64_t)u.bytes[1] << 8) |\n                       ((uint64_t)u.bytes[2] << 16) |\n                       ((uint64_t)u.bytes[3] << 24) |\n                       ((uint64_t)u.bytes[4] << 32) |\n                       ((uint64_t)u.bytes[5] << 40) |\n                       ((uint64_t)u.bytes[6] << 48) |\n                       ((uint64_t)u.bytes[7] << 56);\n#endif\n    return true;\n}\n#endif\n\nstatic bool checkreturn pb_dec_bool(pb_istream_t *stream, const pb_field_iter_t *field)\n{\n    return pb_decode_bool(stream, (bool*)field->pData);\n}\n\nstatic bool checkreturn pb_dec_varint(pb_istream_t *stream, const pb_field_iter_t *field)\n{\n    if (PB_LTYPE(field->type) == PB_LTYPE_UVARINT)\n    {\n        pb_uint64_t value, clamped;\n        if (!pb_decode_varint(stream, &value))\n            return false;\n\n        /* Cast to the proper field size, while checking for overflows */\n        if (field->data_size == sizeof(pb_uint64_t))\n            clamped = *(pb_uint64_t*)field->pData = value;\n        else if (field->data_size == sizeof(uint32_t))\n            clamped = *(uint32_t*)field->pData = (uint32_t)value;\n        else if (field->data_size == sizeof(uint_least16_t))\n            clamped = *(uint_least16_t*)field->pData = (uint_least16_t)value;\n        else if (field->data_size == sizeof(uint_least8_t))\n            clamped = *(uint_least8_t*)field->pData = (uint_least8_t)value;\n        else\n            PB_RETURN_ERROR(stream, \"invalid data_size\");\n\n        if (clamped != value)\n   
         PB_RETURN_ERROR(stream, \"integer too large\");\n\n        return true;\n    }\n    else\n    {\n        pb_uint64_t value;\n        pb_int64_t svalue;\n        pb_int64_t clamped;\n\n        if (PB_LTYPE(field->type) == PB_LTYPE_SVARINT)\n        {\n            if (!pb_decode_svarint(stream, &svalue))\n                return false;\n        }\n        else\n        {\n            if (!pb_decode_varint(stream, &value))\n                return false;\n\n            /* See issue 97: Google's C++ protobuf allows negative varint values to\n            * be cast as int32_t, instead of the int64_t that should be used when\n            * encoding. Previous nanopb versions had a bug in encoding. In order to\n            * not break decoding of such messages, we cast <=32 bit fields to\n            * int32_t first to get the sign correct.\n            */\n            if (field->data_size == sizeof(pb_int64_t))\n                svalue = (pb_int64_t)value;\n            else\n                svalue = (int32_t)value;\n        }\n\n        /* Cast to the proper field size, while checking for overflows */\n        if (field->data_size == sizeof(pb_int64_t))\n            clamped = *(pb_int64_t*)field->pData = svalue;\n        else if (field->data_size == sizeof(int32_t))\n            clamped = *(int32_t*)field->pData = (int32_t)svalue;\n        else if (field->data_size == sizeof(int_least16_t))\n            clamped = *(int_least16_t*)field->pData = (int_least16_t)svalue;\n        else if (field->data_size == sizeof(int_least8_t))\n            clamped = *(int_least8_t*)field->pData = (int_least8_t)svalue;\n        else\n            PB_RETURN_ERROR(stream, \"invalid data_size\");\n\n        if (clamped != svalue)\n            PB_RETURN_ERROR(stream, \"integer too large\");\n\n        return true;\n    }\n}\n\nstatic bool checkreturn pb_dec_fixed(pb_istream_t *stream, const pb_field_iter_t *field)\n{\n#ifdef PB_CONVERT_DOUBLE_FLOAT\n    if (field->data_size == sizeof(float) 
&& PB_LTYPE(field->type) == PB_LTYPE_FIXED64)\n    {\n        return pb_decode_double_as_float(stream, (float*)field->pData);\n    }\n#endif\n\n    if (field->data_size == sizeof(uint32_t))\n    {\n        return pb_decode_fixed32(stream, field->pData);\n    }\n#ifndef PB_WITHOUT_64BIT\n    else if (field->data_size == sizeof(uint64_t))\n    {\n        return pb_decode_fixed64(stream, field->pData);\n    }\n#endif\n    else\n    {\n        PB_RETURN_ERROR(stream, \"invalid data_size\");\n    }\n}\n\nstatic bool checkreturn pb_dec_bytes(pb_istream_t *stream, const pb_field_iter_t *field)\n{\n    uint32_t size;\n    size_t alloc_size;\n    pb_bytes_array_t *dest;\n    \n    if (!pb_decode_varint32(stream, &size))\n        return false;\n    \n    if (size > PB_SIZE_MAX)\n        PB_RETURN_ERROR(stream, \"bytes overflow\");\n    \n    alloc_size = PB_BYTES_ARRAY_T_ALLOCSIZE(size);\n    if (size > alloc_size)\n        PB_RETURN_ERROR(stream, \"size too large\");\n    \n    if (PB_ATYPE(field->type) == PB_ATYPE_POINTER)\n    {\n#ifndef PB_ENABLE_MALLOC\n        PB_RETURN_ERROR(stream, \"no malloc support\");\n#else\n        if (stream->bytes_left < size)\n            PB_RETURN_ERROR(stream, \"end-of-stream\");\n\n        if (!allocate_field(stream, field->pData, alloc_size, 1))\n            return false;\n        dest = *(pb_bytes_array_t**)field->pData;\n#endif\n    }\n    else\n    {\n        if (alloc_size > field->data_size)\n            PB_RETURN_ERROR(stream, \"bytes overflow\");\n        dest = (pb_bytes_array_t*)field->pData;\n    }\n\n    dest->size = (pb_size_t)size;\n    return pb_read(stream, dest->bytes, (size_t)size);\n}\n\nstatic bool checkreturn pb_dec_string(pb_istream_t *stream, const pb_field_iter_t *field)\n{\n    uint32_t size;\n    size_t alloc_size;\n    pb_byte_t *dest = (pb_byte_t*)field->pData;\n\n    if (!pb_decode_varint32(stream, &size))\n        return false;\n\n    if (size == (uint32_t)-1)\n        PB_RETURN_ERROR(stream, \"size too 
large\");\n\n    /* Space for null terminator */\n    alloc_size = (size_t)(size + 1);\n\n    if (alloc_size < size)\n        PB_RETURN_ERROR(stream, \"size too large\");\n\n    if (PB_ATYPE(field->type) == PB_ATYPE_POINTER)\n    {\n#ifndef PB_ENABLE_MALLOC\n        PB_RETURN_ERROR(stream, \"no malloc support\");\n#else\n        if (stream->bytes_left < size)\n            PB_RETURN_ERROR(stream, \"end-of-stream\");\n\n        if (!allocate_field(stream, field->pData, alloc_size, 1))\n            return false;\n        dest = *(pb_byte_t**)field->pData;\n#endif\n    }\n    else\n    {\n        if (alloc_size > field->data_size)\n            PB_RETURN_ERROR(stream, \"string overflow\");\n    }\n    \n    dest[size] = 0;\n\n    if (!pb_read(stream, dest, (size_t)size))\n        return false;\n\n#ifdef PB_VALIDATE_UTF8\n    if (!pb_validate_utf8((const char*)dest))\n        PB_RETURN_ERROR(stream, \"invalid utf8\");\n#endif\n\n    return true;\n}\n\nstatic bool checkreturn pb_dec_submessage(pb_istream_t *stream, const pb_field_iter_t *field)\n{\n    bool status = true;\n    pb_istream_t substream;\n\n    if (!pb_make_string_substream(stream, &substream))\n        return false;\n    \n    if (field->submsg_desc == NULL)\n        PB_RETURN_ERROR(stream, \"invalid field descriptor\");\n    \n    /* New array entries need to be initialized, while required and optional\n     * submessages have already been initialized in the top-level pb_decode. */\n    if (PB_HTYPE(field->type) == PB_HTYPE_REPEATED ||\n        PB_HTYPE(field->type) == PB_HTYPE_ONEOF)\n    {\n        pb_field_iter_t submsg_iter;\n        if (pb_field_iter_begin(&submsg_iter, field->submsg_desc, field->pData))\n        {\n            if (!pb_message_set_to_defaults(&submsg_iter))\n                PB_RETURN_ERROR(stream, \"failed to set defaults\");\n        }\n    }\n\n    /* Submessages can have a separate message-level callback that is called\n     * before decoding the message. 
Typically it is used to set callback fields\n     * inside oneofs. */\n    if (PB_LTYPE(field->type) == PB_LTYPE_SUBMSG_W_CB && field->pSize != NULL)\n    {\n        /* Message callback is stored right before pSize. */\n        pb_callback_t *callback = (pb_callback_t*)field->pSize - 1;\n        if (callback->funcs.decode)\n        {\n            status = callback->funcs.decode(&substream, field, &callback->arg);\n        }\n    }\n\n    /* Now decode the submessage contents */\n    if (status)\n    {\n        status = pb_decode_inner(&substream, field->submsg_desc, field->pData, 0);\n    }\n    \n    if (!pb_close_string_substream(stream, &substream))\n        return false;\n\n    return status;\n}\n\nstatic bool checkreturn pb_dec_fixed_length_bytes(pb_istream_t *stream, const pb_field_iter_t *field)\n{\n    uint32_t size;\n\n    if (!pb_decode_varint32(stream, &size))\n        return false;\n\n    if (size > PB_SIZE_MAX)\n        PB_RETURN_ERROR(stream, \"bytes overflow\");\n\n    if (size == 0)\n    {\n        /* As a special case, treat empty bytes string as all zeros for fixed_length_bytes. */\n        memset(field->pData, 0, (size_t)field->data_size);\n        return true;\n    }\n\n    if (size != field->data_size)\n        PB_RETURN_ERROR(stream, \"incorrect fixed length bytes size\");\n\n    return pb_read(stream, (pb_byte_t*)field->pData, (size_t)field->data_size);\n}\n\n#ifdef PB_CONVERT_DOUBLE_FLOAT\nbool pb_decode_double_as_float(pb_istream_t *stream, float *dest)\n{\n    uint_least8_t sign;\n    int exponent;\n    uint32_t mantissa;\n    uint64_t value;\n    union { float f; uint32_t i; } out;\n\n    if (!pb_decode_fixed64(stream, &value))\n        return false;\n\n    /* Decompose input value */\n    sign = (uint_least8_t)((value >> 63) & 1);\n    exponent = (int)((value >> 52) & 0x7FF) - 1023;\n    mantissa = (value >> 28) & 0xFFFFFF; /* Highest 24 bits */\n\n    /* Figure if value is in range representable by floats. 
*/\n    if (exponent == 1024)\n    {\n        /* Special value */\n        exponent = 128;\n    }\n    else if (exponent > 127)\n    {\n        /* Too large, convert to infinity */\n        exponent = 128;\n        mantissa = 0;\n    }\n    else if (exponent < -150)\n    {\n        /* Too small, convert to zero */\n        exponent = -127;\n        mantissa = 0;\n    }\n    else if (exponent < -126)\n    {\n        /* Denormalized */\n        mantissa |= 0x1000000;\n        mantissa >>= (-126 - exponent);\n        exponent = -127;\n    }\n\n    /* Round off mantissa */\n    mantissa = (mantissa + 1) >> 1;\n\n    /* Check if mantissa went over 2.0 */\n    if (mantissa & 0x800000)\n    {\n        exponent += 1;\n        mantissa &= 0x7FFFFF;\n        mantissa >>= 1;\n    }\n\n    /* Combine fields */\n    out.i = mantissa;\n    out.i |= (uint32_t)(exponent + 127) << 23;\n    out.i |= (uint32_t)sign << 31;\n\n    *dest = out.f;\n    return true;\n}\n#endif\n"
  },
  {
    "path": "c/core/src/pb_encode.c",
    "content": "/* pb_encode.c -- encode a protobuf using minimal resources\n *\n * 2011 Petteri Aimonen <jpa@kapsi.fi>\n */\n\n#include \"pb.h\"\n#include \"pb_encode.h\"\n#include \"pb_common.h\"\n\n/* Use the GCC warn_unused_result attribute to check that all return values\n * are propagated correctly. On other compilers and gcc before 3.4.0 just\n * ignore the annotation.\n */\n#if !defined(__GNUC__) || ( __GNUC__ < 3) || (__GNUC__ == 3 && __GNUC_MINOR__ < 4)\n    #define checkreturn\n#else\n    #define checkreturn __attribute__((warn_unused_result))\n#endif\n\n/**************************************\n * Declarations internal to this file *\n **************************************/\nstatic bool checkreturn buf_write(pb_ostream_t *stream, const pb_byte_t *buf, size_t count);\nstatic bool checkreturn encode_array(pb_ostream_t *stream, pb_field_iter_t *field);\nstatic bool checkreturn pb_check_proto3_default_value(const pb_field_iter_t *field);\nstatic bool checkreturn encode_basic_field(pb_ostream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn encode_callback_field(pb_ostream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn encode_field(pb_ostream_t *stream, pb_field_iter_t *field);\nstatic bool checkreturn encode_extension_field(pb_ostream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn default_extension_encoder(pb_ostream_t *stream, const pb_extension_t *extension);\nstatic bool checkreturn pb_encode_varint_32(pb_ostream_t *stream, uint32_t low, uint32_t high);\nstatic bool checkreturn pb_enc_bool(pb_ostream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn pb_enc_varint(pb_ostream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn pb_enc_fixed(pb_ostream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn pb_enc_bytes(pb_ostream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn pb_enc_string(pb_ostream_t *stream, const pb_field_iter_t 
*field);\nstatic bool checkreturn pb_enc_submessage(pb_ostream_t *stream, const pb_field_iter_t *field);\nstatic bool checkreturn pb_enc_fixed_length_bytes(pb_ostream_t *stream, const pb_field_iter_t *field);\n\n#ifdef PB_WITHOUT_64BIT\n#define pb_int64_t int32_t\n#define pb_uint64_t uint32_t\n#else\n#define pb_int64_t int64_t\n#define pb_uint64_t uint64_t\n#endif\n\n/*******************************\n * pb_ostream_t implementation *\n *******************************/\n\nstatic bool checkreturn buf_write(pb_ostream_t *stream, const pb_byte_t *buf, size_t count)\n{\n    size_t i;\n    pb_byte_t *dest = (pb_byte_t*)stream->state;\n    stream->state = dest + count;\n    \n    for (i = 0; i < count; i++)\n        dest[i] = buf[i];\n    \n    return true;\n}\n\npb_ostream_t pb_ostream_from_buffer(pb_byte_t *buf, size_t bufsize)\n{\n    pb_ostream_t stream;\n#ifdef PB_BUFFER_ONLY\n    stream.callback = (void*)1; /* Just a marker value */\n#else\n    stream.callback = &buf_write;\n#endif\n    stream.state = buf;\n    stream.max_size = bufsize;\n    stream.bytes_written = 0;\n#ifndef PB_NO_ERRMSG\n    stream.errmsg = NULL;\n#endif\n    return stream;\n}\n\nbool checkreturn pb_write(pb_ostream_t *stream, const pb_byte_t *buf, size_t count)\n{\n    if (count > 0 && stream->callback != NULL)\n    {\n        if (stream->bytes_written + count > stream->max_size)\n            PB_RETURN_ERROR(stream, \"stream full\");\n\n#ifdef PB_BUFFER_ONLY\n        if (!buf_write(stream, buf, count))\n            PB_RETURN_ERROR(stream, \"io error\");\n#else        \n        if (!stream->callback(stream, buf, count))\n            PB_RETURN_ERROR(stream, \"io error\");\n#endif\n    }\n    \n    stream->bytes_written += count;\n    return true;\n}\n\n/*************************\n * Encode a single field *\n *************************/\n\n/* Read a bool value without causing undefined behavior even if the value\n * is invalid. 
See issue #434 and\n * https://stackoverflow.com/questions/27661768/weird-results-for-conditional\n */\nstatic bool safe_read_bool(const void *pSize)\n{\n    const char *p = (const char *)pSize;\n    size_t i;\n    for (i = 0; i < sizeof(bool); i++)\n    {\n        if (p[i] != 0)\n            return true;\n    }\n    return false;\n}\n\n/* Encode a static array. Handles the size calculations and possible packing. */\nstatic bool checkreturn encode_array(pb_ostream_t *stream, pb_field_iter_t *field)\n{\n    pb_size_t i;\n    pb_size_t count;\n#ifndef PB_ENCODE_ARRAYS_UNPACKED\n    size_t size;\n#endif\n\n    count = *(pb_size_t*)field->pSize;\n\n    if (count == 0)\n        return true;\n\n    if (PB_ATYPE(field->type) != PB_ATYPE_POINTER && count > field->array_size)\n        PB_RETURN_ERROR(stream, \"array max size exceeded\");\n    \n#ifndef PB_ENCODE_ARRAYS_UNPACKED\n    /* We always pack arrays if the datatype allows it. */\n    if (PB_LTYPE(field->type) <= PB_LTYPE_LAST_PACKABLE)\n    {\n        if (!pb_encode_tag(stream, PB_WT_STRING, field->tag))\n            return false;\n        \n        /* Determine the total size of packed array. 
*/\n        if (PB_LTYPE(field->type) == PB_LTYPE_FIXED32)\n        {\n            size = 4 * (size_t)count;\n        }\n        else if (PB_LTYPE(field->type) == PB_LTYPE_FIXED64)\n        {\n            size = 8 * (size_t)count;\n        }\n        else\n        { \n            pb_ostream_t sizestream = PB_OSTREAM_SIZING;\n            void *pData_orig = field->pData;\n            for (i = 0; i < count; i++)\n            {\n                if (!pb_enc_varint(&sizestream, field))\n                    PB_RETURN_ERROR(stream, PB_GET_ERROR(&sizestream));\n                field->pData = (char*)field->pData + field->data_size;\n            }\n            field->pData = pData_orig;\n            size = sizestream.bytes_written;\n        }\n        \n        if (!pb_encode_varint(stream, (pb_uint64_t)size))\n            return false;\n        \n        if (stream->callback == NULL)\n            return pb_write(stream, NULL, size); /* Just sizing.. */\n        \n        /* Write the data */\n        for (i = 0; i < count; i++)\n        {\n            if (PB_LTYPE(field->type) == PB_LTYPE_FIXED32 || PB_LTYPE(field->type) == PB_LTYPE_FIXED64)\n            {\n                if (!pb_enc_fixed(stream, field))\n                    return false;\n            }\n            else\n            {\n                if (!pb_enc_varint(stream, field))\n                    return false;\n            }\n\n            field->pData = (char*)field->pData + field->data_size;\n        }\n    }\n    else /* Unpacked fields */\n#endif\n    {\n        for (i = 0; i < count; i++)\n        {\n            /* Normally the data is stored directly in the array entries, but\n             * for pointer-type string and bytes fields, the array entries are\n             * actually pointers themselves also. So we have to dereference once\n             * more to get to the actual data. 
*/\n            if (PB_ATYPE(field->type) == PB_ATYPE_POINTER &&\n                (PB_LTYPE(field->type) == PB_LTYPE_STRING ||\n                 PB_LTYPE(field->type) == PB_LTYPE_BYTES))\n            {\n                bool status;\n                void *pData_orig = field->pData;\n                field->pData = *(void* const*)field->pData;\n\n                if (!field->pData)\n                {\n                    /* Null pointer in array is treated as empty string / bytes */\n                    status = pb_encode_tag_for_field(stream, field) &&\n                             pb_encode_varint(stream, 0);\n                }\n                else\n                {\n                    status = encode_basic_field(stream, field);\n                }\n\n                field->pData = pData_orig;\n\n                if (!status)\n                    return false;\n            }\n            else\n            {\n                if (!encode_basic_field(stream, field))\n                    return false;\n            }\n            field->pData = (char*)field->pData + field->data_size;\n        }\n    }\n    \n    return true;\n}\n\n/* In proto3, all fields are optional and are only encoded if their value is \"non-zero\".\n * This function implements the check for the zero value. 
*/\nstatic bool checkreturn pb_check_proto3_default_value(const pb_field_iter_t *field)\n{\n    pb_type_t type = field->type;\n\n    if (PB_ATYPE(type) == PB_ATYPE_STATIC)\n    {\n        if (PB_HTYPE(type) == PB_HTYPE_REQUIRED)\n        {\n            /* Required proto2 fields inside proto3 submessage, pretty rare case */\n            return false;\n        }\n        else if (PB_HTYPE(type) == PB_HTYPE_REPEATED)\n        {\n            /* Repeated fields inside proto3 submessage: present if count != 0 */\n            return *(const pb_size_t*)field->pSize == 0;\n        }\n        else if (PB_HTYPE(type) == PB_HTYPE_ONEOF)\n        {\n            /* Oneof fields */\n            return *(const pb_size_t*)field->pSize == 0;\n        }\n        else if (PB_HTYPE(type) == PB_HTYPE_OPTIONAL && field->pSize != NULL)\n        {\n            /* Proto2 optional fields inside proto3 message, or proto3\n             * submessage fields. */\n            return safe_read_bool(field->pSize) == false;\n        }\n\n        /* Rest is proto3 singular fields */\n        if (PB_LTYPE(type) == PB_LTYPE_BYTES)\n        {\n            const pb_bytes_array_t *bytes = (const pb_bytes_array_t*)field->pData;\n            return bytes->size == 0;\n        }\n        else if (PB_LTYPE(type) == PB_LTYPE_STRING)\n        {\n            return *(const char*)field->pData == '\\0';\n        }\n        else if (PB_LTYPE(type) == PB_LTYPE_FIXED_LENGTH_BYTES)\n        {\n            /* Fixed length bytes is only empty if its length is fixed\n             * as 0. Which would be pretty strange, but we can check\n             * it anyway. */\n            return field->data_size == 0;\n        }\n        else if (PB_LTYPE_IS_SUBMSG(type))\n        {\n            /* Check all fields in the submessage to find if any of them\n             * are non-zero. The comparison cannot be done byte-per-byte\n             * because the C struct may contain padding bytes that must\n             * be skipped. 
Note that usually proto3 submessages have\n             * a separate has_field that is checked earlier in this if.\n             */\n            pb_field_iter_t iter;\n            if (pb_field_iter_begin(&iter, field->submsg_desc, field->pData))\n            {\n                do\n                {\n                    if (!pb_check_proto3_default_value(&iter))\n                    {\n                        return false;\n                    }\n                } while (pb_field_iter_next(&iter));\n            }\n            return true;\n        }\n    }\n    \n    {\n        /* Catch-all branch that does byte-per-byte comparison for zero value.\n         *\n         * This is for all pointer fields, and for static PB_LTYPE_VARINT,\n         * UVARINT, SVARINT, FIXED32, FIXED64, EXTENSION fields, and also\n         * callback fields. These all have integer or pointer value which\n         * can be compared with 0.\n         */\n        pb_size_t i;\n        const char *p = (const char*)field->pData;\n        for (i = 0; i < field->data_size; i++)\n        {\n            if (p[i] != 0)\n            {\n                return false;\n            }\n        }\n\n        return true;\n    }\n}\n\n/* Encode a field with static or pointer allocation, i.e. one whose data\n * is available to the encoder directly. 
*/\nstatic bool checkreturn encode_basic_field(pb_ostream_t *stream, const pb_field_iter_t *field)\n{\n    if (!field->pData)\n    {\n        /* Missing pointer field */\n        return true;\n    }\n\n    if (!pb_encode_tag_for_field(stream, field))\n        return false;\n\n    switch (PB_LTYPE(field->type))\n    {\n        case PB_LTYPE_BOOL:\n            return pb_enc_bool(stream, field);\n\n        case PB_LTYPE_VARINT:\n        case PB_LTYPE_UVARINT:\n        case PB_LTYPE_SVARINT:\n            return pb_enc_varint(stream, field);\n\n        case PB_LTYPE_FIXED32:\n        case PB_LTYPE_FIXED64:\n            return pb_enc_fixed(stream, field);\n\n        case PB_LTYPE_BYTES:\n            return pb_enc_bytes(stream, field);\n\n        case PB_LTYPE_STRING:\n            return pb_enc_string(stream, field);\n\n        case PB_LTYPE_SUBMESSAGE:\n        case PB_LTYPE_SUBMSG_W_CB:\n            return pb_enc_submessage(stream, field);\n\n        case PB_LTYPE_FIXED_LENGTH_BYTES:\n            return pb_enc_fixed_length_bytes(stream, field);\n\n        default:\n            PB_RETURN_ERROR(stream, \"invalid field type\");\n    }\n}\n\n/* Encode a field with callback semantics. This means that a user function is\n * called to provide and encode the actual data. */\nstatic bool checkreturn encode_callback_field(pb_ostream_t *stream, const pb_field_iter_t *field)\n{\n    if (field->descriptor->field_callback != NULL)\n    {\n        if (!field->descriptor->field_callback(NULL, stream, field))\n            PB_RETURN_ERROR(stream, \"callback error\");\n    }\n    return true;\n}\n\n/* Encode a single field of any callback, pointer or static type. 
*/\nstatic bool checkreturn encode_field(pb_ostream_t *stream, pb_field_iter_t *field)\n{\n    /* Check field presence */\n    if (PB_HTYPE(field->type) == PB_HTYPE_ONEOF)\n    {\n        if (*(const pb_size_t*)field->pSize != field->tag)\n        {\n            /* Different type oneof field */\n            return true;\n        }\n    }\n    else if (PB_HTYPE(field->type) == PB_HTYPE_OPTIONAL)\n    {\n        if (field->pSize)\n        {\n            if (safe_read_bool(field->pSize) == false)\n            {\n                /* Missing optional field */\n                return true;\n            }\n        }\n        else if (PB_ATYPE(field->type) == PB_ATYPE_STATIC)\n        {\n            /* Proto3 singular field */\n            if (pb_check_proto3_default_value(field))\n                return true;\n        }\n    }\n\n    if (!field->pData)\n    {\n        if (PB_HTYPE(field->type) == PB_HTYPE_REQUIRED)\n            PB_RETURN_ERROR(stream, \"missing required field\");\n\n        /* Pointer field set to NULL */\n        return true;\n    }\n\n    /* Then encode field contents */\n    if (PB_ATYPE(field->type) == PB_ATYPE_CALLBACK)\n    {\n        return encode_callback_field(stream, field);\n    }\n    else if (PB_HTYPE(field->type) == PB_HTYPE_REPEATED)\n    {\n        return encode_array(stream, field);\n    }\n    else\n    {\n        return encode_basic_field(stream, field);\n    }\n}\n\n/* Default handler for extension fields. Expects to have a pb_msgdesc_t\n * pointer in the extension->type->arg field, pointing to a message with\n * only one field in it.  */\nstatic bool checkreturn default_extension_encoder(pb_ostream_t *stream, const pb_extension_t *extension)\n{\n    pb_field_iter_t iter;\n\n    if (!pb_field_iter_begin_extension_const(&iter, extension))\n        PB_RETURN_ERROR(stream, \"invalid extension\");\n\n    return encode_field(stream, &iter);\n}\n\n\n/* Walk through all the registered extensions and give them a chance\n * to encode themselves. 
*/\nstatic bool checkreturn encode_extension_field(pb_ostream_t *stream, const pb_field_iter_t *field)\n{\n    const pb_extension_t *extension = *(const pb_extension_t* const *)field->pData;\n\n    while (extension)\n    {\n        bool status;\n        if (extension->type->encode)\n            status = extension->type->encode(stream, extension);\n        else\n            status = default_extension_encoder(stream, extension);\n\n        if (!status)\n            return false;\n        \n        extension = extension->next;\n    }\n    \n    return true;\n}\n\n/*********************\n * Encode all fields *\n *********************/\n\nbool checkreturn pb_encode(pb_ostream_t *stream, const pb_msgdesc_t *fields, const void *src_struct)\n{\n    pb_field_iter_t iter;\n    if (!pb_field_iter_begin_const(&iter, fields, src_struct))\n        return true; /* Empty message type */\n    \n    do {\n        if (PB_LTYPE(iter.type) == PB_LTYPE_EXTENSION)\n        {\n            /* Special case for the extension field placeholder */\n            if (!encode_extension_field(stream, &iter))\n                return false;\n        }\n        else\n        {\n            /* Regular field */\n            if (!encode_field(stream, &iter))\n                return false;\n        }\n    } while (pb_field_iter_next(&iter));\n    \n    return true;\n}\n\nbool checkreturn pb_encode_ex(pb_ostream_t *stream, const pb_msgdesc_t *fields, const void *src_struct, unsigned int flags)\n{\n  if ((flags & PB_ENCODE_DELIMITED) != 0)\n  {\n    return pb_encode_submessage(stream, fields, src_struct);\n  }\n  else if ((flags & PB_ENCODE_NULLTERMINATED) != 0)\n  {\n    const pb_byte_t zero = 0;\n\n    if (!pb_encode(stream, fields, src_struct))\n        return false;\n\n    return pb_write(stream, &zero, 1);\n  }\n  else\n  {\n    return pb_encode(stream, fields, src_struct);\n  }\n}\n\nbool pb_get_encoded_size(size_t *size, const pb_msgdesc_t *fields, const void *src_struct)\n{\n    pb_ostream_t stream 
= PB_OSTREAM_SIZING;\n    \n    if (!pb_encode(&stream, fields, src_struct))\n        return false;\n    \n    *size = stream.bytes_written;\n    return true;\n}\n\n/********************\n * Helper functions *\n ********************/\n\n/* This function avoids 64-bit shifts as they are quite slow on many platforms. */\nstatic bool checkreturn pb_encode_varint_32(pb_ostream_t *stream, uint32_t low, uint32_t high)\n{\n    size_t i = 0;\n    pb_byte_t buffer[10];\n    pb_byte_t byte = (pb_byte_t)(low & 0x7F);\n    low >>= 7;\n\n    while (i < 4 && (low != 0 || high != 0))\n    {\n        byte |= 0x80;\n        buffer[i++] = byte;\n        byte = (pb_byte_t)(low & 0x7F);\n        low >>= 7;\n    }\n\n    if (high)\n    {\n        byte = (pb_byte_t)(byte | ((high & 0x07) << 4));\n        high >>= 3;\n\n        while (high)\n        {\n            byte |= 0x80;\n            buffer[i++] = byte;\n            byte = (pb_byte_t)(high & 0x7F);\n            high >>= 7;\n        }\n    }\n\n    buffer[i++] = byte;\n\n    return pb_write(stream, buffer, i);\n}\n\nbool checkreturn pb_encode_varint(pb_ostream_t *stream, pb_uint64_t value)\n{\n    if (value <= 0x7F)\n    {\n        /* Fast path: single byte */\n        pb_byte_t byte = (pb_byte_t)value;\n        return pb_write(stream, &byte, 1);\n    }\n    else\n    {\n#ifdef PB_WITHOUT_64BIT\n        return pb_encode_varint_32(stream, value, 0);\n#else\n        return pb_encode_varint_32(stream, (uint32_t)value, (uint32_t)(value >> 32));\n#endif\n    }\n}\n\nbool checkreturn pb_encode_svarint(pb_ostream_t *stream, pb_int64_t value)\n{\n    pb_uint64_t zigzagged;\n    if (value < 0)\n        zigzagged = ~((pb_uint64_t)value << 1);\n    else\n        zigzagged = (pb_uint64_t)value << 1;\n    \n    return pb_encode_varint(stream, zigzagged);\n}\n\nbool checkreturn pb_encode_fixed32(pb_ostream_t *stream, const void *value)\n{\n    uint32_t val = *(const uint32_t*)value;\n    pb_byte_t bytes[4];\n    bytes[0] = (pb_byte_t)(val & 
0xFF);\n    bytes[1] = (pb_byte_t)((val >> 8) & 0xFF);\n    bytes[2] = (pb_byte_t)((val >> 16) & 0xFF);\n    bytes[3] = (pb_byte_t)((val >> 24) & 0xFF);\n    return pb_write(stream, bytes, 4);\n}\n\n#ifndef PB_WITHOUT_64BIT\nbool checkreturn pb_encode_fixed64(pb_ostream_t *stream, const void *value)\n{\n    uint64_t val = *(const uint64_t*)value;\n    pb_byte_t bytes[8];\n    bytes[0] = (pb_byte_t)(val & 0xFF);\n    bytes[1] = (pb_byte_t)((val >> 8) & 0xFF);\n    bytes[2] = (pb_byte_t)((val >> 16) & 0xFF);\n    bytes[3] = (pb_byte_t)((val >> 24) & 0xFF);\n    bytes[4] = (pb_byte_t)((val >> 32) & 0xFF);\n    bytes[5] = (pb_byte_t)((val >> 40) & 0xFF);\n    bytes[6] = (pb_byte_t)((val >> 48) & 0xFF);\n    bytes[7] = (pb_byte_t)((val >> 56) & 0xFF);\n    return pb_write(stream, bytes, 8);\n}\n#endif\n\nbool checkreturn pb_encode_tag(pb_ostream_t *stream, pb_wire_type_t wiretype, uint32_t field_number)\n{\n    pb_uint64_t tag = ((pb_uint64_t)field_number << 3) | wiretype;\n    return pb_encode_varint(stream, tag);\n}\n\nbool pb_encode_tag_for_field(pb_ostream_t *stream, const pb_field_iter_t *field)\n{\n    pb_wire_type_t wiretype;\n    switch (PB_LTYPE(field->type))\n    {\n        case PB_LTYPE_BOOL:\n        case PB_LTYPE_VARINT:\n        case PB_LTYPE_UVARINT:\n        case PB_LTYPE_SVARINT:\n            wiretype = PB_WT_VARINT;\n            break;\n        \n        case PB_LTYPE_FIXED32:\n            wiretype = PB_WT_32BIT;\n            break;\n        \n        case PB_LTYPE_FIXED64:\n            wiretype = PB_WT_64BIT;\n            break;\n        \n        case PB_LTYPE_BYTES:\n        case PB_LTYPE_STRING:\n        case PB_LTYPE_SUBMESSAGE:\n        case PB_LTYPE_SUBMSG_W_CB:\n        case PB_LTYPE_FIXED_LENGTH_BYTES:\n            wiretype = PB_WT_STRING;\n            break;\n        \n        default:\n            PB_RETURN_ERROR(stream, \"invalid field type\");\n    }\n    \n    return pb_encode_tag(stream, wiretype, field->tag);\n}\n\nbool checkreturn 
pb_encode_string(pb_ostream_t *stream, const pb_byte_t *buffer, size_t size)\n{\n    if (!pb_encode_varint(stream, (pb_uint64_t)size))\n        return false;\n    \n    return pb_write(stream, buffer, size);\n}\n\nbool checkreturn pb_encode_submessage(pb_ostream_t *stream, const pb_msgdesc_t *fields, const void *src_struct)\n{\n    /* First calculate the message size using a non-writing substream. */\n    pb_ostream_t substream = PB_OSTREAM_SIZING;\n    size_t size;\n    bool status;\n    \n    if (!pb_encode(&substream, fields, src_struct))\n    {\n#ifndef PB_NO_ERRMSG\n        stream->errmsg = substream.errmsg;\n#endif\n        return false;\n    }\n    \n    size = substream.bytes_written;\n    \n    if (!pb_encode_varint(stream, (pb_uint64_t)size))\n        return false;\n    \n    if (stream->callback == NULL)\n        return pb_write(stream, NULL, size); /* Just sizing */\n    \n    if (stream->bytes_written + size > stream->max_size)\n        PB_RETURN_ERROR(stream, \"stream full\");\n        \n    /* Use a substream to verify that a callback doesn't write more than\n     * what it did the first time. */\n    substream.callback = stream->callback;\n    substream.state = stream->state;\n    substream.max_size = size;\n    substream.bytes_written = 0;\n#ifndef PB_NO_ERRMSG\n    substream.errmsg = NULL;\n#endif\n    \n    status = pb_encode(&substream, fields, src_struct);\n    \n    stream->bytes_written += substream.bytes_written;\n    stream->state = substream.state;\n#ifndef PB_NO_ERRMSG\n    stream->errmsg = substream.errmsg;\n#endif\n    \n    if (substream.bytes_written != size)\n        PB_RETURN_ERROR(stream, \"submsg size changed\");\n    \n    return status;\n}\n\n/* Field encoders */\n\nstatic bool checkreturn pb_enc_bool(pb_ostream_t *stream, const pb_field_iter_t *field)\n{\n    uint32_t value = safe_read_bool(field->pData) ? 
1 : 0;\n    PB_UNUSED(field);\n    return pb_encode_varint(stream, value);\n}\n\nstatic bool checkreturn pb_enc_varint(pb_ostream_t *stream, const pb_field_iter_t *field)\n{\n    if (PB_LTYPE(field->type) == PB_LTYPE_UVARINT)\n    {\n        /* Perform unsigned integer extension */\n        pb_uint64_t value = 0;\n\n        if (field->data_size == sizeof(uint_least8_t))\n            value = *(const uint_least8_t*)field->pData;\n        else if (field->data_size == sizeof(uint_least16_t))\n            value = *(const uint_least16_t*)field->pData;\n        else if (field->data_size == sizeof(uint32_t))\n            value = *(const uint32_t*)field->pData;\n        else if (field->data_size == sizeof(pb_uint64_t))\n            value = *(const pb_uint64_t*)field->pData;\n        else\n            PB_RETURN_ERROR(stream, \"invalid data_size\");\n\n        return pb_encode_varint(stream, value);\n    }\n    else\n    {\n        /* Perform signed integer extension */\n        pb_int64_t value = 0;\n\n        if (field->data_size == sizeof(int_least8_t))\n            value = *(const int_least8_t*)field->pData;\n        else if (field->data_size == sizeof(int_least16_t))\n            value = *(const int_least16_t*)field->pData;\n        else if (field->data_size == sizeof(int32_t))\n            value = *(const int32_t*)field->pData;\n        else if (field->data_size == sizeof(pb_int64_t))\n            value = *(const pb_int64_t*)field->pData;\n        else\n            PB_RETURN_ERROR(stream, \"invalid data_size\");\n\n        if (PB_LTYPE(field->type) == PB_LTYPE_SVARINT)\n            return pb_encode_svarint(stream, value);\n#ifdef PB_WITHOUT_64BIT\n        else if (value < 0)\n            return pb_encode_varint_32(stream, (uint32_t)value, (uint32_t)-1);\n#endif\n        else\n            return pb_encode_varint(stream, (pb_uint64_t)value);\n\n    }\n}\n\nstatic bool checkreturn pb_enc_fixed(pb_ostream_t *stream, const pb_field_iter_t *field)\n{\n#ifdef 
PB_CONVERT_DOUBLE_FLOAT\n    if (field->data_size == sizeof(float) && PB_LTYPE(field->type) == PB_LTYPE_FIXED64)\n    {\n        return pb_encode_float_as_double(stream, *(float*)field->pData);\n    }\n#endif\n\n    if (field->data_size == sizeof(uint32_t))\n    {\n        return pb_encode_fixed32(stream, field->pData);\n    }\n#ifndef PB_WITHOUT_64BIT\n    else if (field->data_size == sizeof(uint64_t))\n    {\n        return pb_encode_fixed64(stream, field->pData);\n    }\n#endif\n    else\n    {\n        PB_RETURN_ERROR(stream, \"invalid data_size\");\n    }\n}\n\nstatic bool checkreturn pb_enc_bytes(pb_ostream_t *stream, const pb_field_iter_t *field)\n{\n    const pb_bytes_array_t *bytes = NULL;\n\n    bytes = (const pb_bytes_array_t*)field->pData;\n    \n    if (bytes == NULL)\n    {\n        /* Treat null pointer as an empty bytes field */\n        return pb_encode_string(stream, NULL, 0);\n    }\n    \n    if (PB_ATYPE(field->type) == PB_ATYPE_STATIC &&\n        PB_BYTES_ARRAY_T_ALLOCSIZE(bytes->size) > field->data_size)\n    {\n        PB_RETURN_ERROR(stream, \"bytes size exceeded\");\n    }\n    \n    return pb_encode_string(stream, bytes->bytes, (size_t)bytes->size);\n}\n\nstatic bool checkreturn pb_enc_string(pb_ostream_t *stream, const pb_field_iter_t *field)\n{\n    size_t size = 0;\n    size_t max_size = (size_t)field->data_size;\n    const char *str = (const char*)field->pData;\n    \n    if (PB_ATYPE(field->type) == PB_ATYPE_POINTER)\n    {\n        max_size = (size_t)-1;\n    }\n    else\n    {\n        /* pb_dec_string() assumes string fields end with a null\n         * terminator when the type isn't PB_ATYPE_POINTER, so we\n         * shouldn't allow more than max-1 bytes to be written to\n         * allow space for the null terminator.\n         */\n        if (max_size == 0)\n            PB_RETURN_ERROR(stream, \"zero-length string\");\n\n        max_size -= 1;\n    }\n\n\n    if (str == NULL)\n    {\n        size = 0; /* Treat null pointer as 
an empty string */\n    }\n    else\n    {\n        const char *p = str;\n\n        /* strnlen() is not always available, so just use a loop */\n        while (size < max_size && *p != '\\0')\n        {\n            size++;\n            p++;\n        }\n\n        if (*p != '\\0')\n        {\n            PB_RETURN_ERROR(stream, \"unterminated string\");\n        }\n    }\n\n#ifdef PB_VALIDATE_UTF8\n    if (!pb_validate_utf8(str))\n        PB_RETURN_ERROR(stream, \"invalid utf8\");\n#endif\n\n    return pb_encode_string(stream, (const pb_byte_t*)str, size);\n}\n\nstatic bool checkreturn pb_enc_submessage(pb_ostream_t *stream, const pb_field_iter_t *field)\n{\n    if (field->submsg_desc == NULL)\n        PB_RETURN_ERROR(stream, \"invalid field descriptor\");\n\n    if (PB_LTYPE(field->type) == PB_LTYPE_SUBMSG_W_CB && field->pSize != NULL)\n    {\n        /* Message callback is stored right before pSize. */\n        pb_callback_t *callback = (pb_callback_t*)field->pSize - 1;\n        if (callback->funcs.encode)\n        {\n            if (!callback->funcs.encode(stream, field, &callback->arg))\n                return false;\n        }\n    }\n    \n    return pb_encode_submessage(stream, field->submsg_desc, field->pData);\n}\n\nstatic bool checkreturn pb_enc_fixed_length_bytes(pb_ostream_t *stream, const pb_field_iter_t *field)\n{\n    return pb_encode_string(stream, (const pb_byte_t*)field->pData, (size_t)field->data_size);\n}\n\n#ifdef PB_CONVERT_DOUBLE_FLOAT\nbool pb_encode_float_as_double(pb_ostream_t *stream, float value)\n{\n    union { float f; uint32_t i; } in;\n    uint_least8_t sign;\n    int exponent;\n    uint64_t mantissa;\n\n    in.f = value;\n\n    /* Decompose input value */\n    sign = (uint_least8_t)((in.i >> 31) & 1);\n    exponent = (int)((in.i >> 23) & 0xFF) - 127;\n    mantissa = in.i & 0x7FFFFF;\n\n    if (exponent == 128)\n    {\n        /* Special value (NaN etc.) 
*/\n        exponent = 1024;\n    }\n    else if (exponent == -127)\n    {\n        if (!mantissa)\n        {\n            /* Zero */\n            exponent = -1023;\n        }\n        else\n        {\n            /* Denormalized */\n            mantissa <<= 1;\n            while (!(mantissa & 0x800000))\n            {\n                mantissa <<= 1;\n                exponent--;\n            }\n            mantissa &= 0x7FFFFF;\n        }\n    }\n\n    /* Combine fields */\n    mantissa <<= 29;\n    mantissa |= (uint64_t)(exponent + 1023) << 52;\n    mantissa |= (uint64_t)sign << 63;\n\n    return pb_encode_fixed64(stream, &mantissa);\n}\n#endif\n"
  },
  {
    "path": "c/core/src/tahu.c",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2019 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\n#include <stdio.h>\n#include <stdlib.h>\n#include <stdbool.h>\n#include <pb_decode.h>\n#include <pb_encode.h>\n#include <tahu.h>\n#include <tahu.pb.h>\n\nstatic uint8_t payload_sequence;\n\nint add_metadata_to_metric(org_eclipse_tahu_protobuf_Payload_Metric *metric,\n                           org_eclipse_tahu_protobuf_Payload_MetaData *metadata) {\n    DEBUG_PRINT(\"Adding metadata...\\n\");\n    metric->has_metadata = true;\n    memcpy(&metric->metadata, metadata, sizeof(metric->metadata));\n    return 0;\n}\n\nint add_metric_to_payload(org_eclipse_tahu_protobuf_Payload *payload,\n                          org_eclipse_tahu_protobuf_Payload_Metric *metric) {\n    DEBUG_PRINT(\"Adding metric to payload...\\n\");\n    const int old_count = payload->metrics_count;\n    const int new_count = (old_count + 1);\n    const size_t new_allocation_size = sizeof(org_eclipse_tahu_protobuf_Payload_Metric) * new_count;\n    void *realloc_result = realloc(payload->metrics, new_allocation_size);\n    //DEBUG_PRINT(\"realloc_result=%p\\n\", realloc_result);\n    if (realloc_result == NULL) {\n        fprintf(stderr, \"realloc failed in add_metric_to_payload\\n\");\n        return -1;\n    }\n    payload->metrics = realloc_result;\n    payload->metrics_count = new_count;\n    memcpy(&payload->metrics[old_count], metric, sizeof(org_eclipse_tahu_protobuf_Payload_Metric));\n    return 0;\n}\n\nint 
set_propertyvalue(org_eclipse_tahu_protobuf_Payload_PropertyValue *propertyvalue,\n                      uint32_t datatype,\n                      const void *value,\n                      size_t size) {\n    DEBUG_PRINT(\"Set property value...\\n\");\n    switch (datatype) {\n    case PROPERTY_DATA_TYPE_INT8:\n        propertyvalue->which_value = org_eclipse_tahu_protobuf_Payload_PropertyValue_int_value_tag;\n        propertyvalue->value.int_value = *(int8_t *)value;\n        break;\n    case PROPERTY_DATA_TYPE_INT16:\n        propertyvalue->which_value = org_eclipse_tahu_protobuf_Payload_PropertyValue_int_value_tag;\n        propertyvalue->value.int_value = *(int16_t *)value;\n        break;\n    case PROPERTY_DATA_TYPE_INT32:\n        propertyvalue->which_value = org_eclipse_tahu_protobuf_Payload_PropertyValue_int_value_tag;\n        propertyvalue->value.int_value = *(int32_t *)value;\n        break;\n    case PROPERTY_DATA_TYPE_INT64:\n        propertyvalue->which_value = org_eclipse_tahu_protobuf_Payload_PropertyValue_long_value_tag;\n        propertyvalue->value.long_value = *(int64_t *)value;\n        break;\n    case PROPERTY_DATA_TYPE_UINT8:\n        propertyvalue->which_value = org_eclipse_tahu_protobuf_Payload_PropertyValue_int_value_tag;\n        propertyvalue->value.int_value = *(uint8_t *)value;\n        break;\n    case PROPERTY_DATA_TYPE_UINT16:\n        propertyvalue->which_value = org_eclipse_tahu_protobuf_Payload_PropertyValue_int_value_tag;\n        propertyvalue->value.int_value = *(uint16_t *)value;\n        break;\n    case PROPERTY_DATA_TYPE_UINT32:\n        propertyvalue->which_value = org_eclipse_tahu_protobuf_Payload_PropertyValue_long_value_tag;\n        propertyvalue->value.long_value = *(uint32_t *)value;\n        break;\n    case PROPERTY_DATA_TYPE_UINT64:\n    case PROPERTY_DATA_TYPE_DATETIME:\n        propertyvalue->which_value = org_eclipse_tahu_protobuf_Payload_PropertyValue_long_value_tag;\n        propertyvalue->value.long_value 
= *(uint64_t *)value;\n        break;\n    case PROPERTY_DATA_TYPE_FLOAT:\n        propertyvalue->which_value = org_eclipse_tahu_protobuf_Payload_PropertyValue_float_value_tag;\n        propertyvalue->value.float_value = *(float *)value;\n        break;\n    case PROPERTY_DATA_TYPE_DOUBLE:\n        propertyvalue->which_value = org_eclipse_tahu_protobuf_Payload_PropertyValue_double_value_tag;\n        propertyvalue->value.double_value = *(double *)value;\n        break;\n    case PROPERTY_DATA_TYPE_BOOLEAN:\n        propertyvalue->which_value = org_eclipse_tahu_protobuf_Payload_PropertyValue_boolean_value_tag;\n        propertyvalue->value.boolean_value = *(bool *)value;\n        break;\n    case PROPERTY_DATA_TYPE_STRING:\n    case PROPERTY_DATA_TYPE_TEXT:\n        propertyvalue->which_value = org_eclipse_tahu_protobuf_Payload_PropertyValue_string_value_tag;\n        propertyvalue->value.string_value = strndup(value, size);\n        break;\n    default:\n        fprintf(stderr, \"Invalid datatype(%u) in set_propertyvalue\\n\", datatype);\n        return -1;\n    }\n    return 0;\n}\n\nint add_property_to_set(org_eclipse_tahu_protobuf_Payload_PropertySet *propertyset,\n                        const char *key,\n                        uint32_t datatype,\n                        const void *value,\n                        size_t size_of_value) {\n    DEBUG_PRINT(\"Add property to set...\\n\");\n    if (propertyset->keys_count != propertyset->values_count) {\n        fprintf(stderr, \"Mismatched key/value counts in add_property_to_set\\n\");\n        return -1;\n    }\n    const int old_count = propertyset->keys_count;\n    const int new_count = (old_count + 1);\n    const size_t key_allocation_size = sizeof(char *) * new_count;\n    const size_t value_allocation_size = sizeof(org_eclipse_tahu_protobuf_Payload_PropertyValue) * new_count;\n    void *key_allocation_result = realloc(propertyset->keys, key_allocation_size);\n    void *value_allocation_result = 
realloc(propertyset->values, value_allocation_size);\n    //DEBUG_PRINT(\"key=%p value=%p\\n\", key_allocation_result, value_allocation_result);\n    if ((key_allocation_result == NULL) || (value_allocation_result == NULL)) {\n        fprintf(stderr, \"realloc failed in add_property_to_set\\n\");\n        return -1;\n    }\n    propertyset->keys = key_allocation_result;\n    propertyset->keys_count = new_count;\n    propertyset->values = value_allocation_result;\n    propertyset->values_count = new_count;\n    propertyset->keys[old_count] = strdup(key);\n    if (propertyset->keys[old_count] == NULL) {\n        fprintf(stderr, \"strdup failed in add_property_to_set\\n\");\n        return -1;\n    }\n    memset(&propertyset->values[old_count], 0, sizeof(org_eclipse_tahu_protobuf_Payload_PropertyValue));\n    propertyset->values[old_count].has_type = true;\n    propertyset->values[old_count].type = datatype;\n    if (value == NULL) {\n        propertyset->values[old_count].has_is_null = true;\n        propertyset->values[old_count].is_null = true;\n    } else {\n        set_propertyvalue(&propertyset->values[old_count], datatype, value, size_of_value);\n    }\n    return 0;\n}\n\nint add_propertyset_to_metric(org_eclipse_tahu_protobuf_Payload_Metric *metric,\n                              org_eclipse_tahu_protobuf_Payload_PropertySet *properties) {\n    DEBUG_PRINT(\"Add propertyset to metric...\\n\");\n    metric->has_properties = true;\n    memcpy(&metric->properties, properties, sizeof(metric->properties));\n    return 0;\n}\n\nint set_metric_value(org_eclipse_tahu_protobuf_Payload_Metric *metric, uint32_t datatype, const void *value, size_t size) {\n    DEBUG_PRINT(\"Set metric value...\\n\");\n    switch (datatype) {\n    case METRIC_DATA_TYPE_INT8:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_int_value_tag;\n        metric->value.int_value = *(int8_t *)value;\n        break;\n    case METRIC_DATA_TYPE_INT16:\n        
metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_int_value_tag;\n        metric->value.int_value = *(int16_t *)value;\n        break;\n    case METRIC_DATA_TYPE_INT32:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_int_value_tag;\n        metric->value.int_value = *(int32_t *)value;\n        break;\n    case METRIC_DATA_TYPE_INT64:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_long_value_tag;\n        metric->value.long_value = *(int64_t *)value;\n        break;\n    case METRIC_DATA_TYPE_UINT8:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_int_value_tag;\n        metric->value.int_value = *(uint8_t *)value;\n        break;\n    case METRIC_DATA_TYPE_UINT16:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_int_value_tag;\n        metric->value.int_value = *(uint16_t *)value;\n        break;\n    case METRIC_DATA_TYPE_UINT32:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_long_value_tag;\n        metric->value.long_value = *(uint32_t *)value;\n        break;\n    case METRIC_DATA_TYPE_UINT64:\n    case METRIC_DATA_TYPE_DATETIME:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_long_value_tag;\n        metric->value.long_value = *(uint64_t *)value;\n        break;\n    case METRIC_DATA_TYPE_FLOAT:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_float_value_tag;\n        metric->value.float_value = *(float *)value;\n        break;\n    case METRIC_DATA_TYPE_DOUBLE:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_double_value_tag;\n        metric->value.double_value = *(double *)value;\n        break;\n    case METRIC_DATA_TYPE_BOOLEAN:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_boolean_value_tag;\n        metric->value.boolean_value = *(bool *)value;\n        break;\n    case METRIC_DATA_TYPE_STRING:\n    case 
METRIC_DATA_TYPE_TEXT:\n    case METRIC_DATA_TYPE_UUID:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_string_value_tag;\n        metric->value.string_value = strndup(value, size);\n        break;\n    case METRIC_DATA_TYPE_DATASET:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_dataset_value_tag;\n        memcpy(&metric->value.dataset_value, value, sizeof(metric->value.dataset_value));\n        break;\n    case METRIC_DATA_TYPE_TEMPLATE:\n        metric->which_value = org_eclipse_tahu_protobuf_Payload_Metric_template_value_tag;\n        memcpy(&metric->value.template_value, value, sizeof(metric->value.template_value));\n        break;\n    case METRIC_DATA_TYPE_BYTES:\n    case METRIC_DATA_TYPE_FILE:\n    case METRIC_DATA_TYPE_UNKNOWN:\n    default:\n        fprintf(stderr, \"Unhandled datatype(%u) in set_metric_value\\n\", datatype);\n        return -1;\n    }\n    return 0;\n}\n\nint add_simple_metric(org_eclipse_tahu_protobuf_Payload *payload,\n                      const char *name,\n                      bool has_alias,\n                      uint64_t alias,\n                      uint64_t datatype,\n                      bool is_historical,\n                      bool is_transient,\n                      const void *value,\n                      size_t size_of_value) {\n    DEBUG_PRINT(\"Add simple metric...\\n\");\n    org_eclipse_tahu_protobuf_Payload_Metric new_metric;\n    memset(&new_metric, 0, sizeof(new_metric));\n    if (name != NULL) {\n        new_metric.name = strdup(name);\n        if (new_metric.name == NULL) {\n            fprintf(stderr, \"strdup name failed in add_simple_metric\\n\");\n            return -1;\n        }\n    }\n    new_metric.has_alias = has_alias;\n    new_metric.alias = alias;\n    new_metric.has_timestamp = true;\n    new_metric.timestamp = get_current_timestamp();\n    new_metric.has_datatype = true;\n    new_metric.datatype = datatype;\n    if (is_historical) {\n        
new_metric.has_is_historical = true;\n        new_metric.is_historical = true;\n    }\n    if (is_transient) {\n        new_metric.has_is_transient = true;\n        new_metric.is_transient = true;\n    }\n    if (value == NULL) {\n        new_metric.has_is_null = true;\n        new_metric.is_null = true;\n    } else {\n        set_metric_value(&new_metric, datatype, value, size_of_value);\n    }\n    add_metric_to_payload(payload, &new_metric);\n    return 0;\n}\n\nssize_t encode_payload(uint8_t *out_buffer,\n                       size_t buffer_length,\n                       const org_eclipse_tahu_protobuf_Payload *payload) {\n    // Use a different stream if the user wants a normal encode or just a size check\n    pb_ostream_t sizing_stream = PB_OSTREAM_SIZING;\n    pb_ostream_t buffer_stream = pb_ostream_from_buffer(out_buffer, buffer_length);\n    pb_ostream_t *node_stream = ((out_buffer == NULL) ? &sizing_stream : &buffer_stream);\n\n    // Encode the payload\n    DEBUG_PRINT(\"Encoding payload...\\n\");\n    const bool encode_result = pb_encode(node_stream, org_eclipse_tahu_protobuf_Payload_fields, payload);\n    const size_t message_length = node_stream->bytes_written;\n    DEBUG_PRINT(\"Message length: %zd\\n\", message_length);\n\n    // Error Check\n    if (!encode_result) {\n        fprintf(stderr, \"Encoding failed: %s\\n\", PB_GET_ERROR(node_stream));\n        return -1;\n    }\n\n    DEBUG_PRINT(\"Encoding succeeded\\n\");\n    return message_length;\n}\n\nssize_t decode_payload(org_eclipse_tahu_protobuf_Payload *payload,\n                       const uint8_t *in_buffer,\n                       size_t buffer_length) {\n    DEBUG_PRINT(\"Decoding payload...\\n\");\n    pb_istream_t node_stream = pb_istream_from_buffer(in_buffer, buffer_length);\n    memset(payload, 0, sizeof(org_eclipse_tahu_protobuf_Payload));\n    const bool decode_result = pb_decode(&node_stream, org_eclipse_tahu_protobuf_Payload_fields, payload);\n\n    if (!decode_result) {\n     
   fprintf(stderr, \"Decoding failed: %s\\n\", PB_GET_ERROR(&node_stream));\n        return -1;\n    }\n\n#ifdef SPARKPLUG_DEBUG\n    // Print the message data\n    print_payload(payload);\n#endif\n\n    return node_stream.bytes_left;\n}\n\nint free_payload(org_eclipse_tahu_protobuf_Payload *payload) {\n    DEBUG_PRINT(\"Free payload memory...\\n\");\n    pb_release(org_eclipse_tahu_protobuf_Payload_fields, payload);\n    return 0;\n}\n\nuint64_t get_current_timestamp() {\n    // Set the timestamp\n    struct timespec ts;\n#ifdef __MACH__ // OS X does not have clock_gettime, use clock_get_time\n    clock_serv_t cclock;\n    mach_timespec_t mts;\n    host_get_clock_service(mach_host_self(), CALENDAR_CLOCK, &cclock);\n    clock_get_time(cclock, &mts);\n    mach_port_deallocate(mach_task_self(), cclock);\n    ts.tv_sec = mts.tv_sec;\n    ts.tv_nsec = mts.tv_nsec;\n#else\n    clock_gettime(CLOCK_REALTIME, &ts);\n#endif\n    return ts.tv_sec * UINT64_C(1000) + ts.tv_nsec / 1000000;\n}\n\nvoid reset_sparkplug_sequence(void) {\n    payload_sequence = 0;\n}\n\nint get_next_payload(org_eclipse_tahu_protobuf_Payload *payload) {\n\n    // Initialize payload\n    DEBUG_PRINT(\"Current Sequence Number: %u\\n\", payload_sequence);\n    memset(payload, 0, sizeof(org_eclipse_tahu_protobuf_Payload));\n    payload->has_timestamp = true;\n    payload->timestamp = get_current_timestamp();\n    payload->has_seq = true;\n    payload->seq = payload_sequence;\n\n    // Increment/wrap the sequence number (stored in a U8, so it\n    // will wrap 255-to-0 automatically)\n    payload_sequence++;\n    return 0;\n}\n\nint init_dataset(org_eclipse_tahu_protobuf_Payload_DataSet *dataset,\n                 uint64_t num_of_rows,\n                 uint64_t num_of_columns,\n                 const uint32_t datatypes[],\n                 const char *column_keys[],\n                 const org_eclipse_tahu_protobuf_Payload_DataSet_Row row_data[]) {\n    DEBUG_PRINT(\"Init dataset...\\n\");\n    
memset(dataset, 0, sizeof(org_eclipse_tahu_protobuf_Payload_DataSet));\n    dataset->has_num_of_columns = true;\n    dataset->num_of_columns = num_of_columns;\n    dataset->columns_count = num_of_columns;\n    const size_t key_size = num_of_columns * sizeof(char *);\n    dataset->columns = malloc(key_size);\n    if (dataset->columns == NULL) {\n        fprintf(stderr, \"malloc(%lu) failure in init_dataset\\n\", key_size);\n        return -1;\n    }\n    for (int i = 0; i < num_of_columns; i++) {\n        dataset->columns[i] = strdup(column_keys[i]);\n        if (dataset->columns[i] == NULL) {\n            fprintf(stderr, \"strdup failed in init_dataset\\n\");\n            return -1;\n        }\n    }\n    dataset->types_count = num_of_columns;\n    const size_t datatypes_size = num_of_columns * sizeof(uint32_t);\n    dataset->types = malloc(datatypes_size);\n    if (dataset->types == NULL) {\n        fprintf(stderr, \"malloc(%lu) failure in init_dataset\\n\", datatypes_size);\n        return -1;\n    }\n    memcpy(dataset->types, datatypes, datatypes_size);\n    dataset->rows_count = num_of_rows;\n    const size_t row_data_size = num_of_rows * sizeof(org_eclipse_tahu_protobuf_Payload_DataSet_Row);\n    dataset->rows = malloc(row_data_size);\n    if (dataset->rows == NULL) {\n        fprintf(stderr, \"malloc(%lu) failure in init_dataset\\n\", row_data_size);\n        return -1;\n    }\n    memcpy(dataset->rows, row_data, row_data_size);\n    return 0;\n}\n\nint init_metric(org_eclipse_tahu_protobuf_Payload_Metric *metric,\n                const char *name,\n                bool has_alias,\n                uint64_t alias,\n                uint64_t datatype,\n                bool is_historical,\n                bool is_transient,\n                const void *value,\n                size_t size_of_value) {\n    DEBUG_PRINT(\"Init metric...\\n\");\n    memset(metric, 0, sizeof(org_eclipse_tahu_protobuf_Payload_Metric));\n    if (name != NULL) {\n        metric->name = 
strdup(name);\n        if (metric->name == NULL) {\n            fprintf(stderr, \"strdup failed to copy metric name\\n\");\n            return -1;\n        }\n    }\n    if (has_alias) {\n        metric->has_alias = true;\n        metric->alias = alias;\n    }\n    if (is_historical && !is_transient) {\n        metric->has_timestamp = true;\n        metric->timestamp = get_current_timestamp();\n    }\n    metric->has_datatype = true;\n    metric->datatype = datatype;\n    if (is_historical) {\n        metric->has_is_historical = true;\n        metric->is_historical = true;\n    }\n    if (is_transient) {\n        metric->has_is_transient = true;\n        metric->is_transient = true;\n    }\n    if (value == NULL) {\n        metric->has_is_null = true;\n        metric->is_null = true;\n    } else {\n        return set_metric_value(metric, datatype, value, size_of_value);\n    }\n    // No support for metadata or properties in this function...\n    return 0;\n}\n\n/*\n * Display a full Sparkplug Payload\n */\n#define PP(...) 
fprintf(stdout,__VA_ARGS__)\n#define EMPTY_PREFIX \"\"\nvoid print_metadata(const char *prefix, org_eclipse_tahu_protobuf_Payload_MetaData *metadata);\nvoid print_propertyvalue(const char *prefix, org_eclipse_tahu_protobuf_Payload_PropertyValue *value);\nvoid print_propertyset(const char *prefix, org_eclipse_tahu_protobuf_Payload_PropertySet *properties);\nvoid print_propertysetlist(const char *prefix, org_eclipse_tahu_protobuf_Payload_PropertySetList *propertysetlist);\nvoid print_dataset_row(const char *prefix, org_eclipse_tahu_protobuf_Payload_DataSet_Row *row);\nvoid print_datasetvalue(const char *prefix, org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue *dsvalue);\nvoid print_dataset(const char *prefix, org_eclipse_tahu_protobuf_Payload_DataSet *dataset_value);\nvoid print_template_parameter(const char *prefix, org_eclipse_tahu_protobuf_Payload_Template_Parameter *template_parameter);\nvoid print_template(const char *prefix, org_eclipse_tahu_protobuf_Payload_Template *template);\nvoid print_metric(const char *prefix, org_eclipse_tahu_protobuf_Payload_Metric *metric);\nvoid print_payload(org_eclipse_tahu_protobuf_Payload *payload);\n\nvoid print_metadata(const char *prefix, org_eclipse_tahu_protobuf_Payload_MetaData *metadata) {\n    if (prefix == NULL) {\n        prefix = EMPTY_PREFIX;\n    }\n    if (metadata->has_is_multi_part) {\n        PP(\"%sis_multi_part=%u\\n\", prefix, metadata->is_multi_part);\n    }\n    if (metadata->content_type != NULL) {\n        PP(\"%scontent_type=%s [%p]\\n\", prefix, metadata->content_type, metadata->content_type);\n    }\n    if (metadata->has_size) {\n        PP(\"%shas_size=%lu\\n\", prefix, metadata->size);\n    }\n    if (metadata->has_seq) {\n        PP(\"%sseq=%lu\\n\", prefix, metadata->seq);\n    }\n    if (metadata->file_name != NULL) {\n        PP(\"%sfile_name=%s [%p]\\n\", prefix, metadata->file_name, metadata->file_name);\n    }\n    if (metadata->file_type != NULL) {\n        PP(\"%sfile_type=%s 
[%p]\\n\", prefix, metadata->file_type, metadata->file_type);\n    }\n    if (metadata->md5 != NULL) {\n        PP(\"%smd5=%s [%p]\\n\", prefix, metadata->md5, metadata->md5);\n    }\n    if (metadata->description != NULL) {\n        PP(\"%sdescription=%s [%p]\\n\", prefix, metadata->description, metadata->description);\n    }\n    if (metadata->extensions != NULL) {\n        PP(\"%sextensions=[%p] (display not supported)\\n\", prefix, metadata->extensions);\n    }\n}\n\nvoid print_propertyvalue(const char *prefix, org_eclipse_tahu_protobuf_Payload_PropertyValue *value) {\n    char temp[64];\n    if (prefix == NULL) {\n        prefix = EMPTY_PREFIX;\n    }\n    if (value->has_type) {\n        PP(\"%stype=%u\\n\", prefix, value->type);\n    }\n    if (value->has_is_null) {\n        PP(\"%sis_null=%u\\n\", prefix, value->is_null);\n    }\n    switch (value->which_value) {\n    case org_eclipse_tahu_protobuf_Payload_PropertyValue_int_value_tag:\n        PP(\"%sint_value=%d\\n\", prefix, value->value.int_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_PropertyValue_long_value_tag:\n        PP(\"%slong_value=%ld\\n\", prefix, value->value.long_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_PropertyValue_float_value_tag:\n        PP(\"%sfloat_value=%f\\n\", prefix, value->value.float_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_PropertyValue_double_value_tag:\n        PP(\"%sdouble_value=%f\\n\", prefix, value->value.double_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_PropertyValue_boolean_value_tag:\n        PP(\"%sboolean_value=%u\\n\", prefix, value->value.boolean_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_PropertyValue_string_value_tag:\n        PP(\"%sstring_value=%s [%p]\\n\", prefix, value->value.string_value, value->value.string_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_PropertyValue_propertyset_value_tag:\n        
snprintf(temp, sizeof(temp), \"%spropertyset.\", prefix);\n        print_propertyset(temp, &value->value.propertyset_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_PropertyValue_propertysets_value_tag:\n        snprintf(temp, sizeof(temp), \"%spropertysets.\", prefix);\n        print_propertysetlist(temp, &value->value.propertysets_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_PropertyValue_extension_value_tag:\n        PP(\"%sextension_value=[%p] (display not supported)\\n\", prefix, value->value.extension_value.extensions);\n        break;\n    default:\n        PP(\"%sinvalid which_value=%u\\n\", prefix, value->which_value);\n    }\n}\n\nvoid print_propertyset(const char *prefix, org_eclipse_tahu_protobuf_Payload_PropertySet *properties) {\n    char temp[64];\n    if (prefix == NULL) {\n        prefix = EMPTY_PREFIX;\n    }\n    PP(\"%skeys=[%p] (count=%u)\\n\", prefix, properties->keys, properties->keys_count);\n    for (int i = 0; i < properties->keys_count; i++) {\n        PP(\"%s keys[%u]=%s [%p]\\n\", prefix, i, properties->keys[i], properties->keys[i]);\n    }\n    PP(\"%svalues=[%p] (count=%u)\\n\", prefix, properties->values, properties->values_count);\n    for (int i = 0; i < properties->values_count; i++) {\n        snprintf(temp, sizeof(temp), \"%svalues[%u].\", prefix, i);\n        print_propertyvalue(temp, &properties->values[i]);\n    }\n    if (properties->extensions != NULL) {\n        PP(\"%sextension=[%p] (display not supported)\\n\", prefix, properties->extensions);\n    }\n}\n\nvoid print_propertysetlist(const char *prefix, org_eclipse_tahu_protobuf_Payload_PropertySetList *propertysetlist) {\n    char temp[64];\n    if (prefix == NULL) {\n        prefix = EMPTY_PREFIX;\n    }\n    // pb_size_t propertyset_count;\n    // struct _org_eclipse_tahu_protobuf_Payload_PropertySet *propertyset;\n    PP(\"%spropertyset=[%p] (count=%u)\\n\", prefix, propertysetlist->propertyset, 
propertysetlist->propertyset_count);\n    for (int i = 0; i < propertysetlist->propertyset_count; i++) {\n        snprintf(temp, sizeof(temp), \"%spropertyset[%u].\", prefix, i);\n        print_propertyset(temp, &propertysetlist->propertyset[i]);\n    }\n    if (propertysetlist->extensions != NULL) {\n        PP(\"%sextensions=[%p] (display not supported)\\n\", prefix, propertysetlist->extensions);\n    }\n}\n\nvoid print_dataset_row(const char *prefix, org_eclipse_tahu_protobuf_Payload_DataSet_Row *row) {\n    char temp[64];\n    if (prefix == NULL) {\n        prefix = EMPTY_PREFIX;\n    }\n    PP(\"%selements=[%p] (count=%u)\\n\", prefix, row->elements, row->elements_count);\n    for (int i = 0; i < row->elements_count; i++) {\n        snprintf(temp, sizeof(temp), \"%selements[%u].\", prefix, i);\n        print_datasetvalue(temp, &row->elements[i]);\n    }\n    if (row->extensions != NULL) {\n        PP(\"%sextensions=[%p] (display not supported)\\n\", prefix, row->extensions);\n    }\n}\n\nvoid print_datasetvalue(const char *prefix, org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue *dsvalue) {\n    switch (dsvalue->which_value) {\n    case org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag:\n        PP(\"%sint_value=%d\\n\", prefix, dsvalue->value.int_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_long_value_tag:\n        PP(\"%slong_value=%ld\\n\", prefix, dsvalue->value.long_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_float_value_tag:\n        PP(\"%sfloat_value=%f\\n\", prefix, dsvalue->value.float_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_double_value_tag:\n        PP(\"%sdouble_value=%f\\n\", prefix, dsvalue->value.double_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_boolean_value_tag:\n        PP(\"%sboolean_value=%u\\n\", prefix, 
dsvalue->value.boolean_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_string_value_tag:\n        PP(\"%sstring_value=%s [%p]\\n\", prefix, dsvalue->value.string_value, dsvalue->value.string_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_extension_value_tag:\n        PP(\"%sextension_value=[%p] (display not supported)\\n\", prefix, dsvalue->value.extension_value.extensions);\n        break;\n    default:\n        PP(\"%sinvalid which_value=%u\\n\", prefix, dsvalue->which_value);\n    }\n}\n\nvoid print_dataset(const char *prefix, org_eclipse_tahu_protobuf_Payload_DataSet *dataset_value) {\n    char temp[64];\n    if (prefix == NULL) {\n        prefix = EMPTY_PREFIX;\n    }\n    if (dataset_value->has_num_of_columns) {\n        PP(\"%snum_of_columns=%lu\\n\", prefix, dataset_value->num_of_columns);\n    }\n    PP(\"%scolumns=[%p] (count=%u)\\n\", prefix, dataset_value->columns, dataset_value->columns_count);\n    for (int i = 0; i < dataset_value->columns_count; i++) {\n        PP(\"%scolumn[%u]=%s [%p]\\n\", prefix, i, dataset_value->columns[i], dataset_value->columns[i]);\n    }\n    PP(\"%stypes=[%p] (count=%u)\\n\", prefix, dataset_value->types, dataset_value->types_count);\n    for (int i = 0; i < dataset_value->types_count; i++) {\n        PP(\"%stype[%u]=%u\\n\", prefix, i, dataset_value->types[i]);\n    }\n    PP(\"%srows=[%p] (count=%u)\\n\", prefix, dataset_value->rows, dataset_value->rows_count);\n    for (int i = 0; i < dataset_value->rows_count; i++) {\n        snprintf(temp, sizeof(temp), \"%srow[%u].\", prefix, i);\n        print_dataset_row(temp, &dataset_value->rows[i]);\n    }\n    if (dataset_value->extensions != NULL) {\n        PP(\"%sextensions=[%p]\\n\", prefix, dataset_value->extensions);\n    }\n}\n\nvoid print_template_parameter(const char *prefix, org_eclipse_tahu_protobuf_Payload_Template_Parameter *template_parameter) {\n    if (prefix == NULL) {\n        
prefix = EMPTY_PREFIX;\n    }\n    // char *name;\n    if (template_parameter->name != NULL) {\n        PP(\"%sname=%s [%p]\\n\", prefix, template_parameter->name, template_parameter->name);\n    }\n    if (template_parameter->has_type) {\n        PP(\"%stype=%u\\n\", prefix, template_parameter->type);\n    }\n    // pb_size_t which_value;\n    switch (template_parameter->which_value) {\n    case org_eclipse_tahu_protobuf_Payload_Template_Parameter_int_value_tag:\n        PP(\"%sint_value=%d\\n\", prefix, template_parameter->value.int_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Template_Parameter_long_value_tag:\n        PP(\"%slong_value=%ld\\n\", prefix, template_parameter->value.long_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Template_Parameter_float_value_tag:\n        PP(\"%sfloat_value=%f\\n\", prefix, template_parameter->value.float_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Template_Parameter_double_value_tag:\n        PP(\"%sdouble_value=%f\\n\", prefix, template_parameter->value.double_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Template_Parameter_boolean_value_tag:\n        PP(\"%sboolean_value=%u\\n\", prefix, template_parameter->value.boolean_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Template_Parameter_string_value_tag:\n        PP(\"%sstring_value=%s [%p]\\n\", prefix, template_parameter->value.string_value, template_parameter->value.string_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Template_Parameter_extension_value_tag:\n        PP(\"%sextension_value=[%p] (display not supported)\\n\", prefix, template_parameter->value.extension_value.extensions);\n        break;\n    default:\n        PP(\"%sinvalid which_value=%u\\n\", prefix, template_parameter->which_value);\n    }\n}\n\nvoid print_template(const char *prefix, org_eclipse_tahu_protobuf_Payload_Template *template) {\n    char temp[64];\n    if 
(prefix == NULL) {\n        prefix = EMPTY_PREFIX;\n    }\n    if (template->version != NULL) {\n        PP(\"%sversion=%s [%p]\\n\", prefix, template->version, template->version);\n    }\n    PP(\"%smetrics=[%p] (count=%u)\\n\", prefix, template->metrics, template->metrics_count);\n    for (int i = 0; i < template->metrics_count; i++) {\n        snprintf(temp, sizeof(temp), \"%smetric[%u].\", prefix, i);\n        print_metric(temp, &template->metrics[i]);\n    }\n    PP(\"%sparameters=[%p] (count=%u)\\n\", prefix, template->parameters, template->parameters_count);\n    for (int i = 0; i < template->parameters_count; i++) {\n        snprintf(temp, sizeof(temp), \"%sparameter[%u].\", prefix, i);\n        print_template_parameter(temp, &template->parameters[i]);\n    }\n    if (template->template_ref != NULL) {\n        PP(\"%stemplate_ref=%s [%p]\\n\", prefix, template->template_ref, template->template_ref);\n    }\n    if (template->has_is_definition) {\n        PP(\"%sis_definition=%u\\n\", prefix, template->is_definition);\n    }\n    if (template->extensions != NULL) {\n        PP(\"%sextensions=[%p] (display not supported)\\n\", prefix, template->extensions);\n    }\n}\n\nvoid print_metric(const char *prefix, org_eclipse_tahu_protobuf_Payload_Metric *metric) {\n    char temp[64];\n    if (prefix == NULL) {\n        prefix = EMPTY_PREFIX;\n    }\n    if (metric->name != NULL) {\n        PP(\"%sname=%s [%p]\\n\", prefix, metric->name, metric->name);\n    }\n    if (metric->has_alias) {\n        PP(\"%salias=%ld\\n\", prefix, metric->alias);\n    }\n    if (metric->has_timestamp) {\n        PP(\"%stimestamp=%ld\\n\", prefix, metric->timestamp);\n    }\n    if (metric->has_datatype) {\n        PP(\"%sdatatype=%u\\n\", prefix, metric->datatype);\n    }\n    if (metric->has_is_historical) {\n        PP(\"%sis_historical=%u\\n\", prefix, metric->is_historical);\n    }\n    if (metric->has_is_transient) {\n        PP(\"%sis_transient=%u\\n\", prefix, 
metric->is_transient);\n    }\n    if (metric->has_is_null) {\n        PP(\"%sis_null=%u\\n\", prefix, metric->is_null);\n    }\n    if (metric->has_metadata) {\n        snprintf(temp, sizeof(temp), \"%smetadata.\", prefix);\n        print_metadata(temp, &metric->metadata);\n    }\n    if (metric->has_properties) {\n        snprintf(temp, sizeof(temp), \"%sproperties.\", prefix);\n        print_propertyset(temp, &metric->properties);\n    }\n    switch (metric->which_value) {\n    case org_eclipse_tahu_protobuf_Payload_Metric_int_value_tag:\n        PP(\"%sint_value=%d\\n\", prefix, metric->value.int_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Metric_long_value_tag:\n        PP(\"%slong_value=%ld\\n\", prefix, metric->value.long_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Metric_float_value_tag:\n        PP(\"%sfloat_value=%f\\n\", prefix, metric->value.float_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Metric_double_value_tag:\n        PP(\"%sdouble_value=%f\\n\", prefix, metric->value.double_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Metric_boolean_value_tag:\n        PP(\"%sboolean_value=%d\\n\", prefix, metric->value.boolean_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Metric_string_value_tag:\n        PP(\"%sstring_value=%s [%p]\\n\", prefix, metric->value.string_value, metric->value.string_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Metric_dataset_value_tag:\n        snprintf(temp, sizeof(temp), \"%sdataset.\", prefix);\n        print_dataset(temp, &metric->value.dataset_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Metric_bytes_value_tag:\n        PP(\"%sbytes_value=[%p] (display not supported)\\n\", prefix, metric->value.bytes_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Metric_template_value_tag:\n        snprintf(temp, sizeof(temp), \"%stemplate.\", prefix);\n        
print_template(temp, &metric->value.template_value);\n        break;\n    case org_eclipse_tahu_protobuf_Payload_Metric_extension_value_tag:\n        PP(\"%sextension_value=[%p] (display not supported)\\n\", prefix, metric->value.extension_value.extensions);\n        break;\n    default:\n        PP(\"%sinvalid which_value=%u\\n\", prefix, metric->which_value);\n        break;\n    }\n}\n\nvoid print_payload(org_eclipse_tahu_protobuf_Payload *payload) {\n    char temp[64];\n    PP(\"-----PAYLOAD BEGIN-----\\n\");\n    if (payload->has_timestamp) {\n        PP(\"timestamp=%ld\\n\", payload->timestamp);\n    }\n    if (payload->has_seq) {\n        PP(\"seq=%ld\\n\", payload->seq);\n    }\n    if (payload->uuid != NULL) {\n        PP(\"uuid=%s [%p]\\n\", payload->uuid, payload->uuid);\n    }\n    if (payload->body != NULL) {\n        PP(\"body=[%p] (display not supported)\\n\", payload->body);\n    }\n    if (payload->extensions != NULL) {\n        PP(\"extensions=[%p] (display not supported)\\n\", payload->extensions);\n    }\n    PP(\"metrics=[%p] (count=%u)\\n\", payload->metrics, payload->metrics_count);\n    for (int i = 0; i < payload->metrics_count; i++) {\n        snprintf(temp, sizeof(temp), \"metric[%u].\", i);\n        print_metric(temp, &payload->metrics[i]);\n    }\n    PP(\"-----PAYLOAD END-----\\n\");\n}\n"
  },
  {
    "path": "c/core/src/tahu.pb.c",
    "content": "/* Automatically generated nanopb constant definitions */\n/* Generated by nanopb-0.4.1 */\n\n#include \"tahu.pb.h\"\n#if PB_PROTO_HEADER_VERSION != 40\n#error Regenerate this file with the current version of nanopb generator.\n#endif\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload, org_eclipse_tahu_protobuf_Payload, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_Template, org_eclipse_tahu_protobuf_Payload_Template, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_Template_Parameter, org_eclipse_tahu_protobuf_Payload_Template_Parameter, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension, org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_DataSet, org_eclipse_tahu_protobuf_Payload_DataSet, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue, org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension, org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_DataSet_Row, org_eclipse_tahu_protobuf_Payload_DataSet_Row, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_PropertyValue, org_eclipse_tahu_protobuf_Payload_PropertyValue, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension, org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_PropertySet, org_eclipse_tahu_protobuf_Payload_PropertySet, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_PropertySetList, org_eclipse_tahu_protobuf_Payload_PropertySetList, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_MetaData, org_eclipse_tahu_protobuf_Payload_MetaData, AUTO)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_Metric, org_eclipse_tahu_protobuf_Payload_Metric, 
2)\n\n\nPB_BIND(org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension, org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension, AUTO)\n\n\n\n"
  },
  {
    "path": "c/core/tahu.options",
    "content": "org.eclipse.tahu.protobuf.Payload.body\t\t\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.metrics\t\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.uuid\t\t\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.DataSet.columns\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.DataSet.rows\t\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.DataSet.types\t\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.string_value\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.DataSet.Row.elements\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.MetaData.content_type\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.MetaData.description\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.MetaData.file_name\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.MetaData.file_type\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.MetaData.md5\t\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.Metric.bytes_value\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.Metric.name\t\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.Metric.string_value\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.PropertySet.keys\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.PropertySet.values\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.PropertySetList.propertyset\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.PropertyValue.string_value\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.Template.metrics\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.Template.template_ref\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.Template.version\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.Template.parameters\t\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.Template.Parameter.name\t\ttype:FT_POINTER\norg.eclipse.tahu.protobuf.Payload.Template.Parameter.string_value\ttype:FT_POINTER\n"
  },
  {
    "path": "c/core/test/.gitignore",
    "content": "test_dynamic.dSYM\ntest_static.dSYM\n"
  },
  {
    "path": "c/core/test/test.c",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2019 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\n#include <stdio.h>\n#include <stdlib.h>\n#include <stdbool.h>\n#include <string.h>\n#include <time.h>\n#include <math.h>\n#include <unistd.h>\n#include <tahu.h>\n#include <tahu.pb.h>\n#include <pb_decode.h>\n#include <pb_encode.h>\n#include <mosquitto.h>\n#include <inttypes.h>\n\n/* Local Functions */\nvoid publisher(struct mosquitto *mosq, char *topic, void *buf, unsigned len);\nvoid publish_births(struct mosquitto *mosq);\nvoid publish_node_birth(struct mosquitto *mosq);\nvoid publish_device_birth(struct mosquitto *mosq);\nvoid publish_ddata_message(struct mosquitto *mosq);\n\n/* Mosquitto Callbacks */\nvoid my_message_callback(struct mosquitto *mosq, void *userdata, const struct mosquitto_message *message);\nvoid my_connect_callback(struct mosquitto *mosq, void *userdata, int result);\nvoid my_subscribe_callback(struct mosquitto *mosq, void *userdata, int mid, int qos_count, const int *granted_qos);\nvoid my_log_callback(struct mosquitto *mosq, void *userdata, int level, const char *str);\n\nuint64_t ALIAS_NODE_CONTROL_NEXT_SERVER = 0;\nuint64_t ALIAS_NODE_CONTROL_REBIRTH     = 1;\nuint64_t ALIAS_NODE_CONTROL_REBOOT      = 2;\nuint64_t ALIAS_NODE_METRIC_0            = 3;\nuint64_t ALIAS_NODE_METRIC_1            = 4;\nuint64_t ALIAS_NODE_METRIC_UINT32       = 5;\nuint64_t ALIAS_NODE_METRIC_FLOAT        = 6;\nuint64_t ALIAS_NODE_METRIC_DOUBLE       = 7;\nuint64_t ALIAS_NODE_METRIC_DATASET      = 8;\nuint64_t ALIAS_NODE_METRIC_2            = 9;\nuint64_t 
ALIAS_DEVICE_METRIC_0          = 10;\nuint64_t ALIAS_DEVICE_METRIC_1          = 11;\nuint64_t ALIAS_DEVICE_METRIC_2          = 12;\nuint64_t ALIAS_DEVICE_METRIC_3          = 13;\nuint64_t ALIAS_DEVICE_METRIC_UDT_INST   = 14;\nuint64_t ALIAS_DEVICE_METRIC_INT8       = 15;\nuint64_t ALIAS_DEVICE_METRIC_UINT32     = 16;\nuint64_t ALIAS_DEVICE_METRIC_FLOAT      = 17;\nuint64_t ALIAS_DEVICE_METRIC_DOUBLE     = 18;\nuint64_t ALIAS_NODE_METRIC_I8       = 19;\nuint64_t ALIAS_NODE_METRIC_I16      = 20;\nuint64_t ALIAS_NODE_METRIC_I32      = 21;\nuint64_t ALIAS_NODE_METRIC_I64      = 22;\nuint64_t ALIAS_NODE_METRIC_UI8      = 23;\nuint64_t ALIAS_NODE_METRIC_UI16     = 24;\nuint64_t ALIAS_NODE_METRIC_UI32     = 25;\nuint64_t ALIAS_NODE_METRIC_UI64     = 26;\n\nint main(int argc, char *argv[]) {\n\n    // MQTT Parameters\n    char *host = \"broker.hivemq.com\";\n    int port = 1883;\n    int keepalive = 60;\n    bool clean_session = true;\n    struct mosquitto *mosq = NULL;\n\n    // MQTT Setup\n    srand(time(NULL));\n    mosquitto_lib_init();\n    mosq = mosquitto_new(NULL, clean_session, NULL);\n    if (!mosq) {\n        fprintf(stderr, \"Error: Out of memory.\\n\");\n        return 1;\n    }\n\n    fprintf(stdout, \"Setting up callbacks\\n\");\n    mosquitto_log_callback_set(mosq, my_log_callback);\n    mosquitto_connect_callback_set(mosq, my_connect_callback);\n    mosquitto_message_callback_set(mosq, my_message_callback);\n    mosquitto_subscribe_callback_set(mosq, my_subscribe_callback);\n\n    mosquitto_username_pw_set(mosq, \"admin\", \"changeme\");\n    mosquitto_will_set(mosq, \"spBv1.0/Sparkplug B Devices/NDEATH/C Edge Node 1\", 0, NULL, 0, false);\n\n    // Optional 'self-signed' SSL parameters for MQTT\n    //mosquitto_tls_insecure_set(mosq, true);\n    //mosquitto_tls_opts_set(mosq, 0, \"tlsv1.2\", NULL);               // 0 is DO NOT SSL_VERIFY_PEER\n\n    // Optional 'real' SSL parameters for MQTT\n    //mosquitto_tls_set(mosq, NULL, \"/etc/ssl/certs/\", NULL, 
NULL, NULL);   // Necessary if the CA or other certs need to be picked up elsewhere on the local filesystem\n    //mosquitto_tls_insecure_set(mosq, false);\n    //mosquitto_tls_opts_set(mosq, 1, \"tlsv1.2\", NULL);               // 1 is SSL_VERIFY_PEER\n\n    // MQTT Connect\n    fprintf(stdout, \"Starting connection...\\n\");\n    if (mosquitto_connect(mosq, host, port, keepalive)) {\n        fprintf(stderr, \"Unable to connect.\\n\");\n        return 1;\n    }\n\n    // Publish the NBIRTH and DBIRTH Sparkplug messages (Birth Certificates)\n    publish_births(mosq);\n\n    // Loop and publish more DDATA messages every 5 seconds.  Note this should only be done in real/production\n    // scenarios with change events on inputs.  Because Sparkplug ensures state, there is no reason to send DDATA\n    // messages unless the state of an I/O point has changed.\n    int i;\n    for (i = 0; i < 100; i++) {\n        publish_ddata_message(mosq);\n        int j;\n        for (j = 0; j < 50; j++) {\n            usleep(100000);\n            mosquitto_loop(mosq, -1, 1);\n        }\n    }\n\n    //mosquitto_loop_forever(mosq, -1, 1);\n\n\n    // Close and cleanup\n    mosquitto_destroy(mosq);\n    mosquitto_lib_cleanup();\n    return 0;\n}\n\n/*\n * Callback for incoming MQTT messages. 
Since this is a Sparkplug implementation, these will be NCMD and DCMD messages\n */\nvoid my_message_callback(struct mosquitto *mosq, void *userdata, const struct mosquitto_message *message) {\n\n    if (message->payloadlen) {\n        fprintf(stdout, \"%s :: %d\\n\", message->topic, message->payloadlen);\n    } else {\n        fprintf(stdout, \"%s (null)\\n\", message->topic);\n    }\n    fflush(stdout);\n\n    // Decode the payload - bail out early if decoding fails, since the metrics cannot be trusted\n    org_eclipse_tahu_protobuf_Payload inbound_payload = org_eclipse_tahu_protobuf_Payload_init_zero;\n    if (decode_payload(&inbound_payload, message->payload, message->payloadlen) < 0) {\n        fprintf(stderr, \"Failed to decode the payload\\n\");\n        free_payload(&inbound_payload);\n        return;\n    }\n\n    print_payload(&inbound_payload);\n\n    // Get the number of metrics in the payload and iterate over them handling them as needed\n    for (int i = 0; i < inbound_payload.metrics_count; i++) {\n        // Handle the incoming message as necessary - start with the 'Node Control' metrics\n        if (inbound_payload.metrics[i].alias == ALIAS_NODE_CONTROL_NEXT_SERVER) {\n            // 'Node Control/Next Server' is an NCMD used to tell the device/client application to\n            // disconnect from the current MQTT server and connect to the next MQTT server in the\n            // list of available servers.  This is used for clients that have a pool of MQTT servers\n            // to connect to.\n            fprintf(stderr, \"'Node Control/Next Server' is not implemented in this example\\n\");\n        } else if (inbound_payload.metrics[i].alias == ALIAS_NODE_CONTROL_REBIRTH) {\n            // 'Node Control/Rebirth' is an NCMD used to tell the device/client application to resend\n            // its full NBIRTH and DBIRTH again.  MQTT Engine will send this NCMD to a device/client\n            // application if it receives an NDATA or DDATA with a metric that was not published in the\n            // original NBIRTH or DBIRTH.  
This is why the application must send all known metrics in\n            // its original NBIRTH and DBIRTH messages.\n            publish_births(mosq);\n        } else if (inbound_payload.metrics[i].alias == ALIAS_NODE_CONTROL_REBOOT) {\n            // 'Node Control/Reboot' is an NCMD used to tell a device/client application to reboot\n            // This can be used for devices that need a full application reset via a soft reboot.\n            // In this case, we fake a full reboot with a republishing of the NBIRTH and DBIRTH\n            // messages.\n            publish_births(mosq);\n        } else if (inbound_payload.metrics[i].alias == ALIAS_DEVICE_METRIC_2) {\n            // This is a metric we declared in our DBIRTH message and we're emulating an output.\n            // So, on incoming 'writes' to the output we must publish a DDATA with the new output\n            // value.  If this were a real output we'd write to the output and then read it back\n            // before publishing a DDATA message.\n\n            // We know this is an Int16 because of how we declared it in the DBIRTH\n            uint32_t new_value = inbound_payload.metrics[i].value.int_value;\n            fprintf(stdout, \"CMD message for output/Device Metric2 - New Value: %u\\n\", new_value);\n\n            // Create the DDATA payload\n            org_eclipse_tahu_protobuf_Payload ddata_payload;\n            get_next_payload(&ddata_payload);\n            // Note the Metric name 'output/Device Metric2' is not needed because we're using aliases\n            add_simple_metric(&ddata_payload, NULL, true, ALIAS_DEVICE_METRIC_2, METRIC_DATA_TYPE_INT16, false, false, &new_value, sizeof(new_value));\n\n            // Encode the payload into a binary format so it can be published in the MQTT message.\n            // The binary_buffer must be large enough to hold the contents of the binary payload\n            size_t buffer_length = 128;\n            uint8_t *binary_buffer = (uint8_t 
*)malloc(buffer_length * sizeof(uint8_t));\n            size_t message_length = encode_payload(binary_buffer, buffer_length, &ddata_payload);\n\n            // Publish the DDATA on the appropriate topic\n            mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DDATA/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n            // Free the memory\n            free(binary_buffer);\n            free_payload(&ddata_payload);\n        } else if (inbound_payload.metrics[i].alias == ALIAS_DEVICE_METRIC_3) {\n            // This is a metric we declared in our DBIRTH message and we're emulating an output.\n            // So, on incoming 'writes' to the output we must publish a DDATA with the new output\n            // value.  If this were a real output we'd write to the output and then read it back\n            // before publishing a DDATA message.\n\n            // We know this is a Boolean because of how we declared it in the DBIRTH\n            bool new_value = inbound_payload.metrics[i].value.boolean_value;\n            fprintf(stdout, \"CMD message for output/Device Metric3 - New Value: %s\\n\", new_value ? 
\"true\" : \"false\");\n\n            // Create the DDATA payload\n            org_eclipse_tahu_protobuf_Payload ddata_payload;\n            get_next_payload(&ddata_payload);\n            // Note the Metric name 'output/Device Metric3' is not needed because we're using aliases\n            add_simple_metric(&ddata_payload, NULL, true, ALIAS_DEVICE_METRIC_3, METRIC_DATA_TYPE_BOOLEAN, false, false, &new_value, sizeof(new_value));\n\n            // Encode the payload into a binary format so it can be published in the MQTT message.\n            // The binary_buffer must be large enough to hold the contents of the binary payload\n            size_t buffer_length = 128;\n            uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n            size_t message_length = encode_payload(binary_buffer, buffer_length, &ddata_payload);\n\n            // Publish the DDATA on the appropriate topic\n            mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DDATA/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n            // Free the memory\n            free(binary_buffer);\n            free_payload(&ddata_payload);\n        } else if (inbound_payload.metrics[i].alias == ALIAS_DEVICE_METRIC_FLOAT) {\n            // This is a metric we declared in our DBIRTH message and we're emulating an output.\n            // So, on incoming 'writes' to the output we must publish a DDATA with the new output\n            // value.  
If this were a real output we'd write to the output and then read it back\n            // before publishing a DDATA message.\n\n            // We know this is a float because of how we declared it in the DBIRTH\n            float new_value = inbound_payload.metrics[i].value.float_value;\n            fprintf(stdout, \"CMD message for Device Metric FLOAT - New Value: %f\\n\", new_value);\n\n            // Create the DDATA payload\n            org_eclipse_tahu_protobuf_Payload ddata_payload;\n            get_next_payload(&ddata_payload);\n            // Note the Metric name 'output/Device Metric FLOAT' is not needed because we're using aliases\n            add_simple_metric(&ddata_payload, NULL, true, ALIAS_DEVICE_METRIC_FLOAT, METRIC_DATA_TYPE_FLOAT, false, false, &new_value, sizeof(new_value));\n\n            // Encode the payload into a binary format so it can be published in the MQTT message.\n            // The binary_buffer must be large enough to hold the contents of the binary payload\n            size_t buffer_length = 128;\n            uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n            size_t message_length = encode_payload(binary_buffer, buffer_length, &ddata_payload);\n\n            // Publish the DDATA on the appropriate topic\n            mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DDATA/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n            // Free the memory\n            free(binary_buffer);\n            free_payload(&ddata_payload);\n        } else if (inbound_payload.metrics[i].alias == ALIAS_DEVICE_METRIC_DOUBLE) {\n            // This is a metric we declared in our DBIRTH message and we're emulating an output.\n            // So, on incoming 'writes' to the output we must publish a DDATA with the new output\n            // value.  
If this were a real output we'd write to the output and then read it back\n            // before publishing a DDATA message.\n\n            // We know this is a double because of how we declared it in the DBIRTH\n            double new_value = inbound_payload.metrics[i].value.double_value;\n            fprintf(stdout, \"CMD message for Device Metric DOUBLE - New Value: %f\\n\", new_value);\n\n            // Create the DDATA payload\n            org_eclipse_tahu_protobuf_Payload ddata_payload;\n            get_next_payload(&ddata_payload);\n            // Note the Metric name 'output/Device Metric DOUBLE' is not needed because we're using aliases\n            add_simple_metric(&ddata_payload, NULL, true, ALIAS_DEVICE_METRIC_DOUBLE, METRIC_DATA_TYPE_DOUBLE, false, false, &new_value, sizeof(new_value));\n\n            // Encode the payload into a binary format so it can be published in the MQTT message.\n            // The binary_buffer must be large enough to hold the contents of the binary payload\n            size_t buffer_length = 128;\n            uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n            size_t message_length = encode_payload(binary_buffer, buffer_length, &ddata_payload);\n\n            // Publish the DDATA on the appropriate topic\n            mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DDATA/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n            // Free the memory\n            free(binary_buffer);\n            free_payload(&ddata_payload);\n        } else {\n            fprintf(stderr, \"Unknown CMD: %s\\n\", inbound_payload.metrics[i].name);\n        }\n    }\n    free_payload(&inbound_payload);\n}\n\n/*\n * Callback for successful or unsuccessful MQTT connect.  
Upon successful connect, subscribe to our Sparkplug NCMD and DCMD messages.\n * A production application should handle MQTT connect failures and reattempt as necessary.\n */\nvoid my_connect_callback(struct mosquitto *mosq, void *userdata, int result) {\n    if (!result) {\n        // Subscribe to commands\n        fprintf(stdout, \"Subscribing on CMD topics\\n\");\n        mosquitto_subscribe(mosq, NULL, \"spBv1.0/Sparkplug B Devices/NCMD/C Edge Node 1/#\", 0);\n        mosquitto_subscribe(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DCMD/C Edge Node 1/#\", 0);\n    } else {\n        fprintf(stderr, \"MQTT Connect failed\\n\");\n    }\n}\n\n/*\n * Callback for successful MQTT subscriptions.\n */\nvoid my_subscribe_callback(struct mosquitto *mosq, void *userdata, int mid, int qos_count, const int *granted_qos) {\n    int i;\n\n    fprintf(stdout, \"Subscribed (mid: %d): %d\", mid, granted_qos[0]);\n    for (i = 1; i < qos_count; i++) {\n        fprintf(stdout, \", %d\", granted_qos[i]);\n    }\n    fprintf(stdout, \"\\n\");\n}\n\n/*\n * MQTT logger callback\n */\nvoid my_log_callback(struct mosquitto *mosq, void *userdata, int level, const char *str) {\n    // Print all log messages regardless of level.\n    fprintf(stdout, \"%s\\n\", str);\n}\n\n/*\n * Helper function to publish MQTT messages to the MQTT server\n */\nvoid publisher(struct mosquitto *mosq, char *topic, void *buf, unsigned len) {\n    // publish the data\n    mosquitto_publish(mosq, NULL, topic, len, buf, 0, false);\n}\n\n/*\n * Helper to publish the Sparkplug NBIRTH and DBIRTH messages after initial MQTT connect.\n * This is also used for Rebirth requests from the backend.\n */\nvoid publish_births(struct mosquitto *mosq) {\n    // Initialize the sequence number for Sparkplug MQTT messages\n    // This must be zero on every NBIRTH publish\n    reset_sparkplug_sequence();\n\n    // Publish the NBIRTH\n    publish_node_birth(mosq);\n\n    // Publish the DBIRTH\n    publish_device_birth(mosq);\n}\n\n/*\n 
* Helper function to publish an NBIRTH message.  The NBIRTH should include all 'node control' metrics that denote device capability.\n * In addition, it should include every node metric that may ever be published from this edge node.  If any NDATA messages arrive at\n * MQTT Engine that were not included in the NBIRTH, MQTT Engine will request a Rebirth from the device.\n */\nvoid publish_node_birth(struct mosquitto *mosq) {\n    // Create the NBIRTH payload\n    org_eclipse_tahu_protobuf_Payload nbirth_payload;\n    get_next_payload(&nbirth_payload);\n    nbirth_payload.uuid = strdup(\"MyUUID\");\n\n    // Add node control metrics\n    fprintf(stdout, \"Adding metric: 'Node Control/Next Server'\\n\");\n    bool next_server_value = false;\n    add_simple_metric(&nbirth_payload, \"Node Control/Next Server\", true, ALIAS_NODE_CONTROL_NEXT_SERVER, METRIC_DATA_TYPE_BOOLEAN, false, false, &next_server_value, sizeof(next_server_value));\n    fprintf(stdout, \"Adding metric: 'Node Control/Rebirth'\\n\");\n    bool rebirth_value = false;\n    add_simple_metric(&nbirth_payload, \"Node Control/Rebirth\", true, ALIAS_NODE_CONTROL_REBIRTH, METRIC_DATA_TYPE_BOOLEAN, false, false, &rebirth_value, sizeof(rebirth_value));\n    fprintf(stdout, \"Adding metric: 'Node Control/Reboot'\\n\");\n    bool reboot_value = false;\n    add_simple_metric(&nbirth_payload, \"Node Control/Reboot\", true, ALIAS_NODE_CONTROL_REBOOT, METRIC_DATA_TYPE_BOOLEAN, false, false, &reboot_value, sizeof(reboot_value));\n\n    // Add some regular node metrics - use a char array (not a pointer) so sizeof gives the string length\n    fprintf(stdout, \"Adding metric: 'Node Metric0'\\n\");\n    char nbirth_metric_zero_value[] = \"hello node\";\n    add_simple_metric(&nbirth_payload, \"Node Metric0\", true, ALIAS_NODE_METRIC_0, METRIC_DATA_TYPE_STRING, false, false, nbirth_metric_zero_value, sizeof(nbirth_metric_zero_value));\n    fprintf(stdout, \"Adding metric: 'Node Metric1'\\n\");\n    bool nbirth_metric_one_value = true;\n    add_simple_metric(&nbirth_payload, \"Node 
Metric1\", true, ALIAS_NODE_METRIC_1, METRIC_DATA_TYPE_BOOLEAN, false, false, &nbirth_metric_one_value, sizeof(nbirth_metric_one_value));\n    fprintf(stdout, \"Adding metric: 'Node Metric UINT32'\\n\");\n    uint32_t nbirth_metric_uint32_value = 100;\n    add_simple_metric(&nbirth_payload, \"Node Metric UINT32\", true, ALIAS_NODE_METRIC_UINT32, METRIC_DATA_TYPE_UINT32, false, false, &nbirth_metric_uint32_value, sizeof(nbirth_metric_uint32_value));\n    fprintf(stdout, \"Adding metric: 'Node Metric FLOAT'\\n\");\n    float nbirth_metric_float_value = 100.12;\n    add_simple_metric(&nbirth_payload, \"Node Metric FLOAT\", true, ALIAS_NODE_METRIC_FLOAT, METRIC_DATA_TYPE_FLOAT, false, false, &nbirth_metric_float_value, sizeof(nbirth_metric_float_value));\n    double nbirth_metric_double_value = 1000.123456789;\n    add_simple_metric(&nbirth_payload, \"Node Metric DOUBLE\", true, ALIAS_NODE_METRIC_DOUBLE, METRIC_DATA_TYPE_DOUBLE, false, false, &nbirth_metric_double_value, sizeof(nbirth_metric_double_value));\n\n    // All INT Types\n    fprintf(stdout, \"Adding metric: 'Node Metric I8'\\n\");\n    int8_t nbirth_metric_i8_value = 100;\n    add_simple_metric(&nbirth_payload, \"Node Metric I8\", true, ALIAS_NODE_METRIC_I8, METRIC_DATA_TYPE_INT8, false, false, &nbirth_metric_i8_value, sizeof(nbirth_metric_i8_value));\n    fprintf(stdout, \"Adding metric: 'Node Metric I16'\\n\");\n    int16_t nbirth_metric_i16_value = 100;\n    add_simple_metric(&nbirth_payload, \"Node Metric I16\", true, ALIAS_NODE_METRIC_I16, METRIC_DATA_TYPE_INT16, false, false, &nbirth_metric_i16_value, sizeof(nbirth_metric_i16_value));\n    fprintf(stdout, \"Adding metric: 'Node Metric I32'\\n\");\n    int32_t nbirth_metric_i32_value = 100;\n    add_simple_metric(&nbirth_payload, \"Node Metric I32\", true, ALIAS_NODE_METRIC_I32, METRIC_DATA_TYPE_INT32, false, false, &nbirth_metric_i32_value, sizeof(nbirth_metric_i32_value));\n    fprintf(stdout, \"Adding metric: 'Node Metric I64'\\n\");\n    int64_t 
nbirth_metric_i64_value = 100;\n    add_simple_metric(&nbirth_payload, \"Node Metric I64\", true, ALIAS_NODE_METRIC_I64, METRIC_DATA_TYPE_INT64, false, false, &nbirth_metric_i64_value, sizeof(nbirth_metric_i64_value));\n\n    // All UINT Types\n    fprintf(stdout, \"Adding metric: 'Node Metric UI8'\\n\");\n    uint8_t nbirth_metric_ui8_value = 200;\n    add_simple_metric(&nbirth_payload, \"Node Metric UI8\", true, ALIAS_NODE_METRIC_UI8, METRIC_DATA_TYPE_UINT8, false, false, &nbirth_metric_ui8_value, sizeof(nbirth_metric_ui8_value));\n    fprintf(stdout, \"Adding metric: 'Node Metric UI16'\\n\");\n    uint16_t nbirth_metric_ui16_value = 200;\n    add_simple_metric(&nbirth_payload, \"Node Metric UI16\", true, ALIAS_NODE_METRIC_UI16, METRIC_DATA_TYPE_UINT16, false, false, &nbirth_metric_ui16_value, sizeof(nbirth_metric_ui16_value));\n    fprintf(stdout, \"Adding metric: 'Node Metric UI32'\\n\");\n    uint32_t nbirth_metric_ui32_value = 200;\n    add_simple_metric(&nbirth_payload, \"Node Metric UI32\", true, ALIAS_NODE_METRIC_UI32, METRIC_DATA_TYPE_UINT32, false, false, &nbirth_metric_ui32_value, sizeof(nbirth_metric_ui32_value));\n    fprintf(stdout, \"Adding metric: 'Node Metric UI64'\\n\");\n    uint64_t nbirth_metric_ui64_value = 200;\n    add_simple_metric(&nbirth_payload, \"Node Metric UI64\", true, ALIAS_NODE_METRIC_UI64, METRIC_DATA_TYPE_UINT64, false, false, &nbirth_metric_ui64_value, sizeof(nbirth_metric_ui64_value));\n\n    // Create a DataSet\n    org_eclipse_tahu_protobuf_Payload_DataSet dataset = org_eclipse_tahu_protobuf_Payload_DataSet_init_default;\n    uint32_t datatypes[] = { DATA_SET_DATA_TYPE_INT8, DATA_SET_DATA_TYPE_INT16, DATA_SET_DATA_TYPE_INT32 };\n    const char *column_keys[] = { \"Int8s\", \"Int16s\", \"Int32s\" };\n    org_eclipse_tahu_protobuf_Payload_DataSet_Row *row_data = (org_eclipse_tahu_protobuf_Payload_DataSet_Row *)\n        calloc(2, sizeof(org_eclipse_tahu_protobuf_Payload_DataSet_Row));\n    row_data[0].elements_count = 3;\n    
row_data[0].elements = (org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue *)\n        calloc(3, sizeof(org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue));\n    row_data[0].elements[0].which_value = org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag;\n    row_data[0].elements[0].value.int_value = 0;\n    row_data[0].elements[1].which_value = org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag;\n    row_data[0].elements[1].value.int_value = 1;\n    row_data[0].elements[2].which_value = org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag;\n    row_data[0].elements[2].value.int_value = 2;\n    row_data[1].elements_count = 3;\n    row_data[1].elements = (org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue *)\n        calloc(3, sizeof(org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue));\n    row_data[1].elements[0].which_value = org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag;\n    row_data[1].elements[0].value.int_value = 3;\n    row_data[1].elements[1].which_value = org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag;\n    row_data[1].elements[1].value.int_value = 4;\n    row_data[1].elements[2].which_value = org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag;\n    row_data[1].elements[2].value.int_value = 5;\n    init_dataset(&dataset, 2, 3, datatypes, column_keys, row_data);\n    free(row_data);\n\n    // Create a Metric with the DataSet value and add it to the payload\n    fprintf(stdout, \"Adding metric: 'DataSet'\\n\");\n    org_eclipse_tahu_protobuf_Payload_Metric dataset_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    init_metric(&dataset_metric, \"DataSet\", true, ALIAS_NODE_METRIC_DATASET, METRIC_DATA_TYPE_DATASET, false, false, &dataset, sizeof(dataset));\n    add_metric_to_payload(&nbirth_payload, &dataset_metric);\n\n    // Add a metric with a custom property\n    fprintf(stdout, \"Adding metric: 'Node 
Metric2'\\n\");\n    org_eclipse_tahu_protobuf_Payload_Metric prop_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t nbirth_metric_two_value = 13;\n    init_metric(&prop_metric, \"Node Metric2\", true, ALIAS_NODE_METRIC_2, METRIC_DATA_TYPE_INT16, false, false, &nbirth_metric_two_value, sizeof(nbirth_metric_two_value));\n    org_eclipse_tahu_protobuf_Payload_PropertySet properties = org_eclipse_tahu_protobuf_Payload_PropertySet_init_default;\n    add_property_to_set(&properties, \"engUnit\", PROPERTY_DATA_TYPE_STRING, \"MyCustomUnits\", sizeof(\"MyCustomUnits\"));\n    add_propertyset_to_metric(&prop_metric, &properties);\n    add_metric_to_payload(&nbirth_payload, &prop_metric);\n\n    // Create a metric called RPMs which is a member of the UDT definition - note aliases do not apply to UDT members\n    org_eclipse_tahu_protobuf_Payload_Metric rpms_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t rpms_value = 0;\n    init_metric(&rpms_metric, \"RPMs\", false, 0, METRIC_DATA_TYPE_INT32, false, false, &rpms_value, sizeof(rpms_value));\n\n    // Create a metric called AMPs which is a member of the UDT definition - note aliases do not apply to UDT members\n    org_eclipse_tahu_protobuf_Payload_Metric amps_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t amps_value = 0;\n    init_metric(&amps_metric, \"AMPs\", false, 0, METRIC_DATA_TYPE_INT32, false, false, &amps_value, sizeof(amps_value));\n\n    // Create a Template/UDT Parameter - this is purely an example of including parameters and is not actually used by UDT instances\n    org_eclipse_tahu_protobuf_Payload_Template_Parameter parameter = org_eclipse_tahu_protobuf_Payload_Template_Parameter_init_default;\n    parameter.name = strdup(\"Index\");\n    parameter.has_type = true;\n    parameter.type = PARAMETER_DATA_TYPE_STRING;\n    parameter.which_value = org_eclipse_tahu_protobuf_Payload_Template_Parameter_string_value_tag;\n    
parameter.value.string_value = strdup(\"0\");\n\n    // Create the UDT definition value which includes the UDT members and parameters\n    org_eclipse_tahu_protobuf_Payload_Template udt_template = org_eclipse_tahu_protobuf_Payload_Template_init_default;\n    udt_template.metrics_count = 2;\n    udt_template.metrics = (org_eclipse_tahu_protobuf_Payload_Metric *)calloc(2, sizeof(org_eclipse_tahu_protobuf_Payload_Metric));\n    udt_template.metrics[0] = rpms_metric;\n    udt_template.metrics[1] = amps_metric;\n    udt_template.parameters_count = 1;\n    udt_template.parameters = (org_eclipse_tahu_protobuf_Payload_Template_Parameter *)calloc(1, sizeof(org_eclipse_tahu_protobuf_Payload_Template_Parameter));\n    udt_template.parameters[0] = parameter;\n    udt_template.template_ref = NULL;\n    udt_template.has_is_definition = true;\n    udt_template.is_definition = true;\n\n    // Create the root UDT definition and add the UDT definition value which includes the UDT members and parameters\n    org_eclipse_tahu_protobuf_Payload_Metric metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    init_metric(&metric, \"_types_/Custom_Motor\", false, 0, METRIC_DATA_TYPE_TEMPLATE, false, false, &udt_template, sizeof(udt_template));\n\n    // Add the UDT to the payload\n    add_metric_to_payload(&nbirth_payload, &metric);\n\n#ifdef SPARKPLUG_DEBUG\n    // Print the payload for debug\n    print_payload(&nbirth_payload);\n#endif\n\n    // Encode the payload into a binary format so it can be published in the MQTT message.\n    // The binary_buffer must be large enough to hold the contents of the binary payload\n    size_t buffer_length = 1024;\n    uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n    size_t message_length = encode_payload(binary_buffer, buffer_length, &nbirth_payload);\n\n    // Publish the NBIRTH on the appropriate topic\n    mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/NBIRTH/C Edge Node 1\", message_length, 
binary_buffer, 0, false);\n\n    // Free the memory\n    free(binary_buffer);\n    free_payload(&nbirth_payload);\n}\n\nvoid publish_device_birth(struct mosquitto *mosq) {\n    // Create the DBIRTH payload\n    org_eclipse_tahu_protobuf_Payload dbirth_payload;\n    get_next_payload(&dbirth_payload);\n\n    // Add some device metrics\n    fprintf(stdout, \"Adding metric: 'input/Device Metric0'\\n\");\n    char dbirth_metric_zero_value[] = \"hello device\";\n    add_simple_metric(&dbirth_payload, \"input/Device Metric0\", true, ALIAS_DEVICE_METRIC_0, METRIC_DATA_TYPE_STRING, false, false, &dbirth_metric_zero_value, sizeof(dbirth_metric_zero_value));\n    fprintf(stdout, \"Adding metric: 'input/Device Metric1'\\n\");\n    bool dbirth_metric_one_value = true;\n    add_simple_metric(&dbirth_payload, \"input/Device Metric1\", true, ALIAS_DEVICE_METRIC_1, METRIC_DATA_TYPE_BOOLEAN, false, false, &dbirth_metric_one_value, sizeof(dbirth_metric_one_value));\n    fprintf(stdout, \"Adding metric: 'output/Device Metric2'\\n\");\n    uint32_t dbirth_metric_two_value = 16;\n    add_simple_metric(&dbirth_payload, \"output/Device Metric2\", true, ALIAS_DEVICE_METRIC_2, METRIC_DATA_TYPE_INT16, false, false, &dbirth_metric_two_value, sizeof(dbirth_metric_two_value));\n    fprintf(stdout, \"Adding metric: 'output/Device Metric3'\\n\");\n    bool dbirth_metric_three_value = true;\n    add_simple_metric(&dbirth_payload, \"output/Device Metric3\", true, ALIAS_DEVICE_METRIC_3, METRIC_DATA_TYPE_BOOLEAN, false, false, &dbirth_metric_three_value, sizeof(dbirth_metric_three_value));\n    fprintf(stdout, \"Adding metric: 'Device Metric INT8'\\n\");\n    int8_t dbirth_metric_int8_value = 100;\n    add_simple_metric(&dbirth_payload, \"Device Metric INT8\", true, ALIAS_DEVICE_METRIC_INT8, METRIC_DATA_TYPE_INT8, false, false, &dbirth_metric_int8_value, sizeof(dbirth_metric_int8_value));\n    fprintf(stdout, \"Adding metric: 'Device Metric UINT32'\\n\");\n    uint32_t dbirth_metric_uint32_value = 100;\n    
add_simple_metric(&dbirth_payload, \"Device Metric UINT32\", true, ALIAS_DEVICE_METRIC_UINT32, METRIC_DATA_TYPE_UINT32, false, false, &dbirth_metric_uint32_value, sizeof(dbirth_metric_uint32_value));\n    fprintf(stdout, \"Adding metric: 'Device Metric FLOAT'\\n\");\n    float dbirth_metric_float_value = 100.12;\n    add_simple_metric(&dbirth_payload, \"Device Metric FLOAT\", true, ALIAS_DEVICE_METRIC_FLOAT, METRIC_DATA_TYPE_FLOAT, false, false, &dbirth_metric_float_value, sizeof(dbirth_metric_float_value));\n    double dbirth_metric_double_value = 1000.123;\n    add_simple_metric(&dbirth_payload, \"Device Metric DOUBLE\", true, ALIAS_DEVICE_METRIC_DOUBLE, METRIC_DATA_TYPE_DOUBLE, false, false, &dbirth_metric_double_value, sizeof(dbirth_metric_double_value));\n\n    // Create a metric called RPMs for the UDT instance\n    org_eclipse_tahu_protobuf_Payload_Metric rpms_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t rpms_value = 123;\n    init_metric(&rpms_metric, \"RPMs\", false, 0, METRIC_DATA_TYPE_INT32, false, false, &rpms_value, sizeof(rpms_value));\n\n    // Create a metric called AMPs for the UDT instance and create a custom property (milliamps) for it\n    org_eclipse_tahu_protobuf_Payload_Metric amps_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t amps_value = 456;\n    init_metric(&amps_metric, \"AMPs\", false, 0, METRIC_DATA_TYPE_INT32, false, false, &amps_value, sizeof(amps_value));\n    org_eclipse_tahu_protobuf_Payload_PropertySet properties = org_eclipse_tahu_protobuf_Payload_PropertySet_init_default;\n    add_property_to_set(&properties, \"engUnit\", PROPERTY_DATA_TYPE_STRING, \"milliamps\", sizeof(\"milliamps\"));\n    add_propertyset_to_metric(&amps_metric, &properties);\n\n    // Create a Template/UDT instance Parameter - this is purely an example of including parameters and is not actually used by UDT instances\n    org_eclipse_tahu_protobuf_Payload_Template_Parameter parameter = 
org_eclipse_tahu_protobuf_Payload_Template_Parameter_init_default;\n    parameter.name = strdup(\"Index\");\n    parameter.has_type = true;\n    parameter.type = PARAMETER_DATA_TYPE_STRING;\n    parameter.which_value = org_eclipse_tahu_protobuf_Payload_Template_Parameter_string_value_tag;\n    parameter.value.string_value = strdup(\"1\");\n\n    // Create the UDT instance value which includes the UDT members and parameters\n    org_eclipse_tahu_protobuf_Payload_Template udt_template = org_eclipse_tahu_protobuf_Payload_Template_init_default;\n    udt_template.version = NULL;\n    udt_template.metrics_count = 2;\n    udt_template.metrics = (org_eclipse_tahu_protobuf_Payload_Metric *)calloc(2, sizeof(org_eclipse_tahu_protobuf_Payload_Metric));\n    udt_template.metrics[0] = rpms_metric;\n    udt_template.metrics[1] = amps_metric;\n    udt_template.parameters_count = 1;\n    udt_template.parameters = (org_eclipse_tahu_protobuf_Payload_Template_Parameter *)calloc(1, sizeof(org_eclipse_tahu_protobuf_Payload_Template_Parameter));\n    udt_template.parameters[0] = parameter;\n    udt_template.template_ref = strdup(\"Custom_Motor\");\n    udt_template.has_is_definition = true;\n    udt_template.is_definition = false;\n\n    // Create the root UDT instance and add the UDT instance value\n    org_eclipse_tahu_protobuf_Payload_Metric metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    init_metric(&metric, \"My_Custom_Motor\", true, ALIAS_DEVICE_METRIC_UDT_INST, METRIC_DATA_TYPE_TEMPLATE, false, false, &udt_template, sizeof(udt_template));\n\n    // Add the UDT Instance to the payload\n    add_metric_to_payload(&dbirth_payload, &metric);\n\n#ifdef SPARKPLUG_DEBUG\n    // Print the payload\n    print_payload(&dbirth_payload);\n#endif\n\n    // Encode the payload into a binary format so it can be published in the MQTT message.\n    // The binary_buffer must be large enough to hold the contents of the binary payload\n    size_t buffer_length = 1024;\n    uint8_t 
*binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n    size_t message_length = encode_payload(binary_buffer, buffer_length, &dbirth_payload);\n\n    // Publish the DBIRTH on the appropriate topic\n    mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DBIRTH/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n    // Free the memory\n    free(binary_buffer);\n    free_payload(&dbirth_payload);\n}\n\nvoid publish_ddata_message(struct mosquitto *mosq) {\n    // Create the DDATA payload\n    org_eclipse_tahu_protobuf_Payload ddata_payload;\n    get_next_payload(&ddata_payload);\n\n    // Add some device metrics to denote changed values on inputs\n    fprintf(stdout, \"Adding metric: 'input/Device Metric0'\\n\");\n    char ddata_metric_zero_value[15];\n    snprintf(ddata_metric_zero_value, sizeof(ddata_metric_zero_value), \"%04X-%04X-%04X\", (rand() % 0x10000), (rand() % 0x10000), (rand() % 0x10000));\n    // Note the Metric name 'input/Device Metric0' is not needed because we're using aliases\n    add_simple_metric(&ddata_payload, NULL, true, ALIAS_DEVICE_METRIC_0, METRIC_DATA_TYPE_STRING, false, false, ddata_metric_zero_value, sizeof(ddata_metric_zero_value));\n    fprintf(stdout, \"Adding metric: 'input/Device Metric1'\\n\");\n    bool ddata_metric_one_value = rand() % 2;\n    // Note the Metric name 'input/Device Metric1' is not needed because we're using aliases\n    add_simple_metric(&ddata_payload, NULL, true, ALIAS_DEVICE_METRIC_1, METRIC_DATA_TYPE_BOOLEAN, false, false, &ddata_metric_one_value, sizeof(ddata_metric_one_value));\n\n    fprintf(stdout, \"Adding metric: 'Device Metric INT8'\\n\");\n    int ddata_metric_int8_value = rand() % 100;\n    add_simple_metric(&ddata_payload, NULL, true, ALIAS_DEVICE_METRIC_INT8, METRIC_DATA_TYPE_INT8, false, false, &ddata_metric_int8_value, sizeof(ddata_metric_int8_value));\n\n    fprintf(stdout, \"Adding metric: 'Device Metric UINT32'\\n\");\n    int 
ddata_metric_uint32_value = rand() % 1000;\n    add_simple_metric(&ddata_payload, NULL, true, ALIAS_DEVICE_METRIC_UINT32, METRIC_DATA_TYPE_UINT32, false, false, &ddata_metric_uint32_value, sizeof(ddata_metric_uint32_value));\n\n    fprintf(stdout, \"Adding metric: 'Device Metric FLOAT'\\n\");\n    float ddata_metric_float_value = ((float)rand() / (float)(RAND_MAX)) * 5.0;\n    add_simple_metric(&ddata_payload, NULL, true, ALIAS_DEVICE_METRIC_FLOAT, METRIC_DATA_TYPE_FLOAT, false, false, &ddata_metric_float_value, sizeof(ddata_metric_float_value));\n\n#ifdef SPARKPLUG_DEBUG\n    // Print the payload\n    print_payload(&ddata_payload);\n#endif\n\n    // Encode the payload into a binary format so it can be published in the MQTT message.\n    // The binary_buffer must be large enough to hold the contents of the binary payload\n    size_t buffer_length = 1024;\n    uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n    size_t message_length = encode_payload(binary_buffer, buffer_length, &ddata_payload);\n\n    // Publish the DDATA on the appropriate topic\n    mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DDATA/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n    // Free the memory\n    free(binary_buffer);\n    free_payload(&ddata_payload);\n}\n"
  },
  {
    "path": "c/core/test.sh",
    "content": "#/********************************************************************************\n# * Copyright (c) 2014-2019 Cirrus Link Solutions and others\n# *\n# * This program and the accompanying materials are made available under the\n# * terms of the Eclipse Public License 2.0 which is available at\n# * http://www.eclipse.org/legal/epl-2.0.\n# *\n# * SPDX-License-Identifier: EPL-2.0\n# *\n# * Contributors:\n# *   Cirrus Link Solutions - initial implementation\n# ********************************************************************************/\n\n#!/bin/sh\n\n#echo \"Running static example...\"\n#./test/test_static\n\necho \"\"\necho \"Running dynamic example...\"\n#echo \"Starting LD_LIBRARY_PATH:  ${LD_LIBRARY_PATH}\"\nPWD=`pwd`\nexport LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:${PWD}/lib\n#echo \"New LD_LIBRARY_PATH:       ${LD_LIBRARY_PATH}\"\n./test/test_dynamic\n"
  },
  {
    "path": "c/examples/template_as_custom_props/Makefile",
    "content": "#/********************************************************************************\n# * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n# *\n# * This program and the accompanying materials are made available under the\n# * terms of the Eclipse Public License 2.0 which is available at\n# * http://www.eclipse.org/legal/epl-2.0.\n# *\n# * SPDX-License-Identifier: EPL-2.0\n# *\n# * Contributors:\n# *   Cirrus Link Solutions - initial implementation\n# ********************************************************************************/\n\nTARGET = example\nLIBS = ../../../../client_libraries/c/lib/libtahu.a -Llib -L/usr/local/lib -lmosquitto\nCC = gcc\nCFLAGS = -g -Wall -I../../../../client_libraries/c/include/\n\n.PHONY: default all clean\n\ndefault: $(TARGET)\nall: default\n\nOBJECTS = $(patsubst %.c, %.o, $(wildcard *.c))\nHEADERS = $(wildcard *.h)\n\n%.o: %.c $(HEADERS)\n\t$(CC) $(CFLAGS) -c $< -o $@\n\n.PRECIOUS: $(TARGET) $(OBJECTS)\n\n$(TARGET): $(OBJECTS)\n\t$(CC) $(OBJECTS) -Wall $(LIBS) -o $@\n\nclean:\n\t-rm -f *.o\n\t-rm -f $(TARGET)\n"
  },
  {
    "path": "c/examples/template_as_custom_props/example.c",
    "content": "/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n#include <stdio.h>\n#include <stdlib.h>\n#include <stdbool.h>\n#include <unistd.h>\n#include <tahu.h>\n#include <tahu.pb.h>\n#include <pb_decode.h>\n#include <pb_encode.h>\n#include <mosquitto.h>\n#include <inttypes.h>\n\n/* Mosquitto Callbacks */\nvoid my_message_callback(struct mosquitto *mosq, void *userdata, const struct mosquitto_message *message);\nvoid my_connect_callback(struct mosquitto *mosq, void *userdata, int result);\nvoid my_subscribe_callback(struct mosquitto *mosq, void *userdata, int mid, int qos_count, const int *granted_qos);\nvoid my_log_callback(struct mosquitto *mosq, void *userdata, int level, const char *str);\n\n/* Local Functions */\nvoid publisher(struct mosquitto *mosq, char *topic, void *buf, unsigned len);\nvoid publish_births(struct mosquitto *mosq);\nvoid publish_node_birth(struct mosquitto *mosq);\nvoid publish_device_birth(struct mosquitto *mosq);\nvoid publish_ddata_message(struct mosquitto *mosq);\n\nint main(int argc, char *argv[]) {\n\n    // MQTT Parameters\n    char *host = \"localhost\";\n    int port = 1883;\n    int keepalive = 60;\n    bool clean_session = true;\n    struct mosquitto *mosq = NULL;\n\n    // MQTT Setup\n    srand(time(NULL));\n    mosquitto_lib_init();\n    mosq = mosquitto_new(NULL, clean_session, NULL);\n    if (!mosq) {\n        fprintf(stderr, \"Error: Out of memory.\\n\");\n        return 1;\n    }\n    mosquitto_log_callback_set(mosq, my_log_callback);\n    
mosquitto_connect_callback_set(mosq, my_connect_callback);\n    mosquitto_message_callback_set(mosq, my_message_callback);\n    mosquitto_subscribe_callback_set(mosq, my_subscribe_callback);\n    mosquitto_username_pw_set(mosq, \"admin\", \"changeme\");\n    mosquitto_will_set(mosq, \"spBv1.0/Sparkplug B Devices/NDEATH/C Edge Node 1\", 0, NULL, 0, false);\n\n    // Optional SSL parameters for MQTT\n    //mosquitto_tls_insecure_set(mosq, true);\n    //mosquitto_tls_opts_set(mosq, 0, \"tlsv1.2\", NULL);               // 0 is DO NOT SSL_VERIFY_PEER\n\n    // MQTT Connect\n    if (mosquitto_connect(mosq, host, port, keepalive)) {\n        fprintf(stderr, \"Unable to connect.\\n\");\n        return 1;\n    }\n\n    // Publish the NBIRTH and DBIRTH Sparkplug messages (Birth Certificates)\n    publish_births(mosq);\n\n    // Loop and publish more DDATA messages every 5 seconds.  Note this should only be done in real/production\n    // scenarios with change events on inputs.  Because Sparkplug ensures state there is no reason to send DDATA\n    // messages unless the state of a I/O point has changed.\n    int i;\n    for (i = 0; i < 100; i++) {\n        publish_ddata_message(mosq);\n        int j;\n        for (j = 0; j < 50; j++) {\n            usleep(100000);\n            mosquitto_loop(mosq, 0, 1);\n        }\n    }\n\n    // Close and cleanup\n    mosquitto_destroy(mosq);\n    mosquitto_lib_cleanup();\n    return 0;\n}\n\n/*\n * Helper function to publish MQTT messages to the MQTT server\n */\nvoid publisher(struct mosquitto *mosq, char *topic, void *buf, unsigned len) {\n    // publish the data\n    mosquitto_publish(mosq, NULL, topic, len, buf, 0, false);\n}\n\n/*\n * Callback for incoming MQTT messages. 
Since this is a Sparkplug implementation these will be NCMD and DCMD messages\n */\nvoid my_message_callback(struct mosquitto *mosq, void *userdata, const struct mosquitto_message *message) {\n\n    if (message->payloadlen) {\n        fprintf(stdout, \"%s :: %d\\n\", message->topic, message->payloadlen);\n    } else {\n        fprintf(stdout, \"%s (null)\\n\", message->topic);\n    }\n    fflush(stdout);\n\n    // Decode the payload\n    org_eclipse_tahu_protobuf_Payload inbound_payload = org_eclipse_tahu_protobuf_Payload_init_zero;\n    if (decode_payload(&inbound_payload, message->payload, message->payloadlen)) {\n    } else {\n        fprintf(stderr, \"Failed to decode the payload\\n\");\n        return;  // JPL 04/06/17...\n    }\n\n    // Get the number of metrics in the payload and iterate over them handling them as needed\n    int i;\n    for (i = 0; i < inbound_payload.metrics_count; i++) {\n        // Handle the incoming message as necessary - start with the 'Node Control' metrics\n        // JPL 04/06/17... 
Handle ALIAS metrics versus text-name based metrics\n        if (inbound_payload.metrics[i].name == NULL) {  // alias 0 to 2\n            switch (inbound_payload.metrics[i].alias) {\n            case 0:  // Next Server\n                fprintf(stderr, \"Using Next Configured MQtt Server\\n\");\n                break;\n\n            case 1:  // Resend Births\n                fprintf(stderr, \"Resend Birth Certificates\\n\");\n                publish_births(mosq);\n                break;\n\n            case 2:  // Next Server\n                fprintf(stderr, \"REBOOT Operating system\\n\");\n                //system(\"reboot\");\n                break;\n            }\n        } else if (strcmp(inbound_payload.metrics[i].name, \"Node Control/Next Server\") == 0) {\n            // 'Node Control/Next Server' is an NCMD used to tell the device/client application to\n            // disconnect from the current MQTT server and connect to the next MQTT server in the\n            // list of available servers.  This is used for clients that have a pool of MQTT servers\n            // to connect to.\n            fprintf(stderr, \"'Node Control/Next Server' is not implemented in this example\\n\");\n        } else if (strcmp(inbound_payload.metrics[i].name, \"Node Control/Rebirth\") == 0) {\n            // 'Node Control/Rebirth' is an NCMD used to tell the device/client application to resend\n            // its full NBIRTH and DBIRTH again.  MQTT Engine will send this NCMD to a device/client\n            // application if it receives an NDATA or DDATA with a metric that was not published in the\n            // original NBIRTH or DBIRTH.  
This is why the application must send all known metrics in\n            // its original NBIRTH and DBIRTH messages.\n            publish_births(mosq);\n        } else if (strcmp(inbound_payload.metrics[i].name, \"Node Control/Reboot\") == 0) {\n            // 'Node Control/Reboot' is an NCMD used to tell a device/client application to reboot\n            // This can be used for devices that need a full application reset via a soft reboot.\n            // In this case, we fake a full reboot with a republishing of the NBIRTH and DBIRTH\n            // messages.\n            publish_births(mosq);\n        } else {\n            fprintf(stderr, \"Unknown CMD: %s\\n\", inbound_payload.metrics[i].name);\n        }\n    }\n    free_payload(&inbound_payload);  // JPL 04/06/17...\n}\n\n/*\n * Callback for successful or unsuccessful MQTT connect.  Upon successful connect, subscribe to our Sparkplug NCMD and DCMD messages.\n * A production application should handle MQTT connect failures and reattempt as necessary.\n */\nvoid my_connect_callback(struct mosquitto *mosq, void *userdata, int result) {\n    if (!result) {\n        // Subscribe to commands\n        mosquitto_subscribe(mosq, NULL, \"spBv1.0/Sparkplug B Devices/NCMD/C Edge Node 1/#\", 0);\n        mosquitto_subscribe(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DCMD/C Edge Node 1/#\", 0);\n    } else {\n        fprintf(stderr, \"MQTT Connect failed\\n\");\n    }\n}\n\n/*\n * Callback for successful MQTT subscriptions.\n */\nvoid my_subscribe_callback(struct mosquitto *mosq, void *userdata, int mid, int qos_count, const int *granted_qos) {\n    int i;\n\n    fprintf(stdout, \"Subscribed (mid: %d): %d\", mid, granted_qos[0]);\n    for (i = 1; i < qos_count; i++) {\n        fprintf(stdout, \", %d\", granted_qos[i]);\n    }\n    fprintf(stdout, \"\\n\");\n}\n\n/*\n * MQTT logger callback\n */\nvoid my_log_callback(struct mosquitto *mosq, void *userdata, int level, const char *str) {\n    // Print all log messages regardless 
of level.\n    fprintf(stdout, \"%s\\n\", str);\n}\n\n/*\n * Helper to publish the Sparkplug NBIRTH and DBIRTH messages after initial MQTT connect.\n * This is also used for Rebirth requests from the backend.\n */\nvoid publish_births(struct mosquitto *mosq) {\n    // Initialize the sequence number for Sparkplug MQTT messages\n    // This must be zero on every NBIRTH publish\n\n    // Publish the NBIRTH\n    publish_node_birth(mosq);\n\n    // Publish the DBIRTH\n    publish_device_birth(mosq);\n}\n\n/*\n * Helper function to publish a NBIRTH message.  The NBIRTH should include all 'node control' metrics that denote device capability.\n * In addition, it should include every node metric that may ever be published from this edge node.  If any NDATA messages arrive at\n * MQTT Engine that were not included in the NBIRTH, MQTT Engine will request a Rebirth from the device.\n */\nvoid publish_node_birth(struct mosquitto *mosq) {\n    // Create the NBIRTH payload\n    org_eclipse_tahu_protobuf_Payload nbirth_payload;\n    // Initialize the sequence number for Sparkplug MQTT messages\n    // This must be zero on every NBIRTH publish\n    reset_sparkplug_sequence();\n    get_next_payload(&nbirth_payload);\n\n    // Add node control metrics\n    fprintf(stdout, \"Adding metric: 'Node Control/Next Server'\\n\");\n    bool next_server_value = false;\n    add_simple_metric(&nbirth_payload, \"Node Control/Next Server\", true, 0, METRIC_DATA_TYPE_BOOLEAN, false, false, &next_server_value, sizeof(next_server_value));\n    fprintf(stdout, \"Adding metric: 'Node Control/Rebirth'\\n\");\n    bool rebirth_value = false;\n    add_simple_metric(&nbirth_payload, \"Node Control/Rebirth\", true, 1, METRIC_DATA_TYPE_BOOLEAN, false, false, &rebirth_value, sizeof(rebirth_value));\n    fprintf(stdout, \"Adding metric: 'Node Control/Reboot'\\n\");\n    bool reboot_value = false;\n    add_simple_metric(&nbirth_payload, \"Node Control/Reboot\", true, 2, METRIC_DATA_TYPE_BOOLEAN, false, false, 
&reboot_value, sizeof(reboot_value));\n\n    // Create a metric called 'My Real Metric' which will be a member of the Template definition - note aliases do not apply to Template members\n    org_eclipse_tahu_protobuf_Payload_Metric my_real_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t my_real_metric_value = 0;      // Default value\n    init_metric(&my_real_metric, \"My Real Metric\", false, 0, METRIC_DATA_TYPE_INT32, false, false, &my_real_metric_value, sizeof(my_real_metric_value));\n\n    // Create some Template Parameters - In this example we're using them as custom properties of a regular metric via a Template\n    org_eclipse_tahu_protobuf_Payload_Template_Parameter parameter_one = org_eclipse_tahu_protobuf_Payload_Template_Parameter_init_default;\n\n    parameter_one.name = strdup(\"MyPropKey1\");\n    parameter_one.has_type = true;\n    parameter_one.type = PARAMETER_DATA_TYPE_STRING;\n    parameter_one.which_value = org_eclipse_tahu_protobuf_Payload_Template_Parameter_string_value_tag;\n\n    parameter_one.value.string_value = strdup(\"MyDefaultPropValue1\");\n    org_eclipse_tahu_protobuf_Payload_Template_Parameter parameter_two = org_eclipse_tahu_protobuf_Payload_Template_Parameter_init_default;\n    parameter_two.name = strdup(\"MyPropKey2\");\n    parameter_two.has_type = true;\n    parameter_two.type = PARAMETER_DATA_TYPE_INT32;\n    parameter_two.which_value = org_eclipse_tahu_protobuf_Payload_Template_Parameter_int_value_tag;\n    parameter_two.value.int_value = 0;      // Default value\n\n    org_eclipse_tahu_protobuf_Payload_Template_Parameter parameter_three = org_eclipse_tahu_protobuf_Payload_Template_Parameter_init_default;\n    parameter_three.name = strdup(\"MyPropKey3\");\n    parameter_three.has_type = true;\n    parameter_three.type = PARAMETER_DATA_TYPE_FLOAT;\n    parameter_three.which_value = org_eclipse_tahu_protobuf_Payload_Template_Parameter_float_value_tag;\n    parameter_three.value.float_value = 
0.0;    // Default value\n\n    // Create the Template definition value which includes the single Template members and parameters which are custom properties of the 'real metric'\n    org_eclipse_tahu_protobuf_Payload_Template udt_template = org_eclipse_tahu_protobuf_Payload_Template_init_default;\n    udt_template.metrics_count = 1;\n    udt_template.metrics = (org_eclipse_tahu_protobuf_Payload_Metric *)calloc(1, sizeof(org_eclipse_tahu_protobuf_Payload_Metric));\n    udt_template.metrics[0] = my_real_metric;\n    udt_template.parameters_count = 3;\n    udt_template.parameters = (org_eclipse_tahu_protobuf_Payload_Template_Parameter *)calloc(3, sizeof(org_eclipse_tahu_protobuf_Payload_Template_Parameter));\n    udt_template.parameters[0] = parameter_one;\n    udt_template.parameters[1] = parameter_two;\n    udt_template.parameters[2] = parameter_three;\n    udt_template.template_ref = NULL;\n    udt_template.has_is_definition = true;\n    udt_template.is_definition = true;\n\n    // Create the root Template definition and add the Template definition value which includes the Template members and parameters\n    org_eclipse_tahu_protobuf_Payload_Metric metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    init_metric(&metric, \"_types_/My Metric Definition\", true, 3, METRIC_DATA_TYPE_TEMPLATE, false, false, &udt_template, sizeof(udt_template));\n\n    // Add the Template to the payload\n    add_metric_to_payload(&nbirth_payload, &metric);\n\n#ifdef SPARKPLUG_DEBUG\n    // Print the payload for debug\n    print_payload(&nbirth_payload);\n#endif\n\n    // Encode the payload into a binary format so it can be published in the MQTT message.\n    // The binary_buffer must be large enough to hold the contents of the binary payload\n    size_t buffer_length = 1024;\n    uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n    size_t message_length = encode_payload(binary_buffer, buffer_length, &nbirth_payload);\n\n    // Publish the 
NBIRTH on the appropriate topic\n    mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/NBIRTH/C Edge Node 1\", message_length, binary_buffer, 0, false);\n\n    // Free the memory\n    free(binary_buffer);\n    free_payload(&nbirth_payload);\n}\n\nvoid publish_device_birth(struct mosquitto *mosq) {\n    // Create the DBIRTH payload\n    org_eclipse_tahu_protobuf_Payload dbirth_payload;\n    get_next_payload(&dbirth_payload);\n\n    // Add a metric with a custom property.  For use with Ignition, in order to see this as a Tag Property - it must be a known Ignition Tag Property.\n    fprintf(stdout, \"Adding metric: 'Device Metric1'\\n\");\n    org_eclipse_tahu_protobuf_Payload_Metric prop_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t nbirth_metric_two_value = 13;\n    init_metric(&prop_metric, \"Device Metric1\", true, 4, METRIC_DATA_TYPE_INT16, false, false, &nbirth_metric_two_value, sizeof(nbirth_metric_two_value));\n    org_eclipse_tahu_protobuf_Payload_PropertySet properties = org_eclipse_tahu_protobuf_Payload_PropertySet_init_default;\n    add_property_to_set(&properties, \"engUnit\", PROPERTY_DATA_TYPE_STRING, \"MyCustomUnits\", sizeof(\"MyCustomUnits\"));\n    add_propertyset_to_metric(&prop_metric, &properties);\n    add_metric_to_payload(&dbirth_payload, &prop_metric);\n\n    // Create a metric called 'My Real Metric' for the Template instance\n    org_eclipse_tahu_protobuf_Payload_Metric my_real_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t my_real_metric_value = 123;    // Not a default - this is the actual value of the instance\n    init_metric(&my_real_metric, \"My Real Metric\", false, 0, METRIC_DATA_TYPE_INT32, false, false, &my_real_metric_value, sizeof(my_real_metric_value));\n\n    // Create some Template/UDT instance Parameters - in this example they represent custom tag properties\n    org_eclipse_tahu_protobuf_Payload_Template_Parameter parameter_one = 
org_eclipse_tahu_protobuf_Payload_Template_Parameter_init_default;\n    parameter_one.name = strdup(\"MyPropKey1\");\n    parameter_one.has_type = true;\n    parameter_one.type = PARAMETER_DATA_TYPE_STRING;\n    parameter_one.which_value = org_eclipse_tahu_protobuf_Payload_Template_Parameter_string_value_tag;\n    parameter_one.value.string_value = strdup(\"MyInstancePropValue1\");\n\n    org_eclipse_tahu_protobuf_Payload_Template_Parameter parameter_two = org_eclipse_tahu_protobuf_Payload_Template_Parameter_init_default;\n    parameter_two.name = strdup(\"MyPropKey2\");\n    parameter_two.has_type = true;\n    parameter_two.type = PARAMETER_DATA_TYPE_INT32;\n    parameter_two.which_value = org_eclipse_tahu_protobuf_Payload_Template_Parameter_int_value_tag;\n    parameter_two.value.int_value = 1089;\n\n    org_eclipse_tahu_protobuf_Payload_Template_Parameter parameter_three = org_eclipse_tahu_protobuf_Payload_Template_Parameter_init_default;\n    parameter_three.name = strdup(\"MyPropKey3\");\n    parameter_three.has_type = true;\n    parameter_three.type = PARAMETER_DATA_TYPE_FLOAT;\n    parameter_three.which_value = org_eclipse_tahu_protobuf_Payload_Template_Parameter_float_value_tag;\n    parameter_three.value.float_value = 12.34;\n\n    // Create the Template instance value which includes the Template members and parameters\n    org_eclipse_tahu_protobuf_Payload_Template udt_template = org_eclipse_tahu_protobuf_Payload_Template_init_default;\n    udt_template.version = NULL;\n    udt_template.metrics_count = 1;\n    udt_template.metrics = (org_eclipse_tahu_protobuf_Payload_Metric *)calloc(1, sizeof(org_eclipse_tahu_protobuf_Payload_Metric));\n    udt_template.metrics[0] = my_real_metric;\n    udt_template.parameters_count = 3;\n    udt_template.parameters = (org_eclipse_tahu_protobuf_Payload_Template_Parameter *)calloc(3, sizeof(org_eclipse_tahu_protobuf_Payload_Template_Parameter));\n    udt_template.parameters[0] = parameter_one;\n    
udt_template.parameters[1] = parameter_two;\n    udt_template.parameters[2] = parameter_three;\n    udt_template.template_ref = strdup(\"My Metric Definition\");\n    udt_template.has_is_definition = true;\n    udt_template.is_definition = false;\n\n    // Create the root Template instance and add the Template instance value\n    org_eclipse_tahu_protobuf_Payload_Metric metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    init_metric(&metric, \"My Metric Instance 1\", true, 5, METRIC_DATA_TYPE_TEMPLATE, false, false, &udt_template, sizeof(udt_template));\n\n    // Add the Template Instance to the payload\n    add_metric_to_payload(&dbirth_payload, &metric);\n\n#ifdef SPARKPLUG_DEBUG\n    // Print the payload\n    print_payload(&dbirth_payload);\n#endif\n\n    // Encode the payload into a binary format so it can be published in the MQTT message.\n    // The binary_buffer must be large enough to hold the contents of the binary payload\n    size_t buffer_length = 1024;\n    uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n    size_t message_length = encode_payload(binary_buffer, buffer_length, &dbirth_payload);\n\n    // Publish the DBIRTH on the appropriate topic\n    mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DBIRTH/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n    // Free the memory\n    free(binary_buffer);\n    free_payload(&dbirth_payload);\n}\n\nvoid publish_ddata_message(struct mosquitto *mosq) {\n    // Create the DDATA payload\n    org_eclipse_tahu_protobuf_Payload ddata_payload;\n    get_next_payload(&ddata_payload);\n\n    // Update the metric called 'My Real Metric' for the Template instance to update the 'real' metric value\n    org_eclipse_tahu_protobuf_Payload_Metric my_real_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t my_real_metric_value = rand(); // Not a default - this is the actual value of the metric of instance\n    
init_metric(&my_real_metric, \"My Real Metric\", false, 0, METRIC_DATA_TYPE_INT32, false, false, &my_real_metric_value, sizeof(my_real_metric_value));\n\n    // Create the Template instance value which includes the Template members and parameters\n    org_eclipse_tahu_protobuf_Payload_Template udt_template = org_eclipse_tahu_protobuf_Payload_Template_init_default;\n    udt_template.version = NULL;\n    udt_template.metrics_count = 1;\n    udt_template.metrics = (org_eclipse_tahu_protobuf_Payload_Metric *)calloc(1, sizeof(org_eclipse_tahu_protobuf_Payload_Metric));\n    udt_template.metrics[0] = my_real_metric;\n    udt_template.has_is_definition = true;\n    udt_template.is_definition = false;\n\n    // Create the root Template instance and add the Template instance value\n    org_eclipse_tahu_protobuf_Payload_Metric metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    init_metric(&metric, \"My Metric Instance 1\", true, 5, METRIC_DATA_TYPE_TEMPLATE, false, false, &udt_template, sizeof(udt_template));\n\n    add_metric_to_payload(&ddata_payload, &metric);\n\n#ifdef SPARKPLUG_DEBUG\n    // Print the payload\n    print_payload(&ddata_payload);\n#endif\n\n    // Encode the payload into a binary format so it can be published in the MQTT message.\n    // The binary_buffer must be large enough to hold the contents of the binary payload\n    size_t buffer_length = 1024;\n    uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n    size_t message_length = encode_payload(binary_buffer, buffer_length, &ddata_payload);\n\n    // Publish the DDATA on the appropriate topic\n    mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DDATA/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n    // Free the memory\n    free(binary_buffer);\n    free_payload(&ddata_payload);\n}\n"
  },
  {
    "path": "c/examples/udt_example/Makefile",
    "content": "#/********************************************************************************\n# * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n# *\n# * This program and the accompanying materials are made available under the\n# * terms of the Eclipse Public License 2.0 which is available at\n# * http://www.eclipse.org/legal/epl-2.0.\n# *\n# * SPDX-License-Identifier: EPL-2.0\n# *\n# * Contributors:\n# *   Cirrus Link Solutions - initial implementation\n# ********************************************************************************/\n\nTARGET = example\nLIBS = ../../../../client_libraries/c/lib/libtahu.a -Llib -L/usr/local/lib -lmosquitto\nCC = gcc\nCFLAGS = -g -Wall -I../../../../client_libraries/c/include/\n\n.PHONY: default all clean\n\ndefault: $(TARGET)\nall: default\n\nOBJECTS = $(patsubst %.c, %.o, $(wildcard *.c))\nHEADERS = $(wildcard *.h)\n\n%.o: %.c $(HEADERS)\n\t$(CC) $(CFLAGS) -c $< -o $@\n\n.PRECIOUS: $(TARGET) $(OBJECTS)\n\n$(TARGET): $(OBJECTS)\n\t$(CC) $(OBJECTS) -Wall $(LIBS) -o $@\n\nclean:\n\t-rm -f *.o\n\t-rm -f $(TARGET)\n"
  },
  {
    "path": "c/examples/udt_example/example.c",
    "content": "/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n#include <stdio.h>\n#include <stdlib.h>\n#include <stdbool.h>\n#include <unistd.h>\n#include <tahu.h>\n#include <tahu.pb.h>\n#include <pb_decode.h>\n#include <pb_encode.h>\n#include <mosquitto.h>\n#include <inttypes.h>\n\n/* Mosquitto Callbacks */\nvoid my_message_callback(struct mosquitto *mosq, void *userdata, const struct mosquitto_message *message);\nvoid my_connect_callback(struct mosquitto *mosq, void *userdata, int result);\nvoid my_subscribe_callback(struct mosquitto *mosq, void *userdata, int mid, int qos_count, const int *granted_qos);\nvoid my_log_callback(struct mosquitto *mosq, void *userdata, int level, const char *str);\n\n/* Local Functions */\nvoid publisher(struct mosquitto *mosq, char *topic, void *buf, unsigned len);\nvoid publish_births(struct mosquitto *mosq);\nvoid publish_node_birth(struct mosquitto *mosq);\nvoid publish_device_birth(struct mosquitto *mosq);\nvoid publish_ddata_message(struct mosquitto *mosq);\n\nenum alias_map {\n    Next_Server = 0,\n    Rebirth = 1,\n    Reboot = 2,\n    Dataset = 3,\n    Node_Metric0 = 4,\n    Node_Metric1 = 5,\n    Node_Metric2 = 6,\n    Device_Metric0 = 7,\n    Device_Metric1 = 8,\n    Device_Metric2 = 9,\n    Device_Metric3 = 10,\n    My_Custom_Motor = 11\n};\n\nint main(int argc, char *argv[]) {\n\n    // MQTT Parameters\n    char *host = \"localhost\";\n    int port = 1883;\n    int keepalive = 60;\n    bool clean_session = true;\n    struct mosquitto *mosq = 
NULL;\n\n    // MQTT Setup\n    srand(time(NULL));\n    mosquitto_lib_init();\n    mosq = mosquitto_new(NULL, clean_session, NULL);\n    if (!mosq) {\n        fprintf(stderr, \"Error: Out of memory.\\n\");\n        return 1;\n    }\n    mosquitto_log_callback_set(mosq, my_log_callback);\n    mosquitto_connect_callback_set(mosq, my_connect_callback);\n    mosquitto_message_callback_set(mosq, my_message_callback);\n    mosquitto_subscribe_callback_set(mosq, my_subscribe_callback);\n    mosquitto_username_pw_set(mosq, \"admin\", \"changeme\");\n    mosquitto_will_set(mosq, \"spBv1.0/Sparkplug B Devices/NDEATH/C Edge Node 1\", 0, NULL, 0, false);\n\n    // Optional SSL parameters for MQTT\n    //mosquitto_tls_insecure_set(mosq, true);\n    //mosquitto_tls_opts_set(mosq, 0, \"tlsv1.2\", NULL);               // 0 is DO NOT SSL_VERIFY_PEER\n\n    // MQTT Connect\n    if (mosquitto_connect(mosq, host, port, keepalive)) {\n        fprintf(stderr, \"Unable to connect.\\n\");\n        return 1;\n    }\n\n    // Publish the NBIRTH and DBIRTH Sparkplug messages (Birth Certificates)\n    publish_births(mosq);\n\n    // Loop and publish more DDATA messages every 5 seconds.  Note this should only be done in real/production\n    // scenarios with change events on inputs.  
Because Sparkplug ensures state there is no reason to send DDATA\n    // messages unless the state of an I/O point has changed.\n    int i;\n    for (i = 0; i < 100; i++) {\n        publish_ddata_message(mosq);\n        int j;\n        for (j = 0; j < 50; j++) {\n            usleep(10000);\n            mosquitto_loop(mosq, 0, 1);\n        }\n    }\n\n    // Close and cleanup\n    mosquitto_destroy(mosq);\n    mosquitto_lib_cleanup();\n    return 0;\n}\n\n/*\n * Helper function to publish MQTT messages to the MQTT server\n */\nvoid publisher(struct mosquitto *mosq, char *topic, void *buf, unsigned len) {\n    // publish the data\n    mosquitto_publish(mosq, NULL, topic, len, buf, 0, false);\n}\n\n/*\n * Callback for incoming MQTT messages. Since this is a Sparkplug implementation these will be NCMD and DCMD messages\n */\nvoid my_message_callback(struct mosquitto *mosq, void *userdata, const struct mosquitto_message *message) {\n\n    if (message->payloadlen) {\n        fprintf(stdout, \"%s :: %d\\n\", message->topic, message->payloadlen);\n    } else {\n        fprintf(stdout, \"%s (null)\\n\", message->topic);\n    }\n    fflush(stdout);\n\n    // Decode the payload - bail out early on failure so we never iterate over an undecoded payload\n    org_eclipse_tahu_protobuf_Payload inbound_payload = org_eclipse_tahu_protobuf_Payload_init_zero;\n    if (!decode_payload(&inbound_payload, message->payload, message->payloadlen)) {\n        fprintf(stderr, \"Failed to decode the payload\\n\");\n        return;\n    }\n\n    // Get the number of metrics in the payload and iterate over them handling them as needed\n    int i;\n    for (i = 0; i < inbound_payload.metrics_count; i++) {\n\n        if (inbound_payload.metrics[i].name != NULL) {\n            // Handle the incoming message as necessary - start with the 'Node Control' metrics\n            if (strcmp(inbound_payload.metrics[i].name, \"Node Control/Next Server\") == 0) {\n                // 'Node Control/Next Server' is an NCMD used to tell the device/client application to\n                // 
disconnect from the current MQTT server and connect to the next MQTT server in the\n                // list of available servers.  This is used for clients that have a pool of MQTT servers\n                // to connect to.\n                fprintf(stderr, \"'Node Control/Next Server' is not implemented in this example\\n\");\n            } else if (strcmp(inbound_payload.metrics[i].name, \"Node Control/Rebirth\") == 0) {\n                // 'Node Control/Rebirth' is an NCMD used to tell the device/client application to resend\n                // its full NBIRTH and DBIRTH again.  MQTT Engine will send this NCMD to a device/client\n                // application if it receives an NDATA or DDATA with a metric that was not published in the\n                // original NBIRTH or DBIRTH.  This is why the application must send all known metrics in\n                // its original NBIRTH and DBIRTH messages.\n                publish_births(mosq);\n            } else if (strcmp(inbound_payload.metrics[i].name, \"Node Control/Reboot\") == 0) {\n                // 'Node Control/Reboot' is an NCMD used to tell a device/client application to reboot\n                // This can be used for devices that need a full application reset via a soft reboot.\n                // In this case, we fake a full reboot with a republishing of the NBIRTH and DBIRTH\n                // messages.\n                publish_births(mosq);\n            } else if (strcmp(inbound_payload.metrics[i].name, \"output/Device Metric2\") == 0) {\n                // This is a metric we declared in our DBIRTH message and we're emulating an output.\n                // So, on incoming 'writes' to the output we must publish a DDATA with the new output\n                // value.  
If this were a real output we'd write to the output and then read it back\n                // before publishing a DDATA message.\n\n                // We know this is an Int16 because of how we declared it in the DBIRTH\n                uint32_t new_value = inbound_payload.metrics[i].value.int_value;\n                fprintf(stdout, \"CMD message for output/Device Metric2 - New Value: %\" PRIu32 \"\\n\", new_value);\n\n                // Create the DDATA payload\n                org_eclipse_tahu_protobuf_Payload ddata_payload;\n                get_next_payload(&ddata_payload);\n                // Note the Metric name 'output/Device Metric2' is not needed because we're using aliases\n                add_simple_metric(&ddata_payload, NULL, true, Device_Metric2, METRIC_DATA_TYPE_INT16, false, false, &new_value, sizeof(new_value));\n\n                // Encode the payload into a binary format so it can be published in the MQTT message.\n                // The binary_buffer must be large enough to hold the contents of the binary payload\n                size_t buffer_length = 128;\n                uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n                size_t message_length = encode_payload(binary_buffer, buffer_length, &ddata_payload);\n\n                // Publish the DDATA on the appropriate topic\n                mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DDATA/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n                // Free the memory\n                free(binary_buffer);\n                free_payload(&ddata_payload);\n            } else if (strcmp(inbound_payload.metrics[i].name, \"output/Device Metric3\") == 0) {\n                // This is a metric we declared in our DBIRTH message and we're emulating an output.\n                // So, on incoming 'writes' to the output we must publish a DDATA with the new output\n                // value.  
If this were a real output we'd write to the output and then read it back\n                // before publishing a DDATA message.\n\n                // We know this is a Boolean because of how we declared it in the DBIRTH\n                bool new_value = inbound_payload.metrics[i].value.boolean_value;\n                fprintf(stdout, \"CMD message for output/Device Metric3 - New Value: %s\\n\", new_value ? \"true\" : \"false\");\n\n                // Create the DDATA payload\n                org_eclipse_tahu_protobuf_Payload ddata_payload;\n                get_next_payload(&ddata_payload);\n                // Note the Metric name 'output/Device Metric3' is not needed because we're using aliases\n                add_simple_metric(&ddata_payload, NULL, true, Device_Metric3, METRIC_DATA_TYPE_BOOLEAN, false, false, &new_value, sizeof(new_value));\n\n                // Encode the payload into a binary format so it can be published in the MQTT message.\n                // The binary_buffer must be large enough to hold the contents of the binary payload\n                size_t buffer_length = 128;\n                uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n                size_t message_length = encode_payload(binary_buffer, buffer_length, &ddata_payload);\n\n                // Publish the DDATA on the appropriate topic\n                mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DDATA/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n                // Free the memory\n                free(binary_buffer);\n                free_payload(&ddata_payload);\n            } else {\n                fprintf(stderr, \"Unknown CMD: %s\\n\", inbound_payload.metrics[i].name);\n            }\n        } else if (inbound_payload.metrics[i].has_alias) {\n            // Handle the incoming message as necessary - start with the 'Node Control' metrics\n            if (inbound_payload.metrics[i].alias == 
Next_Server) {\n                // 'Node Control/Next Server' is an NCMD used to tell the device/client application to\n                // disconnect from the current MQTT server and connect to the next MQTT server in the\n                // list of available servers.  This is used for clients that have a pool of MQTT servers\n                // to connect to.\n                fprintf(stderr, \"'Node Control/Next Server' is not implemented in this example\\n\");\n            } else if (inbound_payload.metrics[i].alias == Rebirth) {\n                // 'Node Control/Rebirth' is an NCMD used to tell the device/client application to resend\n                // its full NBIRTH and DBIRTH again.  MQTT Engine will send this NCMD to a device/client\n                // application if it receives an NDATA or DDATA with a metric that was not published in the\n                // original NBIRTH or DBIRTH.  This is why the application must send all known metrics in\n                // its original NBIRTH and DBIRTH messages.\n                publish_births(mosq);\n            } else if (inbound_payload.metrics[i].alias == Reboot) {\n                // 'Node Control/Reboot' is an NCMD used to tell a device/client application to reboot\n                // This can be used for devices that need a full application reset via a soft reboot.\n                // In this case, we fake a full reboot with a republishing of the NBIRTH and DBIRTH\n                // messages.\n                publish_births(mosq);\n            } else if (inbound_payload.metrics[i].alias == Device_Metric2) {\n                // This is a metric we declared in our DBIRTH message and we're emulating an output.\n                // So, on incoming 'writes' to the output we must publish a DDATA with the new output\n                // value.  
If this were a real output we'd write to the output and then read it back\n                // before publishing a DDATA message.\n\n                // We know this is an Int16 because of how we declared it in the DBIRTH\n                uint32_t new_value = inbound_payload.metrics[i].value.int_value;\n                fprintf(stdout, \"CMD message for output/Device Metric2 - New Value: %\" PRIu32 \"\\n\", new_value);\n\n                // Create the DDATA payload\n                org_eclipse_tahu_protobuf_Payload ddata_payload;\n                get_next_payload(&ddata_payload);\n                // Note the Metric name 'output/Device Metric2' is not needed because we're using aliases\n                add_simple_metric(&ddata_payload, NULL, true, Device_Metric2, METRIC_DATA_TYPE_INT16, false, false, &new_value, sizeof(new_value));\n\n                // Encode the payload into a binary format so it can be published in the MQTT message.\n                // The binary_buffer must be large enough to hold the contents of the binary payload\n                size_t buffer_length = 128;\n                uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n                size_t message_length = encode_payload(binary_buffer, buffer_length, &ddata_payload);\n\n                // Publish the DDATA on the appropriate topic\n                mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DDATA/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n                // Free the memory\n                free(binary_buffer);\n                free_payload(&ddata_payload);\n            } else if (inbound_payload.metrics[i].alias == Device_Metric3) {\n                // This is a metric we declared in our DBIRTH message and we're emulating an output.\n                // So, on incoming 'writes' to the output we must publish a DDATA with the new output\n                // value.  
If this were a real output we'd write to the output and then read it back\n                // before publishing a DDATA message.\n\n                // We know this is a Boolean because of how we declared it in the DBIRTH\n                bool new_value = inbound_payload.metrics[i].value.boolean_value;\n                fprintf(stdout, \"CMD message for output/Device Metric3 - New Value: %s\\n\", new_value ? \"true\" : \"false\");\n\n                // Create the DDATA payload\n                org_eclipse_tahu_protobuf_Payload ddata_payload;\n                get_next_payload(&ddata_payload);\n                // Note the Metric name 'output/Device Metric3' is not needed because we're using aliases\n                add_simple_metric(&ddata_payload, NULL, true, Device_Metric3, METRIC_DATA_TYPE_BOOLEAN, false, false, &new_value, sizeof(new_value));\n\n                // Encode the payload into a binary format so it can be published in the MQTT message.\n                // The binary_buffer must be large enough to hold the contents of the binary payload\n                size_t buffer_length = 128;\n                uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n                size_t message_length = encode_payload(binary_buffer, buffer_length, &ddata_payload);\n\n                // Publish the DDATA on the appropriate topic\n                mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DDATA/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n                // Free the memory\n                free(binary_buffer);\n                free_payload(&ddata_payload);\n            } else {\n                fprintf(stderr, \"Unknown CMD: %\" PRIu64 \"\\n\", inbound_payload.metrics[i].alias);\n            }\n        } else {\n            fprintf(stdout, \"Not a metric name or alias??\\n\");\n        }\n    }\n}\n\n/*\n * Callback for successful or unsuccessful MQTT connect.  
Upon successful connect, subscribe to our Sparkplug NCMD and DCMD messages.\n * A production application should handle MQTT connect failures and reattempt as necessary.\n */\nvoid my_connect_callback(struct mosquitto *mosq, void *userdata, int result) {\n    if (!result) {\n        // Subscribe to commands\n        mosquitto_subscribe(mosq, NULL, \"spBv1.0/Sparkplug B Devices/NCMD/C Edge Node 1/#\", 0);\n        mosquitto_subscribe(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DCMD/C Edge Node 1/#\", 0);\n    } else {\n        fprintf(stderr, \"MQTT Connect failed\\n\");\n    }\n}\n\n/*\n * Callback for successful MQTT subscriptions.\n */\nvoid my_subscribe_callback(struct mosquitto *mosq, void *userdata, int mid, int qos_count, const int *granted_qos) {\n    int i;\n\n    fprintf(stdout, \"Subscribed (mid: %d): %d\", mid, granted_qos[0]);\n    for (i = 1; i < qos_count; i++) {\n        fprintf(stdout, \", %d\", granted_qos[i]);\n    }\n    fprintf(stdout, \"\\n\");\n}\n\n/*\n * MQTT logger callback\n */\nvoid my_log_callback(struct mosquitto *mosq, void *userdata, int level, const char *str) {\n    // Print all log messages regardless of level.\n    fprintf(stdout, \"%s\\n\", str);\n}\n\n/*\n * Helper to publish the Sparkplug NBIRTH and DBIRTH messages after initial MQTT connect.\n * This is also used for Rebirth requests from the backend.\n */\nvoid publish_births(struct mosquitto *mosq) {\n    // Initialize the sequence number for Sparkplug MQTT messages\n    // This must be zero on every NBIRTH publish\n\n    // Publish the NBIRTH\n    publish_node_birth(mosq);\n\n    // Publish the DBIRTH\n    publish_device_birth(mosq);\n}\n\n/*\n * Helper function to publish a NBIRTH message.  The NBIRTH should include all 'node control' metrics that denote device capability.\n * In addition, it should include every node metric that may ever be published from this edge node.  
If any NDATA messages arrive at\n * MQTT Engine that were not included in the NBIRTH, MQTT Engine will request a Rebirth from the device.\n */\nvoid publish_node_birth(struct mosquitto *mosq) {\n    // Create the NBIRTH payload\n    org_eclipse_tahu_protobuf_Payload nbirth_payload;\n\n    // Initialize the sequence number for Sparkplug MQTT messages\n    // This must be zero on every NBIRTH publish\n    reset_sparkplug_sequence();\n    get_next_payload(&nbirth_payload);\n    nbirth_payload.uuid = strdup(\"MyUUID\");\n\n    // Add node control metrics\n    fprintf(stdout, \"Adding metric: 'Node Control/Next Server'\\n\");\n    bool next_server_value = false;\n    add_simple_metric(&nbirth_payload, \"Node Control/Next Server\", true, Next_Server, METRIC_DATA_TYPE_BOOLEAN, false, false, &next_server_value, sizeof(next_server_value));\n    fprintf(stdout, \"Adding metric: 'Node Control/Rebirth'\\n\");\n    bool rebirth_value = false;\n    add_simple_metric(&nbirth_payload, \"Node Control/Rebirth\", true, Rebirth, METRIC_DATA_TYPE_BOOLEAN, false, false, &rebirth_value, sizeof(rebirth_value));\n    fprintf(stdout, \"Adding metric: 'Node Control/Reboot'\\n\");\n    bool reboot_value = false;\n    add_simple_metric(&nbirth_payload, \"Node Control/Reboot\", true, Reboot, METRIC_DATA_TYPE_BOOLEAN, false, false, &reboot_value, sizeof(reboot_value));\n\n    // Add some regular node metrics\n    fprintf(stdout, \"Adding metric: 'Node Metric0'\\n\");\n    char nbirth_metric_zero_value[] = \"hello node\";\n    add_simple_metric(&nbirth_payload, \"Node Metric0\", true, Node_Metric0, METRIC_DATA_TYPE_STRING, false, false, &nbirth_metric_zero_value, sizeof(nbirth_metric_zero_value));\n    fprintf(stdout, \"Adding metric: 'Node Metric1'\\n\");\n    bool nbirth_metric_one_value = true;\n    add_simple_metric(&nbirth_payload, \"Node Metric1\", true, Node_Metric1, METRIC_DATA_TYPE_BOOLEAN, false, false, &nbirth_metric_one_value, sizeof(nbirth_metric_one_value));\n\n    // Create a 
DataSet\n    org_eclipse_tahu_protobuf_Payload_DataSet dataset = org_eclipse_tahu_protobuf_Payload_DataSet_init_default;\n    uint32_t datatypes[] = { DATA_SET_DATA_TYPE_INT8,\n        DATA_SET_DATA_TYPE_INT16,\n        DATA_SET_DATA_TYPE_INT32 };\n    const char *column_keys[] = { \"Int8s\",\n        \"Int16s\",\n        \"Int32s\" };\n    org_eclipse_tahu_protobuf_Payload_DataSet_Row *row_data = (org_eclipse_tahu_protobuf_Payload_DataSet_Row *)\n        calloc(2, sizeof(org_eclipse_tahu_protobuf_Payload_DataSet_Row));\n    row_data[0].elements_count = 3;\n    row_data[0].elements = (org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue *)\n        calloc(3, sizeof(org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue));\n    row_data[0].elements[0].which_value = org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag;\n    row_data[0].elements[0].value.int_value = 0;\n    row_data[0].elements[1].which_value = org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag;\n    row_data[0].elements[1].value.int_value = 1;\n    row_data[0].elements[2].which_value = org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag;\n    row_data[0].elements[2].value.int_value = 2;\n    row_data[1].elements_count = 3;\n    row_data[1].elements = (org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue *)\n        calloc(3, sizeof(org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue));\n    row_data[1].elements[0].which_value = org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag;\n    row_data[1].elements[0].value.int_value = 3;\n    row_data[1].elements[1].which_value = org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag;\n    row_data[1].elements[1].value.int_value = 4;\n    row_data[1].elements[2].which_value = org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_int_value_tag;\n    row_data[1].elements[2].value.int_value = 5;\n    init_dataset(&dataset, 2, 3, datatypes, column_keys, row_data);\n\n    // 
Create a Metric with the DataSet value and add it to the payload\n    fprintf(stdout, \"Adding metric: 'DataSet'\\n\");\n    org_eclipse_tahu_protobuf_Payload_Metric dataset_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    init_metric(&dataset_metric, \"DataSet\", true, Dataset, METRIC_DATA_TYPE_DATASET, false, false, &dataset, sizeof(dataset));\n    add_metric_to_payload(&nbirth_payload, &dataset_metric);\n\n    // Add a metric with a custom property\n    fprintf(stdout, \"Adding metric: 'Node Metric2'\\n\");\n    org_eclipse_tahu_protobuf_Payload_Metric prop_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t nbirth_metric_two_value = 13;\n    init_metric(&prop_metric, \"Node Metric2\", true, Node_Metric2, METRIC_DATA_TYPE_INT16, false, false, &nbirth_metric_two_value, sizeof(nbirth_metric_two_value));\n    org_eclipse_tahu_protobuf_Payload_PropertySet properties = org_eclipse_tahu_protobuf_Payload_PropertySet_init_default;\n    add_property_to_set(&properties, \"engUnit\", PROPERTY_DATA_TYPE_STRING, \"MyCustomUnits\", sizeof(\"MyCustomUnits\"));\n    add_propertyset_to_metric(&prop_metric, &properties);\n    add_metric_to_payload(&nbirth_payload, &prop_metric);\n\n    // Create a metric called RPMs which is a member of the UDT definition - note aliases do not apply to UDT members\n    org_eclipse_tahu_protobuf_Payload_Metric rpms_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t rpms_value = 0;\n    init_metric(&rpms_metric, \"RPMs\", false, 0, METRIC_DATA_TYPE_INT32, false, false, &rpms_value, sizeof(rpms_value));\n\n    // Create a metric called AMPs which is a member of the UDT definition - note aliases do not apply to UDT members\n    org_eclipse_tahu_protobuf_Payload_Metric amps_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t amps_value = 0;\n    init_metric(&amps_metric, \"AMPs\", false, 0, METRIC_DATA_TYPE_INT32, false, false, &amps_value, 
sizeof(amps_value));\n\n    // Create a Template/UDT Parameter - this is purely for example of including parameters and is not actually used by UDT instances\n    org_eclipse_tahu_protobuf_Payload_Template_Parameter parameter = org_eclipse_tahu_protobuf_Payload_Template_Parameter_init_default;\n    parameter.name = strdup(\"Index\");\n    parameter.has_type = true;\n    parameter.type = PARAMETER_DATA_TYPE_STRING;\n    parameter.which_value = org_eclipse_tahu_protobuf_Payload_Template_Parameter_string_value_tag;\n    parameter.value.string_value = strdup(\"0\");\n\n    // Create the UDT definition value which includes the UDT members and parameters\n    org_eclipse_tahu_protobuf_Payload_Template udt_template = org_eclipse_tahu_protobuf_Payload_Template_init_default;\n    udt_template.metrics_count = 2;\n    udt_template.metrics = (org_eclipse_tahu_protobuf_Payload_Metric *)calloc(2, sizeof(org_eclipse_tahu_protobuf_Payload_Metric));\n    udt_template.metrics[0] = rpms_metric;\n    udt_template.metrics[1] = amps_metric;\n    udt_template.parameters_count = 1;\n    udt_template.parameters = (org_eclipse_tahu_protobuf_Payload_Template_Parameter *)calloc(1, sizeof(org_eclipse_tahu_protobuf_Payload_Template_Parameter));\n    udt_template.parameters[0] = parameter;\n    udt_template.template_ref = NULL;\n    udt_template.has_is_definition = true;\n    udt_template.is_definition = true;\n\n    // Create the root UDT definition and add the UDT definition value which includes the UDT members and parameters\n    org_eclipse_tahu_protobuf_Payload_Metric metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    init_metric(&metric, \"_types_/Custom_Motor\", false, 0, METRIC_DATA_TYPE_TEMPLATE, false, false, &udt_template, sizeof(udt_template));\n\n    // Add the UDT to the payload\n    add_metric_to_payload(&nbirth_payload, &metric);\n\n#ifdef SPARKPLUG_DEBUG\n    // Print the payload for debug\n    print_payload(&nbirth_payload);\n#endif\n\n    // Encode the 
payload into a binary format so it can be published in the MQTT message.\n    // The binary_buffer must be large enough to hold the contents of the binary payload\n    size_t buffer_length = 1024;\n    uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n    size_t message_length = encode_payload(binary_buffer, buffer_length, &nbirth_payload);\n\n    // Publish the NBIRTH on the appropriate topic\n    mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/NBIRTH/C Edge Node 1\", message_length, binary_buffer, 0, false);\n\n    // Free the memory\n    free(binary_buffer);\n    free(row_data);\n    free_payload(&nbirth_payload);\n}\n\nvoid publish_device_birth(struct mosquitto *mosq) {\n    // Create the DBIRTH payload\n    org_eclipse_tahu_protobuf_Payload dbirth_payload;\n    get_next_payload(&dbirth_payload);\n\n    // Add some device metrics\n    fprintf(stdout, \"Adding metric: 'input/Device Metric0'\\n\");\n    char dbirth_metric_zero_value[] = \"hello device\";\n    add_simple_metric(&dbirth_payload, \"input/Device Metric0\", true, Device_Metric0, METRIC_DATA_TYPE_STRING, false, false, &dbirth_metric_zero_value, sizeof(dbirth_metric_zero_value));\n    fprintf(stdout, \"Adding metric: 'input/Device Metric1'\\n\");\n    bool dbirth_metric_one_value = true;\n    add_simple_metric(&dbirth_payload, \"input/Device Metric1\", true, Device_Metric1, METRIC_DATA_TYPE_BOOLEAN, false, false, &dbirth_metric_one_value, sizeof(dbirth_metric_one_value));\n    fprintf(stdout, \"Adding metric: 'output/Device Metric2'\\n\");\n    uint32_t dbirth_metric_two_value = 16;\n    add_simple_metric(&dbirth_payload, \"output/Device Metric2\", true, Device_Metric2, METRIC_DATA_TYPE_INT16, false, false, &dbirth_metric_two_value, sizeof(dbirth_metric_two_value));\n    fprintf(stdout, \"Adding metric: 'output/Device Metric3'\\n\");\n    bool dbirth_metric_three_value = true;\n    add_simple_metric(&dbirth_payload, \"output/Device Metric3\", true, Device_Metric3, 
METRIC_DATA_TYPE_BOOLEAN, false, false, &dbirth_metric_three_value, sizeof(dbirth_metric_three_value));\n\n    // Create a metric called RPMs for the UDT instance\n    org_eclipse_tahu_protobuf_Payload_Metric rpms_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t rpms_value = 123;\n    init_metric(&rpms_metric, \"RPMs\", false, 0, METRIC_DATA_TYPE_INT32, false, false, &rpms_value, sizeof(rpms_value));\n\n    // Create a metric called AMPs for the UDT instance and create a custom property (milliamps) for it\n    org_eclipse_tahu_protobuf_Payload_Metric amps_metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    uint32_t amps_value = 456;\n    init_metric(&amps_metric, \"AMPs\", false, 0, METRIC_DATA_TYPE_INT32, false, false, &amps_value, sizeof(amps_value));\n    org_eclipse_tahu_protobuf_Payload_PropertySet properties = org_eclipse_tahu_protobuf_Payload_PropertySet_init_default;\n    add_property_to_set(&properties, \"engUnit\", PROPERTY_DATA_TYPE_STRING, \"milliamps\", sizeof(\"milliamps\"));\n    add_propertyset_to_metric(&amps_metric, &properties);\n\n    // Create a Template/UDT instance Parameter - this is purely for example of including parameters and is not actually used by UDT instances\n    org_eclipse_tahu_protobuf_Payload_Template_Parameter parameter = org_eclipse_tahu_protobuf_Payload_Template_Parameter_init_default;\n    parameter.name = strdup(\"Index\");\n    parameter.has_type = true;\n    parameter.type = PARAMETER_DATA_TYPE_STRING;\n    parameter.which_value = org_eclipse_tahu_protobuf_Payload_Template_Parameter_string_value_tag;\n    parameter.value.string_value = strdup(\"1\");\n\n    // Create the UDT instance value which includes the UDT members and parameters\n    org_eclipse_tahu_protobuf_Payload_Template udt_template = org_eclipse_tahu_protobuf_Payload_Template_init_default;\n    udt_template.version = NULL;\n    udt_template.metrics_count = 2;\n    udt_template.metrics = 
(org_eclipse_tahu_protobuf_Payload_Metric *)calloc(2, sizeof(org_eclipse_tahu_protobuf_Payload_Metric));\n    udt_template.metrics[0] = rpms_metric;\n    udt_template.metrics[1] = amps_metric;\n    udt_template.parameters_count = 1;\n    udt_template.parameters = (org_eclipse_tahu_protobuf_Payload_Template_Parameter *)calloc(1, sizeof(org_eclipse_tahu_protobuf_Payload_Template_Parameter));\n    udt_template.parameters[0] = parameter;\n    udt_template.template_ref = strdup(\"Custom_Motor\");\n    udt_template.has_is_definition = true;\n    udt_template.is_definition = false;\n\n    // Create the root UDT instance and add the UDT instance value\n    org_eclipse_tahu_protobuf_Payload_Metric metric = org_eclipse_tahu_protobuf_Payload_Metric_init_default;\n    init_metric(&metric, \"My_Custom_Motor\", true, My_Custom_Motor, METRIC_DATA_TYPE_TEMPLATE, false, false, &udt_template, sizeof(udt_template));\n\n    // Add the UDT Instance to the payload\n    add_metric_to_payload(&dbirth_payload, &metric);\n\n#ifdef SPARKPLUG_DEBUG\n    // Print the payload\n    print_payload(&dbirth_payload);\n#endif\n\n    // Encode the payload into a binary format so it can be published in the MQTT message.\n    // The binary_buffer must be large enough to hold the contents of the binary payload\n    size_t buffer_length = 1024;\n    uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n    size_t message_length = encode_payload(binary_buffer, buffer_length, &dbirth_payload);\n\n    // Publish the DBIRTH on the appropriate topic\n    mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DBIRTH/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n    // Free the memory\n    free(binary_buffer);\n    free_payload(&dbirth_payload);\n}\n\nvoid publish_ddata_message(struct mosquitto *mosq) {\n    // Create the DDATA payload\n    org_eclipse_tahu_protobuf_Payload ddata_payload;\n    get_next_payload(&ddata_payload);\n\n    // Add some device 
metrics to denote changed values on inputs\n    fprintf(stdout, \"Adding metric: 'input/Device Metric0'\\n\");\n    char ddata_metric_zero_value[13];\n    int i;\n    for (i = 0; i < 12; ++i) {\n        ddata_metric_zero_value[i] = '0' + rand() % 72; // random printable ASCII character from '0' to 'w'\n    }\n    ddata_metric_zero_value[12] = 0;\n\n    // Note the Metric name 'input/Device Metric0' is not needed because we're using aliases\n    add_simple_metric(&ddata_payload, NULL, true, Device_Metric0, METRIC_DATA_TYPE_STRING, false, false, &ddata_metric_zero_value, sizeof(ddata_metric_zero_value));\n    fprintf(stdout, \"Adding metric: 'input/Device Metric1'\\n\");\n    bool ddata_metric_one_value = rand() % 2;\n    // Note the Metric name 'input/Device Metric1' is not needed because we're using aliases\n    add_simple_metric(&ddata_payload, NULL, true, Device_Metric1, METRIC_DATA_TYPE_BOOLEAN, false, false, &ddata_metric_one_value, sizeof(ddata_metric_one_value));\n\n#ifdef SPARKPLUG_DEBUG\n    // Print the payload\n    print_payload(&ddata_payload);\n#endif\n\n    // Encode the payload into a binary format so it can be published in the MQTT message.\n    // The binary_buffer must be large enough to hold the contents of the binary payload\n    size_t buffer_length = 1024;\n    uint8_t *binary_buffer = (uint8_t *)malloc(buffer_length * sizeof(uint8_t));\n    size_t message_length = encode_payload(binary_buffer, buffer_length, &ddata_payload);\n\n    // Publish the DDATA on the appropriate topic\n    mosquitto_publish(mosq, NULL, \"spBv1.0/Sparkplug B Devices/DDATA/C Edge Node 1/Emulated Device\", message_length, binary_buffer, 0, false);\n\n    // Free the memory\n    free(binary_buffer);\n    free_payload(&ddata_payload);\n}\n"
  },
  {
    "path": "c_sharp/core/readme.txt",
    "content": "# To generate the base protobuf sparkplug_b Java library\nprotoc --proto_path=../../ --csharp_out=src --csharp_opt=base_namespace=Org.Eclipse.Tahu.Protobuf ../../sparkplug_b/sparkplug_b_c_sharp.proto\n"
  },
  {
    "path": "c_sharp/core/src/SparkplugBCSharp.cs",
    "content": "// Generated by the protocol buffer compiler.  DO NOT EDIT!\n// source: sparkplug_b/sparkplug_b_c_sharp.proto\n#pragma warning disable 1591, 0612, 3021\n#region Designer generated code\n\nusing pb = global::Google.Protobuf;\nusing pbc = global::Google.Protobuf.Collections;\nusing pbr = global::Google.Protobuf.Reflection;\nusing scg = global::System.Collections.Generic;\nnamespace Org.Eclipse.Tahu.Protobuf {\n\n  /// <summary>Holder for reflection information generated from sparkplug_b/sparkplug_b_c_sharp.proto</summary>\n  public static partial class SparkplugBCSharpReflection {\n\n    #region Descriptor\n    /// <summary>File descriptor for sparkplug_b/sparkplug_b_c_sharp.proto</summary>\n    public static pbr::FileDescriptor Descriptor {\n      get { return descriptor; }\n    }\n    private static pbr::FileDescriptor descriptor;\n\n    static SparkplugBCSharpReflection() {\n      byte[] descriptorData = global::System.Convert.FromBase64String(\n          string.Concat(\n            \"CiVzcGFya3BsdWdfYi9zcGFya3BsdWdfYl9jX3NoYXJwLnByb3RvEhlvcmcu\",\n            \"ZWNsaXBzZS50YWh1LnByb3RvYnVmGhlnb29nbGUvcHJvdG9idWYvYW55LnBy\",\n            \"b3RvIrEYCgdQYXlsb2FkEhEKCXRpbWVzdGFtcBgBIAEoBBI6CgdtZXRyaWNz\",\n            \"GAIgAygLMikub3JnLmVjbGlwc2UudGFodS5wcm90b2J1Zi5QYXlsb2FkLk1l\",\n            \"dHJpYxILCgNzZXEYAyABKAQSDAoEdXVpZBgEIAEoCRIMCgRib2R5GAUgASgM\",\n            \"EiUKB2RldGFpbHMYBiADKAsyFC5nb29nbGUucHJvdG9idWYuQW55GuMECghU\",\n            \"ZW1wbGF0ZRIPCgd2ZXJzaW9uGAEgASgJEjoKB21ldHJpY3MYAiADKAsyKS5v\",\n            \"cmcuZWNsaXBzZS50YWh1LnByb3RvYnVmLlBheWxvYWQuTWV0cmljEkkKCnBh\",\n            \"cmFtZXRlcnMYAyADKAsyNS5vcmcuZWNsaXBzZS50YWh1LnByb3RvYnVmLlBh\",\n            \"eWxvYWQuVGVtcGxhdGUuUGFyYW1ldGVyEhQKDHRlbXBsYXRlX3JlZhgEIAEo\",\n            \"CRIVCg1pc19kZWZpbml0aW9uGAUgASgIEiUKB2RldGFpbHMYBiADKAsyFC5n\",\n            \"b29nbGUucHJvdG9idWYuQW55GuoCCglQYXJhbWV0ZXISDAoEbmFtZRgBIAEo\",\n            
\"CRIMCgR0eXBlGAIgASgNEhMKCWludF92YWx1ZRgDIAEoDUgAEhQKCmxvbmdf\",\n            \"dmFsdWUYBCABKARIABIVCgtmbG9hdF92YWx1ZRgFIAEoAkgAEhYKDGRvdWJs\",\n            \"ZV92YWx1ZRgGIAEoAUgAEhcKDWJvb2xlYW5fdmFsdWUYByABKAhIABIWCgxz\",\n            \"dHJpbmdfdmFsdWUYCCABKAlIABJoCg9leHRlbnNpb25fdmFsdWUYCSABKAsy\",\n            \"TS5vcmcuZWNsaXBzZS50YWh1LnByb3RvYnVmLlBheWxvYWQuVGVtcGxhdGUu\",\n            \"UGFyYW1ldGVyLlBhcmFtZXRlclZhbHVlRXh0ZW5zaW9uSAAaQwoXUGFyYW1l\",\n            \"dGVyVmFsdWVFeHRlbnNpb24SKAoKZXh0ZW5zaW9ucxgBIAMoCzIULmdvb2ds\",\n            \"ZS5wcm90b2J1Zi5BbnlCBwoFdmFsdWUa7gQKB0RhdGFTZXQSFgoObnVtX29m\",\n            \"X2NvbHVtbnMYASABKAQSDwoHY29sdW1ucxgCIAMoCRINCgV0eXBlcxgDIAMo\",\n            \"DRI8CgRyb3dzGAQgAygLMi4ub3JnLmVjbGlwc2UudGFodS5wcm90b2J1Zi5Q\",\n            \"YXlsb2FkLkRhdGFTZXQuUm93EiUKB2RldGFpbHMYBSADKAsyFC5nb29nbGUu\",\n            \"cHJvdG9idWYuQW55GswCCgxEYXRhU2V0VmFsdWUSEwoJaW50X3ZhbHVlGAEg\",\n            \"ASgNSAASFAoKbG9uZ192YWx1ZRgCIAEoBEgAEhUKC2Zsb2F0X3ZhbHVlGAMg\",\n            \"ASgCSAASFgoMZG91YmxlX3ZhbHVlGAQgASgBSAASFwoNYm9vbGVhbl92YWx1\",\n            \"ZRgFIAEoCEgAEhYKDHN0cmluZ192YWx1ZRgGIAEoCUgAEmgKD2V4dGVuc2lv\",\n            \"bl92YWx1ZRgHIAEoCzJNLm9yZy5lY2xpcHNlLnRhaHUucHJvdG9idWYuUGF5\",\n            \"bG9hZC5EYXRhU2V0LkRhdGFTZXRWYWx1ZS5EYXRhU2V0VmFsdWVFeHRlbnNp\",\n            \"b25IABo+ChVEYXRhU2V0VmFsdWVFeHRlbnNpb24SJQoHZGV0YWlscxgBIAMo\",\n            \"CzIULmdvb2dsZS5wcm90b2J1Zi5BbnlCBwoFdmFsdWUadwoDUm93EkkKCGVs\",\n            \"ZW1lbnRzGAEgAygLMjcub3JnLmVjbGlwc2UudGFodS5wcm90b2J1Zi5QYXls\",\n            \"b2FkLkRhdGFTZXQuRGF0YVNldFZhbHVlEiUKB2RldGFpbHMYAiADKAsyFC5n\",\n            \"b29nbGUucHJvdG9idWYuQW55GoYECg1Qcm9wZXJ0eVZhbHVlEgwKBHR5cGUY\",\n            \"ASABKA0SDwoHaXNfbnVsbBgCIAEoCBITCglpbnRfdmFsdWUYAyABKA1IABIU\",\n            \"Cgpsb25nX3ZhbHVlGAQgASgESAASFQoLZmxvYXRfdmFsdWUYBSABKAJIABIW\",\n            \"Cgxkb3VibGVfdmFsdWUYBiABKAFIABIXCg1ib29sZWFuX3ZhbHVlGAcgASgI\",\n            
\"SAASFgoMc3RyaW5nX3ZhbHVlGAggASgJSAASSwoRcHJvcGVydHlzZXRfdmFs\",\n            \"dWUYCSABKAsyLi5vcmcuZWNsaXBzZS50YWh1LnByb3RvYnVmLlBheWxvYWQu\",\n            \"UHJvcGVydHlTZXRIABJQChJwcm9wZXJ0eXNldHNfdmFsdWUYCiABKAsyMi5v\",\n            \"cmcuZWNsaXBzZS50YWh1LnByb3RvYnVmLlBheWxvYWQuUHJvcGVydHlTZXRM\",\n            \"aXN0SAASYgoPZXh0ZW5zaW9uX3ZhbHVlGAsgASgLMkcub3JnLmVjbGlwc2Uu\",\n            \"dGFodS5wcm90b2J1Zi5QYXlsb2FkLlByb3BlcnR5VmFsdWUuUHJvcGVydHlW\",\n            \"YWx1ZUV4dGVuc2lvbkgAGj8KFlByb3BlcnR5VmFsdWVFeHRlbnNpb24SJQoH\",\n            \"ZGV0YWlscxgBIAMoCzIULmdvb2dsZS5wcm90b2J1Zi5BbnlCBwoFdmFsdWUa\",\n            \"hAEKC1Byb3BlcnR5U2V0EgwKBGtleXMYASADKAkSQAoGdmFsdWVzGAIgAygL\",\n            \"MjAub3JnLmVjbGlwc2UudGFodS5wcm90b2J1Zi5QYXlsb2FkLlByb3BlcnR5\",\n            \"VmFsdWUSJQoHZGV0YWlscxgDIAMoCzIULmdvb2dsZS5wcm90b2J1Zi5Bbnka\",\n            \"fQoPUHJvcGVydHlTZXRMaXN0EkMKC3Byb3BlcnR5c2V0GAEgAygLMi4ub3Jn\",\n            \"LmVjbGlwc2UudGFodS5wcm90b2J1Zi5QYXlsb2FkLlByb3BlcnR5U2V0EiUK\",\n            \"B2RldGFpbHMYAiADKAsyFC5nb29nbGUucHJvdG9idWYuQW55GsEBCghNZXRh\",\n            \"RGF0YRIVCg1pc19tdWx0aV9wYXJ0GAEgASgIEhQKDGNvbnRlbnRfdHlwZRgC\",\n            \"IAEoCRIMCgRzaXplGAMgASgEEgsKA3NlcRgEIAEoBBIRCglmaWxlX25hbWUY\",\n            \"BSABKAkSEQoJZmlsZV90eXBlGAYgASgJEgsKA21kNRgHIAEoCRITCgtkZXNj\",\n            \"cmlwdGlvbhgIIAEoCRIlCgdkZXRhaWxzGAkgAygLMhQuZ29vZ2xlLnByb3Rv\",\n            \"YnVmLkFueRrcBQoGTWV0cmljEgwKBG5hbWUYASABKAkSDQoFYWxpYXMYAiAB\",\n            \"KAQSEQoJdGltZXN0YW1wGAMgASgEEhAKCGRhdGF0eXBlGAQgASgNEhUKDWlz\",\n            \"X2hpc3RvcmljYWwYBSABKAgSFAoMaXNfdHJhbnNpZW50GAYgASgIEg8KB2lz\",\n            \"X251bGwYByABKAgSPQoIbWV0YWRhdGEYCCABKAsyKy5vcmcuZWNsaXBzZS50\",\n            \"YWh1LnByb3RvYnVmLlBheWxvYWQuTWV0YURhdGESQgoKcHJvcGVydGllcxgJ\",\n            \"IAEoCzIuLm9yZy5lY2xpcHNlLnRhaHUucHJvdG9idWYuUGF5bG9hZC5Qcm9w\",\n            \"ZXJ0eVNldBITCglpbnRfdmFsdWUYCiABKA1IABIUCgpsb25nX3ZhbHVlGAsg\",\n            
\"ASgESAASFQoLZmxvYXRfdmFsdWUYDCABKAJIABIWCgxkb3VibGVfdmFsdWUY\",\n            \"DSABKAFIABIXCg1ib29sZWFuX3ZhbHVlGA4gASgISAASFgoMc3RyaW5nX3Zh\",\n            \"bHVlGA8gASgJSAASFQoLYnl0ZXNfdmFsdWUYECABKAxIABJDCg1kYXRhc2V0\",\n            \"X3ZhbHVlGBEgASgLMioub3JnLmVjbGlwc2UudGFodS5wcm90b2J1Zi5QYXls\",\n            \"b2FkLkRhdGFTZXRIABJFCg50ZW1wbGF0ZV92YWx1ZRgSIAEoCzIrLm9yZy5l\",\n            \"Y2xpcHNlLnRhaHUucHJvdG9idWYuUGF5bG9hZC5UZW1wbGF0ZUgAElkKD2V4\",\n            \"dGVuc2lvbl92YWx1ZRgTIAEoCzI+Lm9yZy5lY2xpcHNlLnRhaHUucHJvdG9i\",\n            \"dWYuUGF5bG9hZC5NZXRyaWMuTWV0cmljVmFsdWVFeHRlbnNpb25IABo9ChRN\",\n            \"ZXRyaWNWYWx1ZUV4dGVuc2lvbhIlCgdkZXRhaWxzGAEgAygLMhQuZ29vZ2xl\",\n            \"LnByb3RvYnVmLkFueUIHCgV2YWx1ZUIsChlvcmcuZWNsaXBzZS50YWh1LnBy\",\n            \"b3RvYnVmQg9TcGFya3BsdWdCUHJvdG9iBnByb3RvMw==\"));\n      descriptor = pbr::FileDescriptor.FromGeneratedCode(descriptorData,\n          new pbr::FileDescriptor[] { global::Google.Protobuf.WellKnownTypes.AnyReflection.Descriptor, },\n          new pbr::GeneratedClrTypeInfo(null, new pbr::GeneratedClrTypeInfo[] {\n            new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload), global::Org.Eclipse.Tahu.Protobuf.Payload.Parser, new[]{ \"Timestamp\", \"Metrics\", \"Seq\", \"Uuid\", \"Body\", \"Details\" }, null, null, new pbr::GeneratedClrTypeInfo[] { new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Parser, new[]{ \"Version\", \"Metrics\", \"Parameters\", \"TemplateRef\", \"IsDefinition\", \"Details\" }, null, null, new pbr::GeneratedClrTypeInfo[] { new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter.Parser, new[]{ \"Name\", \"Type\", \"IntValue\", \"LongValue\", \"FloatValue\", \"DoubleValue\", \"BooleanValue\", \"StringValue\", 
\"ExtensionValue\" }, new[]{ \"Value\" }, null, new pbr::GeneratedClrTypeInfo[] { new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter.Types.ParameterValueExtension), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter.Types.ParameterValueExtension.Parser, new[]{ \"Extensions\" }, null, null, null)})}),\n            new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Parser, new[]{ \"NumOfColumns\", \"Columns\", \"Types_\", \"Rows\", \"Details\" }, null, null, new pbr::GeneratedClrTypeInfo[] { new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue.Parser, new[]{ \"IntValue\", \"LongValue\", \"FloatValue\", \"DoubleValue\", \"BooleanValue\", \"StringValue\", \"ExtensionValue\" }, new[]{ \"Value\" }, null, new pbr::GeneratedClrTypeInfo[] { new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue.Types.DataSetValueExtension), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue.Types.DataSetValueExtension.Parser, new[]{ \"Details\" }, null, null, null)}),\n            new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.Row), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.Row.Parser, new[]{ \"Elements\", \"Details\" }, null, null, null)}),\n            new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue.Parser, new[]{ \"Type\", \"IsNull\", \"IntValue\", \"LongValue\", \"FloatValue\", \"DoubleValue\", \"BooleanValue\", \"StringValue\", \"PropertysetValue\", \"PropertysetsValue\", \"ExtensionValue\" }, new[]{ \"Value\" 
}, null, new pbr::GeneratedClrTypeInfo[] { new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue.Types.PropertyValueExtension), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue.Types.PropertyValueExtension.Parser, new[]{ \"Details\" }, null, null, null)}),\n            new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet.Parser, new[]{ \"Keys\", \"Values\", \"Details\" }, null, null, null),\n            new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySetList), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySetList.Parser, new[]{ \"Propertyset\", \"Details\" }, null, null, null),\n            new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.MetaData), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.MetaData.Parser, new[]{ \"IsMultiPart\", \"ContentType\", \"Size\", \"Seq\", \"FileName\", \"FileType\", \"Md5\", \"Description\", \"Details\" }, null, null, null),\n            new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric.Parser, new[]{ \"Name\", \"Alias\", \"Timestamp\", \"Datatype\", \"IsHistorical\", \"IsTransient\", \"IsNull\", \"Metadata\", \"Properties\", \"IntValue\", \"LongValue\", \"FloatValue\", \"DoubleValue\", \"BooleanValue\", \"StringValue\", \"BytesValue\", \"DatasetValue\", \"TemplateValue\", \"ExtensionValue\" }, new[]{ \"Value\" }, null, new pbr::GeneratedClrTypeInfo[] { new pbr::GeneratedClrTypeInfo(typeof(global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric.Types.MetricValueExtension), global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric.Types.MetricValueExtension.Parser, new[]{ \"Details\" }, null, null, null)})})\n          }));\n    }\n    #endregion\n\n  }\n  #region Messages\n  /// 
<summary>\n  ///\n  ///// Indexes of Data Types\n  ///// Unknown placeholder for future expansion.\n  ///Unknown         = 0;\n  ///// Basic Types\n  ///Int8            = 1;\n  ///Int16           = 2;\n  ///Int32           = 3;\n  ///Int64           = 4;\n  ///UInt8           = 5;\n  ///UInt16          = 6;\n  ///UInt32          = 7;\n  ///UInt64          = 8;\n  ///Float           = 9;\n  ///Double          = 10;\n  ///Boolean         = 11;\n  ///String          = 12;\n  ///DateTime        = 13;\n  ///Text            = 14;\n  ///// Additional Metric Types\n  ///UUID            = 15;\n  ///DataSet         = 16;\n  ///Bytes           = 17;\n  ///File            = 18;\n  ///Template        = 19;\n  ///// Additional PropertyValue Types\n  ///PropertySet     = 20;\n  ///PropertySetList = 21;\n  /// </summary>\n  public sealed partial class Payload : pb::IMessage<Payload> {\n    private static readonly pb::MessageParser<Payload> _parser = new pb::MessageParser<Payload>(() => new Payload());\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public static pb::MessageParser<Payload> Parser { get { return _parser; } }\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public static pbr::MessageDescriptor Descriptor {\n      get { return global::Org.Eclipse.Tahu.Protobuf.SparkplugBCSharpReflection.Descriptor.MessageTypes[0]; }\n    }\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    pbr::MessageDescriptor pb::IMessage.Descriptor {\n      get { return Descriptor; }\n    }\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public Payload() {\n      OnConstruction();\n    }\n\n    partial void OnConstruction();\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public Payload(Payload other) : this() {\n      timestamp_ = other.timestamp_;\n      metrics_ = other.metrics_.Clone();\n      seq_ = other.seq_;\n      uuid_ = other.uuid_;\n      body_ = other.body_;\n      details_ = 
other.details_.Clone();\n    }\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public Payload Clone() {\n      return new Payload(this);\n    }\n\n    /// <summary>Field number for the \"timestamp\" field.</summary>\n    public const int TimestampFieldNumber = 1;\n    private ulong timestamp_;\n    /// <summary>\n    /// Timestamp at message sending time\n    /// </summary>\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public ulong Timestamp {\n      get { return timestamp_; }\n      set {\n        timestamp_ = value;\n      }\n    }\n\n    /// <summary>Field number for the \"metrics\" field.</summary>\n    public const int MetricsFieldNumber = 2;\n    private static readonly pb::FieldCodec<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric> _repeated_metrics_codec\n        = pb::FieldCodec.ForMessage(18, global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric.Parser);\n    private readonly pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric> metrics_ = new pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric>();\n    /// <summary>\n    /// Repeated forever - no limit in Google Protobufs\n    /// </summary>\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric> Metrics {\n      get { return metrics_; }\n    }\n\n    /// <summary>Field number for the \"seq\" field.</summary>\n    public const int SeqFieldNumber = 3;\n    private ulong seq_;\n    /// <summary>\n    /// Sequence number\n    /// </summary>\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public ulong Seq {\n      get { return seq_; }\n      set {\n        seq_ = value;\n      }\n    }\n\n    /// <summary>Field number for the \"uuid\" field.</summary>\n    public const int UuidFieldNumber = 4;\n    private string uuid_ = \"\";\n    /// <summary>\n    /// UUID to track message type in terms of schema 
definitions\n    /// </summary>\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public string Uuid {\n      get { return uuid_; }\n      set {\n        uuid_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n      }\n    }\n\n    /// <summary>Field number for the \"body\" field.</summary>\n    public const int BodyFieldNumber = 5;\n    private pb::ByteString body_ = pb::ByteString.Empty;\n    /// <summary>\n    /// To optionally bypass the whole definition above\n    /// </summary>\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public pb::ByteString Body {\n      get { return body_; }\n      set {\n        body_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n      }\n    }\n\n    /// <summary>Field number for the \"details\" field.</summary>\n    public const int DetailsFieldNumber = 6;\n    private static readonly pb::FieldCodec<global::Google.Protobuf.WellKnownTypes.Any> _repeated_details_codec\n        = pb::FieldCodec.ForMessage(50, global::Google.Protobuf.WellKnownTypes.Any.Parser);\n    private readonly pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> details_ = new pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any>();\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> Details {\n      get { return details_; }\n    }\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public override bool Equals(object other) {\n      return Equals(other as Payload);\n    }\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public bool Equals(Payload other) {\n      if (ReferenceEquals(other, null)) {\n        return false;\n      }\n      if (ReferenceEquals(other, this)) {\n        return true;\n      }\n      if (Timestamp != other.Timestamp) return false;\n      if(!metrics_.Equals(other.metrics_)) return false;\n      if (Seq != other.Seq) return false;\n      if 
(Uuid != other.Uuid) return false;\n      if (Body != other.Body) return false;\n      if(!details_.Equals(other.details_)) return false;\n      return true;\n    }\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public override int GetHashCode() {\n      int hash = 1;\n      if (Timestamp != 0UL) hash ^= Timestamp.GetHashCode();\n      hash ^= metrics_.GetHashCode();\n      if (Seq != 0UL) hash ^= Seq.GetHashCode();\n      if (Uuid.Length != 0) hash ^= Uuid.GetHashCode();\n      if (Body.Length != 0) hash ^= Body.GetHashCode();\n      hash ^= details_.GetHashCode();\n      return hash;\n    }\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public override string ToString() {\n      return pb::JsonFormatter.ToDiagnosticString(this);\n    }\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public void WriteTo(pb::CodedOutputStream output) {\n      if (Timestamp != 0UL) {\n        output.WriteRawTag(8);\n        output.WriteUInt64(Timestamp);\n      }\n      metrics_.WriteTo(output, _repeated_metrics_codec);\n      if (Seq != 0UL) {\n        output.WriteRawTag(24);\n        output.WriteUInt64(Seq);\n      }\n      if (Uuid.Length != 0) {\n        output.WriteRawTag(34);\n        output.WriteString(Uuid);\n      }\n      if (Body.Length != 0) {\n        output.WriteRawTag(42);\n        output.WriteBytes(Body);\n      }\n      details_.WriteTo(output, _repeated_details_codec);\n    }\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public int CalculateSize() {\n      int size = 0;\n      if (Timestamp != 0UL) {\n        size += 1 + pb::CodedOutputStream.ComputeUInt64Size(Timestamp);\n      }\n      size += metrics_.CalculateSize(_repeated_metrics_codec);\n      if (Seq != 0UL) {\n        size += 1 + pb::CodedOutputStream.ComputeUInt64Size(Seq);\n      }\n      if (Uuid.Length != 0) {\n        size += 1 + pb::CodedOutputStream.ComputeStringSize(Uuid);\n      }\n      if (Body.Length != 
0) {\n        size += 1 + pb::CodedOutputStream.ComputeBytesSize(Body);\n      }\n      size += details_.CalculateSize(_repeated_details_codec);\n      return size;\n    }\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public void MergeFrom(Payload other) {\n      if (other == null) {\n        return;\n      }\n      if (other.Timestamp != 0UL) {\n        Timestamp = other.Timestamp;\n      }\n      metrics_.Add(other.metrics_);\n      if (other.Seq != 0UL) {\n        Seq = other.Seq;\n      }\n      if (other.Uuid.Length != 0) {\n        Uuid = other.Uuid;\n      }\n      if (other.Body.Length != 0) {\n        Body = other.Body;\n      }\n      details_.Add(other.details_);\n    }\n\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public void MergeFrom(pb::CodedInputStream input) {\n      uint tag;\n      while ((tag = input.ReadTag()) != 0) {\n        switch(tag) {\n          default:\n            input.SkipLastField();\n            break;\n          case 8: {\n            Timestamp = input.ReadUInt64();\n            break;\n          }\n          case 18: {\n            metrics_.AddEntriesFrom(input, _repeated_metrics_codec);\n            break;\n          }\n          case 24: {\n            Seq = input.ReadUInt64();\n            break;\n          }\n          case 34: {\n            Uuid = input.ReadString();\n            break;\n          }\n          case 42: {\n            Body = input.ReadBytes();\n            break;\n          }\n          case 50: {\n            details_.AddEntriesFrom(input, _repeated_details_codec);\n            break;\n          }\n        }\n      }\n    }\n\n    #region Nested types\n    /// <summary>Container for nested types declared in the Payload message type.</summary>\n    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n    public static partial class Types {\n      public sealed partial class Template : pb::IMessage<Template> {\n        private static readonly 
pb::MessageParser<Template> _parser = new pb::MessageParser<Template>(() => new Template());\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pb::MessageParser<Template> Parser { get { return _parser; } }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pbr::MessageDescriptor Descriptor {\n          get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Descriptor.NestedTypes[0]; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        pbr::MessageDescriptor pb::IMessage.Descriptor {\n          get { return Descriptor; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public Template() {\n          OnConstruction();\n        }\n\n        partial void OnConstruction();\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public Template(Template other) : this() {\n          version_ = other.version_;\n          metrics_ = other.metrics_.Clone();\n          parameters_ = other.parameters_.Clone();\n          templateRef_ = other.templateRef_;\n          isDefinition_ = other.isDefinition_;\n          details_ = other.details_.Clone();\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public Template Clone() {\n          return new Template(this);\n        }\n\n        /// <summary>Field number for the \"version\" field.</summary>\n        public const int VersionFieldNumber = 1;\n        private string version_ = \"\";\n        /// <summary>\n        /// The version of the Template to prevent mismatches\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public string Version {\n          get { return version_; }\n          set {\n            version_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n          }\n        }\n\n        /// <summary>Field number for the \"metrics\" 
field.</summary>\n        public const int MetricsFieldNumber = 2;\n        private static readonly pb::FieldCodec<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric> _repeated_metrics_codec\n            = pb::FieldCodec.ForMessage(18, global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric.Parser);\n        private readonly pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric> metrics_ = new pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric>();\n        /// <summary>\n        /// Each metric is the name of the metric and the datatype of the member but does not contain a value\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric> Metrics {\n          get { return metrics_; }\n        }\n\n        /// <summary>Field number for the \"parameters\" field.</summary>\n        public const int ParametersFieldNumber = 3;\n        private static readonly pb::FieldCodec<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter> _repeated_parameters_codec\n            = pb::FieldCodec.ForMessage(26, global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter.Parser);\n        private readonly pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter> parameters_ = new pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter>();\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter> Parameters {\n          get { return parameters_; }\n        }\n\n        /// <summary>Field number for the \"template_ref\" field.</summary>\n        public const int TemplateRefFieldNumber = 4;\n        private string templateRef_ = \"\";\n        /// <summary>\n        /// Reference to a template if this is 
extending a Template or an instance - must exist if an instance\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public string TemplateRef {\n          get { return templateRef_; }\n          set {\n            templateRef_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n          }\n        }\n\n        /// <summary>Field number for the \"is_definition\" field.</summary>\n        public const int IsDefinitionFieldNumber = 5;\n        private bool isDefinition_;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool IsDefinition {\n          get { return isDefinition_; }\n          set {\n            isDefinition_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"details\" field.</summary>\n        public const int DetailsFieldNumber = 6;\n        private static readonly pb::FieldCodec<global::Google.Protobuf.WellKnownTypes.Any> _repeated_details_codec\n            = pb::FieldCodec.ForMessage(50, global::Google.Protobuf.WellKnownTypes.Any.Parser);\n        private readonly pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> details_ = new pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any>();\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> Details {\n          get { return details_; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override bool Equals(object other) {\n          return Equals(other as Template);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool Equals(Template other) {\n          if (ReferenceEquals(other, null)) {\n            return false;\n          }\n          if (ReferenceEquals(other, this)) {\n            return true;\n          }\n          if (Version != other.Version) return false;\n          
if(!metrics_.Equals(other.metrics_)) return false;\n          if(!parameters_.Equals(other.parameters_)) return false;\n          if (TemplateRef != other.TemplateRef) return false;\n          if (IsDefinition != other.IsDefinition) return false;\n          if(!details_.Equals(other.details_)) return false;\n          return true;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override int GetHashCode() {\n          int hash = 1;\n          if (Version.Length != 0) hash ^= Version.GetHashCode();\n          hash ^= metrics_.GetHashCode();\n          hash ^= parameters_.GetHashCode();\n          if (TemplateRef.Length != 0) hash ^= TemplateRef.GetHashCode();\n          if (IsDefinition != false) hash ^= IsDefinition.GetHashCode();\n          hash ^= details_.GetHashCode();\n          return hash;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override string ToString() {\n          return pb::JsonFormatter.ToDiagnosticString(this);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void WriteTo(pb::CodedOutputStream output) {\n          if (Version.Length != 0) {\n            output.WriteRawTag(10);\n            output.WriteString(Version);\n          }\n          metrics_.WriteTo(output, _repeated_metrics_codec);\n          parameters_.WriteTo(output, _repeated_parameters_codec);\n          if (TemplateRef.Length != 0) {\n            output.WriteRawTag(34);\n            output.WriteString(TemplateRef);\n          }\n          if (IsDefinition != false) {\n            output.WriteRawTag(40);\n            output.WriteBool(IsDefinition);\n          }\n          details_.WriteTo(output, _repeated_details_codec);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public int CalculateSize() {\n          int size = 0;\n          if (Version.Length != 0) {\n            size += 1 + 
pb::CodedOutputStream.ComputeStringSize(Version);\n          }\n          size += metrics_.CalculateSize(_repeated_metrics_codec);\n          size += parameters_.CalculateSize(_repeated_parameters_codec);\n          if (TemplateRef.Length != 0) {\n            size += 1 + pb::CodedOutputStream.ComputeStringSize(TemplateRef);\n          }\n          if (IsDefinition != false) {\n            size += 1 + 1;\n          }\n          size += details_.CalculateSize(_repeated_details_codec);\n          return size;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(Template other) {\n          if (other == null) {\n            return;\n          }\n          if (other.Version.Length != 0) {\n            Version = other.Version;\n          }\n          metrics_.Add(other.metrics_);\n          parameters_.Add(other.parameters_);\n          if (other.TemplateRef.Length != 0) {\n            TemplateRef = other.TemplateRef;\n          }\n          if (other.IsDefinition != false) {\n            IsDefinition = other.IsDefinition;\n          }\n          details_.Add(other.details_);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(pb::CodedInputStream input) {\n          uint tag;\n          while ((tag = input.ReadTag()) != 0) {\n            switch(tag) {\n              default:\n                input.SkipLastField();\n                break;\n              case 10: {\n                Version = input.ReadString();\n                break;\n              }\n              case 18: {\n                metrics_.AddEntriesFrom(input, _repeated_metrics_codec);\n                break;\n              }\n              case 26: {\n                parameters_.AddEntriesFrom(input, _repeated_parameters_codec);\n                break;\n              }\n              case 34: {\n                TemplateRef = input.ReadString();\n                break;\n              }\n       
       case 40: {\n                IsDefinition = input.ReadBool();\n                break;\n              }\n              case 50: {\n                details_.AddEntriesFrom(input, _repeated_details_codec);\n                break;\n              }\n            }\n          }\n        }\n\n        #region Nested types\n        /// <summary>Container for nested types declared in the Template message type.</summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static partial class Types {\n          public sealed partial class Parameter : pb::IMessage<Parameter> {\n            private static readonly pb::MessageParser<Parameter> _parser = new pb::MessageParser<Parameter>(() => new Parameter());\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public static pb::MessageParser<Parameter> Parser { get { return _parser; } }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public static pbr::MessageDescriptor Descriptor {\n              get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Descriptor.NestedTypes[0]; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            pbr::MessageDescriptor pb::IMessage.Descriptor {\n              get { return Descriptor; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public Parameter() {\n              OnConstruction();\n            }\n\n            partial void OnConstruction();\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public Parameter(Parameter other) : this() {\n              name_ = other.name_;\n              type_ = other.type_;\n              switch (other.ValueCase) {\n                case ValueOneofCase.IntValue:\n                  IntValue = other.IntValue;\n                  break;\n                case ValueOneofCase.LongValue:\n                  
LongValue = other.LongValue;\n                  break;\n                case ValueOneofCase.FloatValue:\n                  FloatValue = other.FloatValue;\n                  break;\n                case ValueOneofCase.DoubleValue:\n                  DoubleValue = other.DoubleValue;\n                  break;\n                case ValueOneofCase.BooleanValue:\n                  BooleanValue = other.BooleanValue;\n                  break;\n                case ValueOneofCase.StringValue:\n                  StringValue = other.StringValue;\n                  break;\n                case ValueOneofCase.ExtensionValue:\n                  ExtensionValue = other.ExtensionValue.Clone();\n                  break;\n              }\n\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public Parameter Clone() {\n              return new Parameter(this);\n            }\n\n            /// <summary>Field number for the \"name\" field.</summary>\n            public const int NameFieldNumber = 1;\n            private string name_ = \"\";\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public string Name {\n              get { return name_; }\n              set {\n                name_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n              }\n            }\n\n            /// <summary>Field number for the \"type\" field.</summary>\n            public const int TypeFieldNumber = 2;\n            private uint type_;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public uint Type {\n              get { return type_; }\n              set {\n                type_ = value;\n              }\n            }\n\n            /// <summary>Field number for the \"int_value\" field.</summary>\n            public const int IntValueFieldNumber = 3;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public uint IntValue {\n             
 get { return valueCase_ == ValueOneofCase.IntValue ? (uint) value_ : 0; }\n              set {\n                value_ = value;\n                valueCase_ = ValueOneofCase.IntValue;\n              }\n            }\n\n            /// <summary>Field number for the \"long_value\" field.</summary>\n            public const int LongValueFieldNumber = 4;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public ulong LongValue {\n              get { return valueCase_ == ValueOneofCase.LongValue ? (ulong) value_ : 0UL; }\n              set {\n                value_ = value;\n                valueCase_ = ValueOneofCase.LongValue;\n              }\n            }\n\n            /// <summary>Field number for the \"float_value\" field.</summary>\n            public const int FloatValueFieldNumber = 5;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public float FloatValue {\n              get { return valueCase_ == ValueOneofCase.FloatValue ? (float) value_ : 0F; }\n              set {\n                value_ = value;\n                valueCase_ = ValueOneofCase.FloatValue;\n              }\n            }\n\n            /// <summary>Field number for the \"double_value\" field.</summary>\n            public const int DoubleValueFieldNumber = 6;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public double DoubleValue {\n              get { return valueCase_ == ValueOneofCase.DoubleValue ? (double) value_ : 0D; }\n              set {\n                value_ = value;\n                valueCase_ = ValueOneofCase.DoubleValue;\n              }\n            }\n\n            /// <summary>Field number for the \"boolean_value\" field.</summary>\n            public const int BooleanValueFieldNumber = 7;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public bool BooleanValue {\n              get { return valueCase_ == ValueOneofCase.BooleanValue ? 
(bool) value_ : false; }\n              set {\n                value_ = value;\n                valueCase_ = ValueOneofCase.BooleanValue;\n              }\n            }\n\n            /// <summary>Field number for the \"string_value\" field.</summary>\n            public const int StringValueFieldNumber = 8;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public string StringValue {\n              get { return valueCase_ == ValueOneofCase.StringValue ? (string) value_ : \"\"; }\n              set {\n                value_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n                valueCase_ = ValueOneofCase.StringValue;\n              }\n            }\n\n            /// <summary>Field number for the \"extension_value\" field.</summary>\n            public const int ExtensionValueFieldNumber = 9;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter.Types.ParameterValueExtension ExtensionValue {\n              get { return valueCase_ == ValueOneofCase.ExtensionValue ? (global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter.Types.ParameterValueExtension) value_ : null; }\n              set {\n                value_ = value;\n                valueCase_ = value == null ? 
ValueOneofCase.None : ValueOneofCase.ExtensionValue;\n              }\n            }\n\n            private object value_;\n            /// <summary>Enum of possible cases for the \"value\" oneof.</summary>\n            public enum ValueOneofCase {\n              None = 0,\n              IntValue = 3,\n              LongValue = 4,\n              FloatValue = 5,\n              DoubleValue = 6,\n              BooleanValue = 7,\n              StringValue = 8,\n              ExtensionValue = 9,\n            }\n            private ValueOneofCase valueCase_ = ValueOneofCase.None;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public ValueOneofCase ValueCase {\n              get { return valueCase_; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void ClearValue() {\n              valueCase_ = ValueOneofCase.None;\n              value_ = null;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override bool Equals(object other) {\n              return Equals(other as Parameter);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public bool Equals(Parameter other) {\n              if (ReferenceEquals(other, null)) {\n                return false;\n              }\n              if (ReferenceEquals(other, this)) {\n                return true;\n              }\n              if (Name != other.Name) return false;\n              if (Type != other.Type) return false;\n              if (IntValue != other.IntValue) return false;\n              if (LongValue != other.LongValue) return false;\n              if (FloatValue != other.FloatValue) return false;\n              if (DoubleValue != other.DoubleValue) return false;\n              if (BooleanValue != other.BooleanValue) return false;\n              if (StringValue != other.StringValue) return false;\n              if 
(!object.Equals(ExtensionValue, other.ExtensionValue)) return false;\n              if (ValueCase != other.ValueCase) return false;\n              return true;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override int GetHashCode() {\n              int hash = 1;\n              if (Name.Length != 0) hash ^= Name.GetHashCode();\n              if (Type != 0) hash ^= Type.GetHashCode();\n              if (valueCase_ == ValueOneofCase.IntValue) hash ^= IntValue.GetHashCode();\n              if (valueCase_ == ValueOneofCase.LongValue) hash ^= LongValue.GetHashCode();\n              if (valueCase_ == ValueOneofCase.FloatValue) hash ^= FloatValue.GetHashCode();\n              if (valueCase_ == ValueOneofCase.DoubleValue) hash ^= DoubleValue.GetHashCode();\n              if (valueCase_ == ValueOneofCase.BooleanValue) hash ^= BooleanValue.GetHashCode();\n              if (valueCase_ == ValueOneofCase.StringValue) hash ^= StringValue.GetHashCode();\n              if (valueCase_ == ValueOneofCase.ExtensionValue) hash ^= ExtensionValue.GetHashCode();\n              hash ^= (int) valueCase_;\n              return hash;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override string ToString() {\n              return pb::JsonFormatter.ToDiagnosticString(this);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void WriteTo(pb::CodedOutputStream output) {\n              if (Name.Length != 0) {\n                output.WriteRawTag(10);\n                output.WriteString(Name);\n              }\n              if (Type != 0) {\n                output.WriteRawTag(16);\n                output.WriteUInt32(Type);\n              }\n              if (valueCase_ == ValueOneofCase.IntValue) {\n                output.WriteRawTag(24);\n                output.WriteUInt32(IntValue);\n              }\n              if 
(valueCase_ == ValueOneofCase.LongValue) {\n                output.WriteRawTag(32);\n                output.WriteUInt64(LongValue);\n              }\n              if (valueCase_ == ValueOneofCase.FloatValue) {\n                output.WriteRawTag(45);\n                output.WriteFloat(FloatValue);\n              }\n              if (valueCase_ == ValueOneofCase.DoubleValue) {\n                output.WriteRawTag(49);\n                output.WriteDouble(DoubleValue);\n              }\n              if (valueCase_ == ValueOneofCase.BooleanValue) {\n                output.WriteRawTag(56);\n                output.WriteBool(BooleanValue);\n              }\n              if (valueCase_ == ValueOneofCase.StringValue) {\n                output.WriteRawTag(66);\n                output.WriteString(StringValue);\n              }\n              if (valueCase_ == ValueOneofCase.ExtensionValue) {\n                output.WriteRawTag(74);\n                output.WriteMessage(ExtensionValue);\n              }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public int CalculateSize() {\n              int size = 0;\n              if (Name.Length != 0) {\n                size += 1 + pb::CodedOutputStream.ComputeStringSize(Name);\n              }\n              if (Type != 0) {\n                size += 1 + pb::CodedOutputStream.ComputeUInt32Size(Type);\n              }\n              if (valueCase_ == ValueOneofCase.IntValue) {\n                size += 1 + pb::CodedOutputStream.ComputeUInt32Size(IntValue);\n              }\n              if (valueCase_ == ValueOneofCase.LongValue) {\n                size += 1 + pb::CodedOutputStream.ComputeUInt64Size(LongValue);\n              }\n              if (valueCase_ == ValueOneofCase.FloatValue) {\n                size += 1 + 4;\n              }\n              if (valueCase_ == ValueOneofCase.DoubleValue) {\n                size += 1 + 8;\n              }\n              if (valueCase_ == 
ValueOneofCase.BooleanValue) {\n                size += 1 + 1;\n              }\n              if (valueCase_ == ValueOneofCase.StringValue) {\n                size += 1 + pb::CodedOutputStream.ComputeStringSize(StringValue);\n              }\n              if (valueCase_ == ValueOneofCase.ExtensionValue) {\n                size += 1 + pb::CodedOutputStream.ComputeMessageSize(ExtensionValue);\n              }\n              return size;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void MergeFrom(Parameter other) {\n              if (other == null) {\n                return;\n              }\n              if (other.Name.Length != 0) {\n                Name = other.Name;\n              }\n              if (other.Type != 0) {\n                Type = other.Type;\n              }\n              switch (other.ValueCase) {\n                case ValueOneofCase.IntValue:\n                  IntValue = other.IntValue;\n                  break;\n                case ValueOneofCase.LongValue:\n                  LongValue = other.LongValue;\n                  break;\n                case ValueOneofCase.FloatValue:\n                  FloatValue = other.FloatValue;\n                  break;\n                case ValueOneofCase.DoubleValue:\n                  DoubleValue = other.DoubleValue;\n                  break;\n                case ValueOneofCase.BooleanValue:\n                  BooleanValue = other.BooleanValue;\n                  break;\n                case ValueOneofCase.StringValue:\n                  StringValue = other.StringValue;\n                  break;\n                case ValueOneofCase.ExtensionValue:\n                  ExtensionValue = other.ExtensionValue;\n                  break;\n              }\n\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void MergeFrom(pb::CodedInputStream input) {\n              uint tag;\n              while 
((tag = input.ReadTag()) != 0) {\n                switch(tag) {\n                  default:\n                    input.SkipLastField();\n                    break;\n                  case 10: {\n                    Name = input.ReadString();\n                    break;\n                  }\n                  case 16: {\n                    Type = input.ReadUInt32();\n                    break;\n                  }\n                  case 24: {\n                    IntValue = input.ReadUInt32();\n                    break;\n                  }\n                  case 32: {\n                    LongValue = input.ReadUInt64();\n                    break;\n                  }\n                  case 45: {\n                    FloatValue = input.ReadFloat();\n                    break;\n                  }\n                  case 49: {\n                    DoubleValue = input.ReadDouble();\n                    break;\n                  }\n                  case 56: {\n                    BooleanValue = input.ReadBool();\n                    break;\n                  }\n                  case 66: {\n                    StringValue = input.ReadString();\n                    break;\n                  }\n                  case 74: {\n                    global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter.Types.ParameterValueExtension subBuilder = new global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter.Types.ParameterValueExtension();\n                    if (valueCase_ == ValueOneofCase.ExtensionValue) {\n                      subBuilder.MergeFrom(ExtensionValue);\n                    }\n                    input.ReadMessage(subBuilder);\n                    ExtensionValue = subBuilder;\n                    break;\n                  }\n                }\n              }\n            }\n\n            #region Nested types\n            /// <summary>Container for nested types declared in the Parameter message type.</summary>\n           
 [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public static partial class Types {\n              public sealed partial class ParameterValueExtension : pb::IMessage<ParameterValueExtension> {\n                private static readonly pb::MessageParser<ParameterValueExtension> _parser = new pb::MessageParser<ParameterValueExtension>(() => new ParameterValueExtension());\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public static pb::MessageParser<ParameterValueExtension> Parser { get { return _parser; } }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public static pbr::MessageDescriptor Descriptor {\n                  get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template.Types.Parameter.Descriptor.NestedTypes[0]; }\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                pbr::MessageDescriptor pb::IMessage.Descriptor {\n                  get { return Descriptor; }\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public ParameterValueExtension() {\n                  OnConstruction();\n                }\n\n                partial void OnConstruction();\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public ParameterValueExtension(ParameterValueExtension other) : this() {\n                  extensions_ = other.extensions_.Clone();\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public ParameterValueExtension Clone() {\n                  return new ParameterValueExtension(this);\n                }\n\n                /// <summary>Field number for the \"extensions\" field.</summary>\n                public const int ExtensionsFieldNumber = 1;\n                private static readonly 
pb::FieldCodec<global::Google.Protobuf.WellKnownTypes.Any> _repeated_extensions_codec\n                    = pb::FieldCodec.ForMessage(10, global::Google.Protobuf.WellKnownTypes.Any.Parser);\n                private readonly pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> extensions_ = new pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any>();\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> Extensions {\n                  get { return extensions_; }\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public override bool Equals(object other) {\n                  return Equals(other as ParameterValueExtension);\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public bool Equals(ParameterValueExtension other) {\n                  if (ReferenceEquals(other, null)) {\n                    return false;\n                  }\n                  if (ReferenceEquals(other, this)) {\n                    return true;\n                  }\n                  if(!extensions_.Equals(other.extensions_)) return false;\n                  return true;\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public override int GetHashCode() {\n                  int hash = 1;\n                  hash ^= extensions_.GetHashCode();\n                  return hash;\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public override string ToString() {\n                  return pb::JsonFormatter.ToDiagnosticString(this);\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public void WriteTo(pb::CodedOutputStream output) {\n                  
extensions_.WriteTo(output, _repeated_extensions_codec);\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public int CalculateSize() {\n                  int size = 0;\n                  size += extensions_.CalculateSize(_repeated_extensions_codec);\n                  return size;\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public void MergeFrom(ParameterValueExtension other) {\n                  if (other == null) {\n                    return;\n                  }\n                  extensions_.Add(other.extensions_);\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public void MergeFrom(pb::CodedInputStream input) {\n                  uint tag;\n                  while ((tag = input.ReadTag()) != 0) {\n                    switch(tag) {\n                      default:\n                        input.SkipLastField();\n                        break;\n                      case 10: {\n                        extensions_.AddEntriesFrom(input, _repeated_extensions_codec);\n                        break;\n                      }\n                    }\n                  }\n                }\n\n              }\n\n            }\n            #endregion\n\n          }\n\n        }\n        #endregion\n\n      }\n\n      public sealed partial class DataSet : pb::IMessage<DataSet> {\n        private static readonly pb::MessageParser<DataSet> _parser = new pb::MessageParser<DataSet>(() => new DataSet());\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pb::MessageParser<DataSet> Parser { get { return _parser; } }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pbr::MessageDescriptor Descriptor {\n          get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Descriptor.NestedTypes[1]; }\n        
}\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        pbr::MessageDescriptor pb::IMessage.Descriptor {\n          get { return Descriptor; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public DataSet() {\n          OnConstruction();\n        }\n\n        partial void OnConstruction();\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public DataSet(DataSet other) : this() {\n          numOfColumns_ = other.numOfColumns_;\n          columns_ = other.columns_.Clone();\n          types_ = other.types_.Clone();\n          rows_ = other.rows_.Clone();\n          details_ = other.details_.Clone();\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public DataSet Clone() {\n          return new DataSet(this);\n        }\n\n        /// <summary>Field number for the \"num_of_columns\" field.</summary>\n        public const int NumOfColumnsFieldNumber = 1;\n        private ulong numOfColumns_;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public ulong NumOfColumns {\n          get { return numOfColumns_; }\n          set {\n            numOfColumns_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"columns\" field.</summary>\n        public const int ColumnsFieldNumber = 2;\n        private static readonly pb::FieldCodec<string> _repeated_columns_codec\n            = pb::FieldCodec.ForString(18);\n        private readonly pbc::RepeatedField<string> columns_ = new pbc::RepeatedField<string>();\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<string> Columns {\n          get { return columns_; }\n        }\n\n        /// <summary>Field number for the \"types\" field.</summary>\n        public const int Types_FieldNumber = 3;\n        private static readonly pb::FieldCodec<uint> _repeated_types_codec\n            = 
pb::FieldCodec.ForUInt32(26);\n        private readonly pbc::RepeatedField<uint> types_ = new pbc::RepeatedField<uint>();\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<uint> Types_ {\n          get { return types_; }\n        }\n\n        /// <summary>Field number for the \"rows\" field.</summary>\n        public const int RowsFieldNumber = 4;\n        private static readonly pb::FieldCodec<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.Row> _repeated_rows_codec\n            = pb::FieldCodec.ForMessage(34, global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.Row.Parser);\n        private readonly pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.Row> rows_ = new pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.Row>();\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.Row> Rows {\n          get { return rows_; }\n        }\n\n        /// <summary>Field number for the \"details\" field.</summary>\n        public const int DetailsFieldNumber = 5;\n        private static readonly pb::FieldCodec<global::Google.Protobuf.WellKnownTypes.Any> _repeated_details_codec\n            = pb::FieldCodec.ForMessage(42, global::Google.Protobuf.WellKnownTypes.Any.Parser);\n        private readonly pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> details_ = new pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any>();\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> Details {\n          get { return details_; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override bool Equals(object other) {\n          return Equals(other as DataSet);\n        }\n\n        
[global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool Equals(DataSet other) {\n          if (ReferenceEquals(other, null)) {\n            return false;\n          }\n          if (ReferenceEquals(other, this)) {\n            return true;\n          }\n          if (NumOfColumns != other.NumOfColumns) return false;\n          if(!columns_.Equals(other.columns_)) return false;\n          if(!types_.Equals(other.types_)) return false;\n          if(!rows_.Equals(other.rows_)) return false;\n          if(!details_.Equals(other.details_)) return false;\n          return true;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override int GetHashCode() {\n          int hash = 1;\n          if (NumOfColumns != 0UL) hash ^= NumOfColumns.GetHashCode();\n          hash ^= columns_.GetHashCode();\n          hash ^= types_.GetHashCode();\n          hash ^= rows_.GetHashCode();\n          hash ^= details_.GetHashCode();\n          return hash;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override string ToString() {\n          return pb::JsonFormatter.ToDiagnosticString(this);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void WriteTo(pb::CodedOutputStream output) {\n          if (NumOfColumns != 0UL) {\n            output.WriteRawTag(8);\n            output.WriteUInt64(NumOfColumns);\n          }\n          columns_.WriteTo(output, _repeated_columns_codec);\n          types_.WriteTo(output, _repeated_types_codec);\n          rows_.WriteTo(output, _repeated_rows_codec);\n          details_.WriteTo(output, _repeated_details_codec);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public int CalculateSize() {\n          int size = 0;\n          if (NumOfColumns != 0UL) {\n            size += 1 + pb::CodedOutputStream.ComputeUInt64Size(NumOfColumns);\n          }\n         
 size += columns_.CalculateSize(_repeated_columns_codec);\n          size += types_.CalculateSize(_repeated_types_codec);\n          size += rows_.CalculateSize(_repeated_rows_codec);\n          size += details_.CalculateSize(_repeated_details_codec);\n          return size;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(DataSet other) {\n          if (other == null) {\n            return;\n          }\n          if (other.NumOfColumns != 0UL) {\n            NumOfColumns = other.NumOfColumns;\n          }\n          columns_.Add(other.columns_);\n          types_.Add(other.types_);\n          rows_.Add(other.rows_);\n          details_.Add(other.details_);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(pb::CodedInputStream input) {\n          uint tag;\n          while ((tag = input.ReadTag()) != 0) {\n            switch(tag) {\n              default:\n                input.SkipLastField();\n                break;\n              case 8: {\n                NumOfColumns = input.ReadUInt64();\n                break;\n              }\n              case 18: {\n                columns_.AddEntriesFrom(input, _repeated_columns_codec);\n                break;\n              }\n              case 26:\n              case 24: {\n                types_.AddEntriesFrom(input, _repeated_types_codec);\n                break;\n              }\n              case 34: {\n                rows_.AddEntriesFrom(input, _repeated_rows_codec);\n                break;\n              }\n              case 42: {\n                details_.AddEntriesFrom(input, _repeated_details_codec);\n                break;\n              }\n            }\n          }\n        }\n\n        #region Nested types\n        /// <summary>Container for nested types declared in the DataSet message type.</summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        
public static partial class Types {\n          public sealed partial class DataSetValue : pb::IMessage<DataSetValue> {\n            private static readonly pb::MessageParser<DataSetValue> _parser = new pb::MessageParser<DataSetValue>(() => new DataSetValue());\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public static pb::MessageParser<DataSetValue> Parser { get { return _parser; } }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public static pbr::MessageDescriptor Descriptor {\n              get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Descriptor.NestedTypes[0]; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            pbr::MessageDescriptor pb::IMessage.Descriptor {\n              get { return Descriptor; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public DataSetValue() {\n              OnConstruction();\n            }\n\n            partial void OnConstruction();\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public DataSetValue(DataSetValue other) : this() {\n              switch (other.ValueCase) {\n                case ValueOneofCase.IntValue:\n                  IntValue = other.IntValue;\n                  break;\n                case ValueOneofCase.LongValue:\n                  LongValue = other.LongValue;\n                  break;\n                case ValueOneofCase.FloatValue:\n                  FloatValue = other.FloatValue;\n                  break;\n                case ValueOneofCase.DoubleValue:\n                  DoubleValue = other.DoubleValue;\n                  break;\n                case ValueOneofCase.BooleanValue:\n                  BooleanValue = other.BooleanValue;\n                  break;\n                case ValueOneofCase.StringValue:\n                  StringValue = other.StringValue;\n       
           break;\n                case ValueOneofCase.ExtensionValue:\n                  ExtensionValue = other.ExtensionValue.Clone();\n                  break;\n              }\n\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public DataSetValue Clone() {\n              return new DataSetValue(this);\n            }\n\n            /// <summary>Field number for the \"int_value\" field.</summary>\n            public const int IntValueFieldNumber = 1;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public uint IntValue {\n              get { return valueCase_ == ValueOneofCase.IntValue ? (uint) value_ : 0; }\n              set {\n                value_ = value;\n                valueCase_ = ValueOneofCase.IntValue;\n              }\n            }\n\n            /// <summary>Field number for the \"long_value\" field.</summary>\n            public const int LongValueFieldNumber = 2;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public ulong LongValue {\n              get { return valueCase_ == ValueOneofCase.LongValue ? (ulong) value_ : 0UL; }\n              set {\n                value_ = value;\n                valueCase_ = ValueOneofCase.LongValue;\n              }\n            }\n\n            /// <summary>Field number for the \"float_value\" field.</summary>\n            public const int FloatValueFieldNumber = 3;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public float FloatValue {\n              get { return valueCase_ == ValueOneofCase.FloatValue ? 
(float) value_ : 0F; }\n              set {\n                value_ = value;\n                valueCase_ = ValueOneofCase.FloatValue;\n              }\n            }\n\n            /// <summary>Field number for the \"double_value\" field.</summary>\n            public const int DoubleValueFieldNumber = 4;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public double DoubleValue {\n              get { return valueCase_ == ValueOneofCase.DoubleValue ? (double) value_ : 0D; }\n              set {\n                value_ = value;\n                valueCase_ = ValueOneofCase.DoubleValue;\n              }\n            }\n\n            /// <summary>Field number for the \"boolean_value\" field.</summary>\n            public const int BooleanValueFieldNumber = 5;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public bool BooleanValue {\n              get { return valueCase_ == ValueOneofCase.BooleanValue ? (bool) value_ : false; }\n              set {\n                value_ = value;\n                valueCase_ = ValueOneofCase.BooleanValue;\n              }\n            }\n\n            /// <summary>Field number for the \"string_value\" field.</summary>\n            public const int StringValueFieldNumber = 6;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public string StringValue {\n              get { return valueCase_ == ValueOneofCase.StringValue ? 
(string) value_ : \"\"; }\n              set {\n                value_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n                valueCase_ = ValueOneofCase.StringValue;\n              }\n            }\n\n            /// <summary>Field number for the \"extension_value\" field.</summary>\n            public const int ExtensionValueFieldNumber = 7;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue.Types.DataSetValueExtension ExtensionValue {\n              get { return valueCase_ == ValueOneofCase.ExtensionValue ? (global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue.Types.DataSetValueExtension) value_ : null; }\n              set {\n                value_ = value;\n                valueCase_ = value == null ? ValueOneofCase.None : ValueOneofCase.ExtensionValue;\n              }\n            }\n\n            private object value_;\n            /// <summary>Enum of possible cases for the \"value\" oneof.</summary>\n            public enum ValueOneofCase {\n              None = 0,\n              IntValue = 1,\n              LongValue = 2,\n              FloatValue = 3,\n              DoubleValue = 4,\n              BooleanValue = 5,\n              StringValue = 6,\n              ExtensionValue = 7,\n            }\n            private ValueOneofCase valueCase_ = ValueOneofCase.None;\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public ValueOneofCase ValueCase {\n              get { return valueCase_; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void ClearValue() {\n              valueCase_ = ValueOneofCase.None;\n              value_ = null;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override bool Equals(object other) {\n              return Equals(other 
as DataSetValue);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public bool Equals(DataSetValue other) {\n              if (ReferenceEquals(other, null)) {\n                return false;\n              }\n              if (ReferenceEquals(other, this)) {\n                return true;\n              }\n              if (IntValue != other.IntValue) return false;\n              if (LongValue != other.LongValue) return false;\n              if (FloatValue != other.FloatValue) return false;\n              if (DoubleValue != other.DoubleValue) return false;\n              if (BooleanValue != other.BooleanValue) return false;\n              if (StringValue != other.StringValue) return false;\n              if (!object.Equals(ExtensionValue, other.ExtensionValue)) return false;\n              if (ValueCase != other.ValueCase) return false;\n              return true;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override int GetHashCode() {\n              int hash = 1;\n              if (valueCase_ == ValueOneofCase.IntValue) hash ^= IntValue.GetHashCode();\n              if (valueCase_ == ValueOneofCase.LongValue) hash ^= LongValue.GetHashCode();\n              if (valueCase_ == ValueOneofCase.FloatValue) hash ^= FloatValue.GetHashCode();\n              if (valueCase_ == ValueOneofCase.DoubleValue) hash ^= DoubleValue.GetHashCode();\n              if (valueCase_ == ValueOneofCase.BooleanValue) hash ^= BooleanValue.GetHashCode();\n              if (valueCase_ == ValueOneofCase.StringValue) hash ^= StringValue.GetHashCode();\n              if (valueCase_ == ValueOneofCase.ExtensionValue) hash ^= ExtensionValue.GetHashCode();\n              hash ^= (int) valueCase_;\n              return hash;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override string ToString() {\n              return 
pb::JsonFormatter.ToDiagnosticString(this);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void WriteTo(pb::CodedOutputStream output) {\n              if (valueCase_ == ValueOneofCase.IntValue) {\n                output.WriteRawTag(8);\n                output.WriteUInt32(IntValue);\n              }\n              if (valueCase_ == ValueOneofCase.LongValue) {\n                output.WriteRawTag(16);\n                output.WriteUInt64(LongValue);\n              }\n              if (valueCase_ == ValueOneofCase.FloatValue) {\n                output.WriteRawTag(29);\n                output.WriteFloat(FloatValue);\n              }\n              if (valueCase_ == ValueOneofCase.DoubleValue) {\n                output.WriteRawTag(33);\n                output.WriteDouble(DoubleValue);\n              }\n              if (valueCase_ == ValueOneofCase.BooleanValue) {\n                output.WriteRawTag(40);\n                output.WriteBool(BooleanValue);\n              }\n              if (valueCase_ == ValueOneofCase.StringValue) {\n                output.WriteRawTag(50);\n                output.WriteString(StringValue);\n              }\n              if (valueCase_ == ValueOneofCase.ExtensionValue) {\n                output.WriteRawTag(58);\n                output.WriteMessage(ExtensionValue);\n              }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public int CalculateSize() {\n              int size = 0;\n              if (valueCase_ == ValueOneofCase.IntValue) {\n                size += 1 + pb::CodedOutputStream.ComputeUInt32Size(IntValue);\n              }\n              if (valueCase_ == ValueOneofCase.LongValue) {\n                size += 1 + pb::CodedOutputStream.ComputeUInt64Size(LongValue);\n              }\n              if (valueCase_ == ValueOneofCase.FloatValue) {\n                size += 1 + 4;\n              }\n              if 
(valueCase_ == ValueOneofCase.DoubleValue) {\n                size += 1 + 8;\n              }\n              if (valueCase_ == ValueOneofCase.BooleanValue) {\n                size += 1 + 1;\n              }\n              if (valueCase_ == ValueOneofCase.StringValue) {\n                size += 1 + pb::CodedOutputStream.ComputeStringSize(StringValue);\n              }\n              if (valueCase_ == ValueOneofCase.ExtensionValue) {\n                size += 1 + pb::CodedOutputStream.ComputeMessageSize(ExtensionValue);\n              }\n              return size;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void MergeFrom(DataSetValue other) {\n              if (other == null) {\n                return;\n              }\n              switch (other.ValueCase) {\n                case ValueOneofCase.IntValue:\n                  IntValue = other.IntValue;\n                  break;\n                case ValueOneofCase.LongValue:\n                  LongValue = other.LongValue;\n                  break;\n                case ValueOneofCase.FloatValue:\n                  FloatValue = other.FloatValue;\n                  break;\n                case ValueOneofCase.DoubleValue:\n                  DoubleValue = other.DoubleValue;\n                  break;\n                case ValueOneofCase.BooleanValue:\n                  BooleanValue = other.BooleanValue;\n                  break;\n                case ValueOneofCase.StringValue:\n                  StringValue = other.StringValue;\n                  break;\n                case ValueOneofCase.ExtensionValue:\n                  ExtensionValue = other.ExtensionValue;\n                  break;\n              }\n\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void MergeFrom(pb::CodedInputStream input) {\n              uint tag;\n              while ((tag = input.ReadTag()) != 0) {\n                
switch(tag) {\n                  default:\n                    input.SkipLastField();\n                    break;\n                  case 8: {\n                    IntValue = input.ReadUInt32();\n                    break;\n                  }\n                  case 16: {\n                    LongValue = input.ReadUInt64();\n                    break;\n                  }\n                  case 29: {\n                    FloatValue = input.ReadFloat();\n                    break;\n                  }\n                  case 33: {\n                    DoubleValue = input.ReadDouble();\n                    break;\n                  }\n                  case 40: {\n                    BooleanValue = input.ReadBool();\n                    break;\n                  }\n                  case 50: {\n                    StringValue = input.ReadString();\n                    break;\n                  }\n                  case 58: {\n                    global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue.Types.DataSetValueExtension subBuilder = new global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue.Types.DataSetValueExtension();\n                    if (valueCase_ == ValueOneofCase.ExtensionValue) {\n                      subBuilder.MergeFrom(ExtensionValue);\n                    }\n                    input.ReadMessage(subBuilder);\n                    ExtensionValue = subBuilder;\n                    break;\n                  }\n                }\n              }\n            }\n\n            #region Nested types\n            /// <summary>Container for nested types declared in the DataSetValue message type.</summary>\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public static partial class Types {\n              public sealed partial class DataSetValueExtension : pb::IMessage<DataSetValueExtension> {\n                private static readonly pb::MessageParser<DataSetValueExtension> 
_parser = new pb::MessageParser<DataSetValueExtension>(() => new DataSetValueExtension());\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public static pb::MessageParser<DataSetValueExtension> Parser { get { return _parser; } }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public static pbr::MessageDescriptor Descriptor {\n                  get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue.Descriptor.NestedTypes[0]; }\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                pbr::MessageDescriptor pb::IMessage.Descriptor {\n                  get { return Descriptor; }\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public DataSetValueExtension() {\n                  OnConstruction();\n                }\n\n                partial void OnConstruction();\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public DataSetValueExtension(DataSetValueExtension other) : this() {\n                  details_ = other.details_.Clone();\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public DataSetValueExtension Clone() {\n                  return new DataSetValueExtension(this);\n                }\n\n                /// <summary>Field number for the \"details\" field.</summary>\n                public const int DetailsFieldNumber = 1;\n                private static readonly pb::FieldCodec<global::Google.Protobuf.WellKnownTypes.Any> _repeated_details_codec\n                    = pb::FieldCodec.ForMessage(10, global::Google.Protobuf.WellKnownTypes.Any.Parser);\n                private readonly pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> details_ = new 
pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any>();\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> Details {\n                  get { return details_; }\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public override bool Equals(object other) {\n                  return Equals(other as DataSetValueExtension);\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public bool Equals(DataSetValueExtension other) {\n                  if (ReferenceEquals(other, null)) {\n                    return false;\n                  }\n                  if (ReferenceEquals(other, this)) {\n                    return true;\n                  }\n                  if(!details_.Equals(other.details_)) return false;\n                  return true;\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public override int GetHashCode() {\n                  int hash = 1;\n                  hash ^= details_.GetHashCode();\n                  return hash;\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public override string ToString() {\n                  return pb::JsonFormatter.ToDiagnosticString(this);\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public void WriteTo(pb::CodedOutputStream output) {\n                  details_.WriteTo(output, _repeated_details_codec);\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public int CalculateSize() {\n                  int size = 0;\n                  size += details_.CalculateSize(_repeated_details_codec);\n                  return size;\n      
          }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public void MergeFrom(DataSetValueExtension other) {\n                  if (other == null) {\n                    return;\n                  }\n                  details_.Add(other.details_);\n                }\n\n                [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n                public void MergeFrom(pb::CodedInputStream input) {\n                  uint tag;\n                  while ((tag = input.ReadTag()) != 0) {\n                    switch(tag) {\n                      default:\n                        input.SkipLastField();\n                        break;\n                      case 10: {\n                        details_.AddEntriesFrom(input, _repeated_details_codec);\n                        break;\n                      }\n                    }\n                  }\n                }\n\n              }\n\n            }\n            #endregion\n\n          }\n\n          public sealed partial class Row : pb::IMessage<Row> {\n            private static readonly pb::MessageParser<Row> _parser = new pb::MessageParser<Row>(() => new Row());\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public static pb::MessageParser<Row> Parser { get { return _parser; } }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public static pbr::MessageDescriptor Descriptor {\n              get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Descriptor.NestedTypes[1]; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            pbr::MessageDescriptor pb::IMessage.Descriptor {\n              get { return Descriptor; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public Row() {\n              OnConstruction();\n            }\n\n            partial void 
OnConstruction();\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public Row(Row other) : this() {\n              elements_ = other.elements_.Clone();\n              details_ = other.details_.Clone();\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public Row Clone() {\n              return new Row(this);\n            }\n\n            /// <summary>Field number for the \"elements\" field.</summary>\n            public const int ElementsFieldNumber = 1;\n            private static readonly pb::FieldCodec<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue> _repeated_elements_codec\n                = pb::FieldCodec.ForMessage(10, global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue.Parser);\n            private readonly pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue> elements_ = new pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue>();\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet.Types.DataSetValue> Elements {\n              get { return elements_; }\n            }\n\n            /// <summary>Field number for the \"details\" field.</summary>\n            public const int DetailsFieldNumber = 2;\n            private static readonly pb::FieldCodec<global::Google.Protobuf.WellKnownTypes.Any> _repeated_details_codec\n                = pb::FieldCodec.ForMessage(18, global::Google.Protobuf.WellKnownTypes.Any.Parser);\n            private readonly pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> details_ = new pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any>();\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public 
pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> Details {\n              get { return details_; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override bool Equals(object other) {\n              return Equals(other as Row);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public bool Equals(Row other) {\n              if (ReferenceEquals(other, null)) {\n                return false;\n              }\n              if (ReferenceEquals(other, this)) {\n                return true;\n              }\n              if(!elements_.Equals(other.elements_)) return false;\n              if(!details_.Equals(other.details_)) return false;\n              return true;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override int GetHashCode() {\n              int hash = 1;\n              hash ^= elements_.GetHashCode();\n              hash ^= details_.GetHashCode();\n              return hash;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override string ToString() {\n              return pb::JsonFormatter.ToDiagnosticString(this);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void WriteTo(pb::CodedOutputStream output) {\n              elements_.WriteTo(output, _repeated_elements_codec);\n              details_.WriteTo(output, _repeated_details_codec);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public int CalculateSize() {\n              int size = 0;\n              size += elements_.CalculateSize(_repeated_elements_codec);\n              size += details_.CalculateSize(_repeated_details_codec);\n              return size;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n     
       public void MergeFrom(Row other) {\n              if (other == null) {\n                return;\n              }\n              elements_.Add(other.elements_);\n              details_.Add(other.details_);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void MergeFrom(pb::CodedInputStream input) {\n              uint tag;\n              while ((tag = input.ReadTag()) != 0) {\n                switch(tag) {\n                  default:\n                    input.SkipLastField();\n                    break;\n                  case 10: {\n                    elements_.AddEntriesFrom(input, _repeated_elements_codec);\n                    break;\n                  }\n                  case 18: {\n                    details_.AddEntriesFrom(input, _repeated_details_codec);\n                    break;\n                  }\n                }\n              }\n            }\n\n          }\n\n        }\n        #endregion\n\n      }\n\n      public sealed partial class PropertyValue : pb::IMessage<PropertyValue> {\n        private static readonly pb::MessageParser<PropertyValue> _parser = new pb::MessageParser<PropertyValue>(() => new PropertyValue());\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pb::MessageParser<PropertyValue> Parser { get { return _parser; } }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pbr::MessageDescriptor Descriptor {\n          get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Descriptor.NestedTypes[2]; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        pbr::MessageDescriptor pb::IMessage.Descriptor {\n          get { return Descriptor; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public PropertyValue() {\n          OnConstruction();\n        }\n\n        partial void OnConstruction();\n\n        
[global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public PropertyValue(PropertyValue other) : this() {\n          type_ = other.type_;\n          isNull_ = other.isNull_;\n          switch (other.ValueCase) {\n            case ValueOneofCase.IntValue:\n              IntValue = other.IntValue;\n              break;\n            case ValueOneofCase.LongValue:\n              LongValue = other.LongValue;\n              break;\n            case ValueOneofCase.FloatValue:\n              FloatValue = other.FloatValue;\n              break;\n            case ValueOneofCase.DoubleValue:\n              DoubleValue = other.DoubleValue;\n              break;\n            case ValueOneofCase.BooleanValue:\n              BooleanValue = other.BooleanValue;\n              break;\n            case ValueOneofCase.StringValue:\n              StringValue = other.StringValue;\n              break;\n            case ValueOneofCase.PropertysetValue:\n              PropertysetValue = other.PropertysetValue.Clone();\n              break;\n            case ValueOneofCase.PropertysetsValue:\n              PropertysetsValue = other.PropertysetsValue.Clone();\n              break;\n            case ValueOneofCase.ExtensionValue:\n              ExtensionValue = other.ExtensionValue.Clone();\n              break;\n          }\n\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public PropertyValue Clone() {\n          return new PropertyValue(this);\n        }\n\n        /// <summary>Field number for the \"type\" field.</summary>\n        public const int TypeFieldNumber = 1;\n        private uint type_;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public uint Type {\n          get { return type_; }\n          set {\n            type_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"is_null\" field.</summary>\n        public const int IsNullFieldNumber = 2;\n        private bool 
isNull_;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool IsNull {\n          get { return isNull_; }\n          set {\n            isNull_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"int_value\" field.</summary>\n        public const int IntValueFieldNumber = 3;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public uint IntValue {\n          get { return valueCase_ == ValueOneofCase.IntValue ? (uint) value_ : 0; }\n          set {\n            value_ = value;\n            valueCase_ = ValueOneofCase.IntValue;\n          }\n        }\n\n        /// <summary>Field number for the \"long_value\" field.</summary>\n        public const int LongValueFieldNumber = 4;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public ulong LongValue {\n          get { return valueCase_ == ValueOneofCase.LongValue ? (ulong) value_ : 0UL; }\n          set {\n            value_ = value;\n            valueCase_ = ValueOneofCase.LongValue;\n          }\n        }\n\n        /// <summary>Field number for the \"float_value\" field.</summary>\n        public const int FloatValueFieldNumber = 5;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public float FloatValue {\n          get { return valueCase_ == ValueOneofCase.FloatValue ? (float) value_ : 0F; }\n          set {\n            value_ = value;\n            valueCase_ = ValueOneofCase.FloatValue;\n          }\n        }\n\n        /// <summary>Field number for the \"double_value\" field.</summary>\n        public const int DoubleValueFieldNumber = 6;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public double DoubleValue {\n          get { return valueCase_ == ValueOneofCase.DoubleValue ? 
(double) value_ : 0D; }\n          set {\n            value_ = value;\n            valueCase_ = ValueOneofCase.DoubleValue;\n          }\n        }\n\n        /// <summary>Field number for the \"boolean_value\" field.</summary>\n        public const int BooleanValueFieldNumber = 7;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool BooleanValue {\n          get { return valueCase_ == ValueOneofCase.BooleanValue ? (bool) value_ : false; }\n          set {\n            value_ = value;\n            valueCase_ = ValueOneofCase.BooleanValue;\n          }\n        }\n\n        /// <summary>Field number for the \"string_value\" field.</summary>\n        public const int StringValueFieldNumber = 8;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public string StringValue {\n          get { return valueCase_ == ValueOneofCase.StringValue ? (string) value_ : \"\"; }\n          set {\n            value_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n            valueCase_ = ValueOneofCase.StringValue;\n          }\n        }\n\n        /// <summary>Field number for the \"propertyset_value\" field.</summary>\n        public const int PropertysetValueFieldNumber = 9;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet PropertysetValue {\n          get { return valueCase_ == ValueOneofCase.PropertysetValue ? (global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet) value_ : null; }\n          set {\n            value_ = value;\n            valueCase_ = value == null ? 
ValueOneofCase.None : ValueOneofCase.PropertysetValue;\n          }\n        }\n\n        /// <summary>Field number for the \"propertysets_value\" field.</summary>\n        public const int PropertysetsValueFieldNumber = 10;\n        /// <summary>\n        /// List of Property Values\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySetList PropertysetsValue {\n          get { return valueCase_ == ValueOneofCase.PropertysetsValue ? (global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySetList) value_ : null; }\n          set {\n            value_ = value;\n            valueCase_ = value == null ? ValueOneofCase.None : ValueOneofCase.PropertysetsValue;\n          }\n        }\n\n        /// <summary>Field number for the \"extension_value\" field.</summary>\n        public const int ExtensionValueFieldNumber = 11;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue.Types.PropertyValueExtension ExtensionValue {\n          get { return valueCase_ == ValueOneofCase.ExtensionValue ? (global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue.Types.PropertyValueExtension) value_ : null; }\n          set {\n            value_ = value;\n            valueCase_ = value == null ? 
ValueOneofCase.None : ValueOneofCase.ExtensionValue;\n          }\n        }\n\n        private object value_;\n        /// <summary>Enum of possible cases for the \"value\" oneof.</summary>\n        public enum ValueOneofCase {\n          None = 0,\n          IntValue = 3,\n          LongValue = 4,\n          FloatValue = 5,\n          DoubleValue = 6,\n          BooleanValue = 7,\n          StringValue = 8,\n          PropertysetValue = 9,\n          PropertysetsValue = 10,\n          ExtensionValue = 11,\n        }\n        private ValueOneofCase valueCase_ = ValueOneofCase.None;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public ValueOneofCase ValueCase {\n          get { return valueCase_; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void ClearValue() {\n          valueCase_ = ValueOneofCase.None;\n          value_ = null;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override bool Equals(object other) {\n          return Equals(other as PropertyValue);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool Equals(PropertyValue other) {\n          if (ReferenceEquals(other, null)) {\n            return false;\n          }\n          if (ReferenceEquals(other, this)) {\n            return true;\n          }\n          if (Type != other.Type) return false;\n          if (IsNull != other.IsNull) return false;\n          if (IntValue != other.IntValue) return false;\n          if (LongValue != other.LongValue) return false;\n          if (FloatValue != other.FloatValue) return false;\n          if (DoubleValue != other.DoubleValue) return false;\n          if (BooleanValue != other.BooleanValue) return false;\n          if (StringValue != other.StringValue) return false;\n          if (!object.Equals(PropertysetValue, other.PropertysetValue)) return false;\n          if 
(!object.Equals(PropertysetsValue, other.PropertysetsValue)) return false;\n          if (!object.Equals(ExtensionValue, other.ExtensionValue)) return false;\n          if (ValueCase != other.ValueCase) return false;\n          return true;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override int GetHashCode() {\n          int hash = 1;\n          if (Type != 0) hash ^= Type.GetHashCode();\n          if (IsNull != false) hash ^= IsNull.GetHashCode();\n          if (valueCase_ == ValueOneofCase.IntValue) hash ^= IntValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.LongValue) hash ^= LongValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.FloatValue) hash ^= FloatValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.DoubleValue) hash ^= DoubleValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.BooleanValue) hash ^= BooleanValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.StringValue) hash ^= StringValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.PropertysetValue) hash ^= PropertysetValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.PropertysetsValue) hash ^= PropertysetsValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.ExtensionValue) hash ^= ExtensionValue.GetHashCode();\n          hash ^= (int) valueCase_;\n          return hash;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override string ToString() {\n          return pb::JsonFormatter.ToDiagnosticString(this);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void WriteTo(pb::CodedOutputStream output) {\n          if (Type != 0) {\n            output.WriteRawTag(8);\n            output.WriteUInt32(Type);\n          }\n          if (IsNull != false) {\n            output.WriteRawTag(16);\n            output.WriteBool(IsNull);\n          }\n          if 
(valueCase_ == ValueOneofCase.IntValue) {\n            output.WriteRawTag(24);\n            output.WriteUInt32(IntValue);\n          }\n          if (valueCase_ == ValueOneofCase.LongValue) {\n            output.WriteRawTag(32);\n            output.WriteUInt64(LongValue);\n          }\n          if (valueCase_ == ValueOneofCase.FloatValue) {\n            output.WriteRawTag(45);\n            output.WriteFloat(FloatValue);\n          }\n          if (valueCase_ == ValueOneofCase.DoubleValue) {\n            output.WriteRawTag(49);\n            output.WriteDouble(DoubleValue);\n          }\n          if (valueCase_ == ValueOneofCase.BooleanValue) {\n            output.WriteRawTag(56);\n            output.WriteBool(BooleanValue);\n          }\n          if (valueCase_ == ValueOneofCase.StringValue) {\n            output.WriteRawTag(66);\n            output.WriteString(StringValue);\n          }\n          if (valueCase_ == ValueOneofCase.PropertysetValue) {\n            output.WriteRawTag(74);\n            output.WriteMessage(PropertysetValue);\n          }\n          if (valueCase_ == ValueOneofCase.PropertysetsValue) {\n            output.WriteRawTag(82);\n            output.WriteMessage(PropertysetsValue);\n          }\n          if (valueCase_ == ValueOneofCase.ExtensionValue) {\n            output.WriteRawTag(90);\n            output.WriteMessage(ExtensionValue);\n          }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public int CalculateSize() {\n          int size = 0;\n          if (Type != 0) {\n            size += 1 + pb::CodedOutputStream.ComputeUInt32Size(Type);\n          }\n          if (IsNull != false) {\n            size += 1 + 1;\n          }\n          if (valueCase_ == ValueOneofCase.IntValue) {\n            size += 1 + pb::CodedOutputStream.ComputeUInt32Size(IntValue);\n          }\n          if (valueCase_ == ValueOneofCase.LongValue) {\n            size += 1 + 
pb::CodedOutputStream.ComputeUInt64Size(LongValue);\n          }\n          if (valueCase_ == ValueOneofCase.FloatValue) {\n            size += 1 + 4;\n          }\n          if (valueCase_ == ValueOneofCase.DoubleValue) {\n            size += 1 + 8;\n          }\n          if (valueCase_ == ValueOneofCase.BooleanValue) {\n            size += 1 + 1;\n          }\n          if (valueCase_ == ValueOneofCase.StringValue) {\n            size += 1 + pb::CodedOutputStream.ComputeStringSize(StringValue);\n          }\n          if (valueCase_ == ValueOneofCase.PropertysetValue) {\n            size += 1 + pb::CodedOutputStream.ComputeMessageSize(PropertysetValue);\n          }\n          if (valueCase_ == ValueOneofCase.PropertysetsValue) {\n            size += 1 + pb::CodedOutputStream.ComputeMessageSize(PropertysetsValue);\n          }\n          if (valueCase_ == ValueOneofCase.ExtensionValue) {\n            size += 1 + pb::CodedOutputStream.ComputeMessageSize(ExtensionValue);\n          }\n          return size;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(PropertyValue other) {\n          if (other == null) {\n            return;\n          }\n          if (other.Type != 0) {\n            Type = other.Type;\n          }\n          if (other.IsNull != false) {\n            IsNull = other.IsNull;\n          }\n          switch (other.ValueCase) {\n            case ValueOneofCase.IntValue:\n              IntValue = other.IntValue;\n              break;\n            case ValueOneofCase.LongValue:\n              LongValue = other.LongValue;\n              break;\n            case ValueOneofCase.FloatValue:\n              FloatValue = other.FloatValue;\n              break;\n            case ValueOneofCase.DoubleValue:\n              DoubleValue = other.DoubleValue;\n              break;\n            case ValueOneofCase.BooleanValue:\n              BooleanValue = other.BooleanValue;\n              break;\n    
        case ValueOneofCase.StringValue:\n              StringValue = other.StringValue;\n              break;\n            case ValueOneofCase.PropertysetValue:\n              PropertysetValue = other.PropertysetValue;\n              break;\n            case ValueOneofCase.PropertysetsValue:\n              PropertysetsValue = other.PropertysetsValue;\n              break;\n            case ValueOneofCase.ExtensionValue:\n              ExtensionValue = other.ExtensionValue;\n              break;\n          }\n\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(pb::CodedInputStream input) {\n          uint tag;\n          while ((tag = input.ReadTag()) != 0) {\n            switch(tag) {\n              default:\n                input.SkipLastField();\n                break;\n              case 8: {\n                Type = input.ReadUInt32();\n                break;\n              }\n              case 16: {\n                IsNull = input.ReadBool();\n                break;\n              }\n              case 24: {\n                IntValue = input.ReadUInt32();\n                break;\n              }\n              case 32: {\n                LongValue = input.ReadUInt64();\n                break;\n              }\n              case 45: {\n                FloatValue = input.ReadFloat();\n                break;\n              }\n              case 49: {\n                DoubleValue = input.ReadDouble();\n                break;\n              }\n              case 56: {\n                BooleanValue = input.ReadBool();\n                break;\n              }\n              case 66: {\n                StringValue = input.ReadString();\n                break;\n              }\n              case 74: {\n                global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet subBuilder = new global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet();\n                if (valueCase_ == 
ValueOneofCase.PropertysetValue) {\n                  subBuilder.MergeFrom(PropertysetValue);\n                }\n                input.ReadMessage(subBuilder);\n                PropertysetValue = subBuilder;\n                break;\n              }\n              case 82: {\n                global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySetList subBuilder = new global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySetList();\n                if (valueCase_ == ValueOneofCase.PropertysetsValue) {\n                  subBuilder.MergeFrom(PropertysetsValue);\n                }\n                input.ReadMessage(subBuilder);\n                PropertysetsValue = subBuilder;\n                break;\n              }\n              case 90: {\n                global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue.Types.PropertyValueExtension subBuilder = new global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue.Types.PropertyValueExtension();\n                if (valueCase_ == ValueOneofCase.ExtensionValue) {\n                  subBuilder.MergeFrom(ExtensionValue);\n                }\n                input.ReadMessage(subBuilder);\n                ExtensionValue = subBuilder;\n                break;\n              }\n            }\n          }\n        }\n\n        #region Nested types\n        /// <summary>Container for nested types declared in the PropertyValue message type.</summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static partial class Types {\n          public sealed partial class PropertyValueExtension : pb::IMessage<PropertyValueExtension> {\n            private static readonly pb::MessageParser<PropertyValueExtension> _parser = new pb::MessageParser<PropertyValueExtension>(() => new PropertyValueExtension());\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public static pb::MessageParser<PropertyValueExtension> Parser { get { return _parser; } }\n\n        
    [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public static pbr::MessageDescriptor Descriptor {\n              get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue.Descriptor.NestedTypes[0]; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            pbr::MessageDescriptor pb::IMessage.Descriptor {\n              get { return Descriptor; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public PropertyValueExtension() {\n              OnConstruction();\n            }\n\n            partial void OnConstruction();\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public PropertyValueExtension(PropertyValueExtension other) : this() {\n              details_ = other.details_.Clone();\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public PropertyValueExtension Clone() {\n              return new PropertyValueExtension(this);\n            }\n\n            /// <summary>Field number for the \"details\" field.</summary>\n            public const int DetailsFieldNumber = 1;\n            private static readonly pb::FieldCodec<global::Google.Protobuf.WellKnownTypes.Any> _repeated_details_codec\n                = pb::FieldCodec.ForMessage(10, global::Google.Protobuf.WellKnownTypes.Any.Parser);\n            private readonly pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> details_ = new pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any>();\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> Details {\n              get { return details_; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override bool Equals(object other) {\n              return Equals(other as 
PropertyValueExtension);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public bool Equals(PropertyValueExtension other) {\n              if (ReferenceEquals(other, null)) {\n                return false;\n              }\n              if (ReferenceEquals(other, this)) {\n                return true;\n              }\n              if(!details_.Equals(other.details_)) return false;\n              return true;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override int GetHashCode() {\n              int hash = 1;\n              hash ^= details_.GetHashCode();\n              return hash;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override string ToString() {\n              return pb::JsonFormatter.ToDiagnosticString(this);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void WriteTo(pb::CodedOutputStream output) {\n              details_.WriteTo(output, _repeated_details_codec);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public int CalculateSize() {\n              int size = 0;\n              size += details_.CalculateSize(_repeated_details_codec);\n              return size;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void MergeFrom(PropertyValueExtension other) {\n              if (other == null) {\n                return;\n              }\n              details_.Add(other.details_);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void MergeFrom(pb::CodedInputStream input) {\n              uint tag;\n              while ((tag = input.ReadTag()) != 0) {\n                switch(tag) {\n                  default:\n                    
input.SkipLastField();\n                    break;\n                  case 10: {\n                    details_.AddEntriesFrom(input, _repeated_details_codec);\n                    break;\n                  }\n                }\n              }\n            }\n\n          }\n\n        }\n        #endregion\n\n      }\n\n      public sealed partial class PropertySet : pb::IMessage<PropertySet> {\n        private static readonly pb::MessageParser<PropertySet> _parser = new pb::MessageParser<PropertySet>(() => new PropertySet());\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pb::MessageParser<PropertySet> Parser { get { return _parser; } }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pbr::MessageDescriptor Descriptor {\n          get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Descriptor.NestedTypes[3]; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        pbr::MessageDescriptor pb::IMessage.Descriptor {\n          get { return Descriptor; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public PropertySet() {\n          OnConstruction();\n        }\n\n        partial void OnConstruction();\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public PropertySet(PropertySet other) : this() {\n          keys_ = other.keys_.Clone();\n          values_ = other.values_.Clone();\n          details_ = other.details_.Clone();\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public PropertySet Clone() {\n          return new PropertySet(this);\n        }\n\n        /// <summary>Field number for the \"keys\" field.</summary>\n        public const int KeysFieldNumber = 1;\n        private static readonly pb::FieldCodec<string> _repeated_keys_codec\n            = pb::FieldCodec.ForString(10);\n        private readonly 
pbc::RepeatedField<string> keys_ = new pbc::RepeatedField<string>();\n        /// <summary>\n        /// Names of the properties\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<string> Keys {\n          get { return keys_; }\n        }\n\n        /// <summary>Field number for the \"values\" field.</summary>\n        public const int ValuesFieldNumber = 2;\n        private static readonly pb::FieldCodec<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue> _repeated_values_codec\n            = pb::FieldCodec.ForMessage(18, global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue.Parser);\n        private readonly pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue> values_ = new pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue>();\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertyValue> Values {\n          get { return values_; }\n        }\n\n        /// <summary>Field number for the \"details\" field.</summary>\n        public const int DetailsFieldNumber = 3;\n        private static readonly pb::FieldCodec<global::Google.Protobuf.WellKnownTypes.Any> _repeated_details_codec\n            = pb::FieldCodec.ForMessage(26, global::Google.Protobuf.WellKnownTypes.Any.Parser);\n        private readonly pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> details_ = new pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any>();\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> Details {\n          get { return details_; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override bool Equals(object other) {\n          return Equals(other as PropertySet);\n      
  }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool Equals(PropertySet other) {\n          if (ReferenceEquals(other, null)) {\n            return false;\n          }\n          if (ReferenceEquals(other, this)) {\n            return true;\n          }\n          if(!keys_.Equals(other.keys_)) return false;\n          if(!values_.Equals(other.values_)) return false;\n          if(!details_.Equals(other.details_)) return false;\n          return true;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override int GetHashCode() {\n          int hash = 1;\n          hash ^= keys_.GetHashCode();\n          hash ^= values_.GetHashCode();\n          hash ^= details_.GetHashCode();\n          return hash;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override string ToString() {\n          return pb::JsonFormatter.ToDiagnosticString(this);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void WriteTo(pb::CodedOutputStream output) {\n          keys_.WriteTo(output, _repeated_keys_codec);\n          values_.WriteTo(output, _repeated_values_codec);\n          details_.WriteTo(output, _repeated_details_codec);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public int CalculateSize() {\n          int size = 0;\n          size += keys_.CalculateSize(_repeated_keys_codec);\n          size += values_.CalculateSize(_repeated_values_codec);\n          size += details_.CalculateSize(_repeated_details_codec);\n          return size;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(PropertySet other) {\n          if (other == null) {\n            return;\n          }\n          keys_.Add(other.keys_);\n          values_.Add(other.values_);\n          details_.Add(other.details_);\n        }\n\n        
[global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(pb::CodedInputStream input) {\n          uint tag;\n          while ((tag = input.ReadTag()) != 0) {\n            switch(tag) {\n              default:\n                input.SkipLastField();\n                break;\n              case 10: {\n                keys_.AddEntriesFrom(input, _repeated_keys_codec);\n                break;\n              }\n              case 18: {\n                values_.AddEntriesFrom(input, _repeated_values_codec);\n                break;\n              }\n              case 26: {\n                details_.AddEntriesFrom(input, _repeated_details_codec);\n                break;\n              }\n            }\n          }\n        }\n\n      }\n\n      public sealed partial class PropertySetList : pb::IMessage<PropertySetList> {\n        private static readonly pb::MessageParser<PropertySetList> _parser = new pb::MessageParser<PropertySetList>(() => new PropertySetList());\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pb::MessageParser<PropertySetList> Parser { get { return _parser; } }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pbr::MessageDescriptor Descriptor {\n          get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Descriptor.NestedTypes[4]; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        pbr::MessageDescriptor pb::IMessage.Descriptor {\n          get { return Descriptor; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public PropertySetList() {\n          OnConstruction();\n        }\n\n        partial void OnConstruction();\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public PropertySetList(PropertySetList other) : this() {\n          propertyset_ = other.propertyset_.Clone();\n          details_ = other.details_.Clone();\n   
     }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public PropertySetList Clone() {\n          return new PropertySetList(this);\n        }\n\n        /// <summary>Field number for the \"propertyset\" field.</summary>\n        public const int PropertysetFieldNumber = 1;\n        private static readonly pb::FieldCodec<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet> _repeated_propertyset_codec\n            = pb::FieldCodec.ForMessage(10, global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet.Parser);\n        private readonly pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet> propertyset_ = new pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet>();\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet> Propertyset {\n          get { return propertyset_; }\n        }\n\n        /// <summary>Field number for the \"details\" field.</summary>\n        public const int DetailsFieldNumber = 2;\n        private static readonly pb::FieldCodec<global::Google.Protobuf.WellKnownTypes.Any> _repeated_details_codec\n            = pb::FieldCodec.ForMessage(18, global::Google.Protobuf.WellKnownTypes.Any.Parser);\n        private readonly pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> details_ = new pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any>();\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> Details {\n          get { return details_; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override bool Equals(object other) {\n          return Equals(other as PropertySetList);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool 
Equals(PropertySetList other) {\n          if (ReferenceEquals(other, null)) {\n            return false;\n          }\n          if (ReferenceEquals(other, this)) {\n            return true;\n          }\n          if(!propertyset_.Equals(other.propertyset_)) return false;\n          if(!details_.Equals(other.details_)) return false;\n          return true;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override int GetHashCode() {\n          int hash = 1;\n          hash ^= propertyset_.GetHashCode();\n          hash ^= details_.GetHashCode();\n          return hash;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override string ToString() {\n          return pb::JsonFormatter.ToDiagnosticString(this);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void WriteTo(pb::CodedOutputStream output) {\n          propertyset_.WriteTo(output, _repeated_propertyset_codec);\n          details_.WriteTo(output, _repeated_details_codec);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public int CalculateSize() {\n          int size = 0;\n          size += propertyset_.CalculateSize(_repeated_propertyset_codec);\n          size += details_.CalculateSize(_repeated_details_codec);\n          return size;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(PropertySetList other) {\n          if (other == null) {\n            return;\n          }\n          propertyset_.Add(other.propertyset_);\n          details_.Add(other.details_);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(pb::CodedInputStream input) {\n          uint tag;\n          while ((tag = input.ReadTag()) != 0) {\n            switch(tag) {\n              default:\n                input.SkipLastField();\n        
        break;\n              case 10: {\n                propertyset_.AddEntriesFrom(input, _repeated_propertyset_codec);\n                break;\n              }\n              case 18: {\n                details_.AddEntriesFrom(input, _repeated_details_codec);\n                break;\n              }\n            }\n          }\n        }\n\n      }\n\n      public sealed partial class MetaData : pb::IMessage<MetaData> {\n        private static readonly pb::MessageParser<MetaData> _parser = new pb::MessageParser<MetaData>(() => new MetaData());\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pb::MessageParser<MetaData> Parser { get { return _parser; } }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pbr::MessageDescriptor Descriptor {\n          get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Descriptor.NestedTypes[5]; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        pbr::MessageDescriptor pb::IMessage.Descriptor {\n          get { return Descriptor; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public MetaData() {\n          OnConstruction();\n        }\n\n        partial void OnConstruction();\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public MetaData(MetaData other) : this() {\n          isMultiPart_ = other.isMultiPart_;\n          contentType_ = other.contentType_;\n          size_ = other.size_;\n          seq_ = other.seq_;\n          fileName_ = other.fileName_;\n          fileType_ = other.fileType_;\n          md5_ = other.md5_;\n          description_ = other.description_;\n          details_ = other.details_.Clone();\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public MetaData Clone() {\n          return new MetaData(this);\n        }\n\n        /// <summary>Field number for the 
\"is_multi_part\" field.</summary>\n        public const int IsMultiPartFieldNumber = 1;\n        private bool isMultiPart_;\n        /// <summary>\n        /// Bytes specific metadata\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool IsMultiPart {\n          get { return isMultiPart_; }\n          set {\n            isMultiPart_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"content_type\" field.</summary>\n        public const int ContentTypeFieldNumber = 2;\n        private string contentType_ = \"\";\n        /// <summary>\n        /// General metadata\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public string ContentType {\n          get { return contentType_; }\n          set {\n            contentType_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n          }\n        }\n\n        /// <summary>Field number for the \"size\" field.</summary>\n        public const int SizeFieldNumber = 3;\n        private ulong size_;\n        /// <summary>\n        /// File size, String size, Multi-part size, etc\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public ulong Size {\n          get { return size_; }\n          set {\n            size_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"seq\" field.</summary>\n        public const int SeqFieldNumber = 4;\n        private ulong seq_;\n        /// <summary>\n        /// Sequence number for multi-part messages\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public ulong Seq {\n          get { return seq_; }\n          set {\n            seq_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"file_name\" field.</summary>\n        public const int FileNameFieldNumber = 5;\n        private string 
fileName_ = \"\";\n        /// <summary>\n        /// File metadata\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public string FileName {\n          get { return fileName_; }\n          set {\n            fileName_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n          }\n        }\n\n        /// <summary>Field number for the \"file_type\" field.</summary>\n        public const int FileTypeFieldNumber = 6;\n        private string fileType_ = \"\";\n        /// <summary>\n        /// File type (i.e. xml, json, txt, cpp, etc)\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public string FileType {\n          get { return fileType_; }\n          set {\n            fileType_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n          }\n        }\n\n        /// <summary>Field number for the \"md5\" field.</summary>\n        public const int Md5FieldNumber = 7;\n        private string md5_ = \"\";\n        /// <summary>\n        /// md5 of data\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public string Md5 {\n          get { return md5_; }\n          set {\n            md5_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n          }\n        }\n\n        /// <summary>Field number for the \"description\" field.</summary>\n        public const int DescriptionFieldNumber = 8;\n        private string description_ = \"\";\n        /// <summary>\n        /// Catchalls and future expansion\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public string Description {\n          get { return description_; }\n          set {\n            description_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n          }\n        }\n\n        /// <summary>Field number for the \"details\" field.</summary>\n        public const int 
DetailsFieldNumber = 9;\n        private static readonly pb::FieldCodec<global::Google.Protobuf.WellKnownTypes.Any> _repeated_details_codec\n            = pb::FieldCodec.ForMessage(74, global::Google.Protobuf.WellKnownTypes.Any.Parser);\n        private readonly pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> details_ = new pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any>();\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> Details {\n          get { return details_; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override bool Equals(object other) {\n          return Equals(other as MetaData);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool Equals(MetaData other) {\n          if (ReferenceEquals(other, null)) {\n            return false;\n          }\n          if (ReferenceEquals(other, this)) {\n            return true;\n          }\n          if (IsMultiPart != other.IsMultiPart) return false;\n          if (ContentType != other.ContentType) return false;\n          if (Size != other.Size) return false;\n          if (Seq != other.Seq) return false;\n          if (FileName != other.FileName) return false;\n          if (FileType != other.FileType) return false;\n          if (Md5 != other.Md5) return false;\n          if (Description != other.Description) return false;\n          if(!details_.Equals(other.details_)) return false;\n          return true;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override int GetHashCode() {\n          int hash = 1;\n          if (IsMultiPart != false) hash ^= IsMultiPart.GetHashCode();\n          if (ContentType.Length != 0) hash ^= ContentType.GetHashCode();\n          if (Size != 0UL) hash ^= Size.GetHashCode();\n          if (Seq != 0UL) hash ^= 
Seq.GetHashCode();\n          if (FileName.Length != 0) hash ^= FileName.GetHashCode();\n          if (FileType.Length != 0) hash ^= FileType.GetHashCode();\n          if (Md5.Length != 0) hash ^= Md5.GetHashCode();\n          if (Description.Length != 0) hash ^= Description.GetHashCode();\n          hash ^= details_.GetHashCode();\n          return hash;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override string ToString() {\n          return pb::JsonFormatter.ToDiagnosticString(this);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void WriteTo(pb::CodedOutputStream output) {\n          if (IsMultiPart != false) {\n            output.WriteRawTag(8);\n            output.WriteBool(IsMultiPart);\n          }\n          if (ContentType.Length != 0) {\n            output.WriteRawTag(18);\n            output.WriteString(ContentType);\n          }\n          if (Size != 0UL) {\n            output.WriteRawTag(24);\n            output.WriteUInt64(Size);\n          }\n          if (Seq != 0UL) {\n            output.WriteRawTag(32);\n            output.WriteUInt64(Seq);\n          }\n          if (FileName.Length != 0) {\n            output.WriteRawTag(42);\n            output.WriteString(FileName);\n          }\n          if (FileType.Length != 0) {\n            output.WriteRawTag(50);\n            output.WriteString(FileType);\n          }\n          if (Md5.Length != 0) {\n            output.WriteRawTag(58);\n            output.WriteString(Md5);\n          }\n          if (Description.Length != 0) {\n            output.WriteRawTag(66);\n            output.WriteString(Description);\n          }\n          details_.WriteTo(output, _repeated_details_codec);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public int CalculateSize() {\n          int size = 0;\n          if (IsMultiPart != false) {\n            size += 1 + 1;\n          
}\n          if (ContentType.Length != 0) {\n            size += 1 + pb::CodedOutputStream.ComputeStringSize(ContentType);\n          }\n          if (Size != 0UL) {\n            size += 1 + pb::CodedOutputStream.ComputeUInt64Size(Size);\n          }\n          if (Seq != 0UL) {\n            size += 1 + pb::CodedOutputStream.ComputeUInt64Size(Seq);\n          }\n          if (FileName.Length != 0) {\n            size += 1 + pb::CodedOutputStream.ComputeStringSize(FileName);\n          }\n          if (FileType.Length != 0) {\n            size += 1 + pb::CodedOutputStream.ComputeStringSize(FileType);\n          }\n          if (Md5.Length != 0) {\n            size += 1 + pb::CodedOutputStream.ComputeStringSize(Md5);\n          }\n          if (Description.Length != 0) {\n            size += 1 + pb::CodedOutputStream.ComputeStringSize(Description);\n          }\n          size += details_.CalculateSize(_repeated_details_codec);\n          return size;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(MetaData other) {\n          if (other == null) {\n            return;\n          }\n          if (other.IsMultiPart != false) {\n            IsMultiPart = other.IsMultiPart;\n          }\n          if (other.ContentType.Length != 0) {\n            ContentType = other.ContentType;\n          }\n          if (other.Size != 0UL) {\n            Size = other.Size;\n          }\n          if (other.Seq != 0UL) {\n            Seq = other.Seq;\n          }\n          if (other.FileName.Length != 0) {\n            FileName = other.FileName;\n          }\n          if (other.FileType.Length != 0) {\n            FileType = other.FileType;\n          }\n          if (other.Md5.Length != 0) {\n            Md5 = other.Md5;\n          }\n          if (other.Description.Length != 0) {\n            Description = other.Description;\n          }\n          details_.Add(other.details_);\n        }\n\n        
[global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(pb::CodedInputStream input) {\n          uint tag;\n          while ((tag = input.ReadTag()) != 0) {\n            switch(tag) {\n              default:\n                input.SkipLastField();\n                break;\n              case 8: {\n                IsMultiPart = input.ReadBool();\n                break;\n              }\n              case 18: {\n                ContentType = input.ReadString();\n                break;\n              }\n              case 24: {\n                Size = input.ReadUInt64();\n                break;\n              }\n              case 32: {\n                Seq = input.ReadUInt64();\n                break;\n              }\n              case 42: {\n                FileName = input.ReadString();\n                break;\n              }\n              case 50: {\n                FileType = input.ReadString();\n                break;\n              }\n              case 58: {\n                Md5 = input.ReadString();\n                break;\n              }\n              case 66: {\n                Description = input.ReadString();\n                break;\n              }\n              case 74: {\n                details_.AddEntriesFrom(input, _repeated_details_codec);\n                break;\n              }\n            }\n          }\n        }\n\n      }\n\n      public sealed partial class Metric : pb::IMessage<Metric> {\n        private static readonly pb::MessageParser<Metric> _parser = new pb::MessageParser<Metric>(() => new Metric());\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pb::MessageParser<Metric> Parser { get { return _parser; } }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static pbr::MessageDescriptor Descriptor {\n          get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Descriptor.NestedTypes[6]; }\n        }\n\n        
[global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        pbr::MessageDescriptor pb::IMessage.Descriptor {\n          get { return Descriptor; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public Metric() {\n          OnConstruction();\n        }\n\n        partial void OnConstruction();\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public Metric(Metric other) : this() {\n          name_ = other.name_;\n          alias_ = other.alias_;\n          timestamp_ = other.timestamp_;\n          datatype_ = other.datatype_;\n          isHistorical_ = other.isHistorical_;\n          isTransient_ = other.isTransient_;\n          isNull_ = other.isNull_;\n          Metadata = other.metadata_ != null ? other.Metadata.Clone() : null;\n          Properties = other.properties_ != null ? other.Properties.Clone() : null;\n          switch (other.ValueCase) {\n            case ValueOneofCase.IntValue:\n              IntValue = other.IntValue;\n              break;\n            case ValueOneofCase.LongValue:\n              LongValue = other.LongValue;\n              break;\n            case ValueOneofCase.FloatValue:\n              FloatValue = other.FloatValue;\n              break;\n            case ValueOneofCase.DoubleValue:\n              DoubleValue = other.DoubleValue;\n              break;\n            case ValueOneofCase.BooleanValue:\n              BooleanValue = other.BooleanValue;\n              break;\n            case ValueOneofCase.StringValue:\n              StringValue = other.StringValue;\n              break;\n            case ValueOneofCase.BytesValue:\n              BytesValue = other.BytesValue;\n              break;\n            case ValueOneofCase.DatasetValue:\n              DatasetValue = other.DatasetValue.Clone();\n              break;\n            case ValueOneofCase.TemplateValue:\n              TemplateValue = other.TemplateValue.Clone();\n              break;\n        
    case ValueOneofCase.ExtensionValue:\n              ExtensionValue = other.ExtensionValue.Clone();\n              break;\n          }\n\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public Metric Clone() {\n          return new Metric(this);\n        }\n\n        /// <summary>Field number for the \"name\" field.</summary>\n        public const int NameFieldNumber = 1;\n        private string name_ = \"\";\n        /// <summary>\n        /// Metric name - should only be included on birth\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public string Name {\n          get { return name_; }\n          set {\n            name_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n          }\n        }\n\n        /// <summary>Field number for the \"alias\" field.</summary>\n        public const int AliasFieldNumber = 2;\n        private ulong alias_;\n        /// <summary>\n        /// Metric alias - tied to name on birth and included in all later DATA messages\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public ulong Alias {\n          get { return alias_; }\n          set {\n            alias_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"timestamp\" field.</summary>\n        public const int TimestampFieldNumber = 3;\n        private ulong timestamp_;\n        /// <summary>\n        /// Timestamp associated with data acquisition time\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public ulong Timestamp {\n          get { return timestamp_; }\n          set {\n            timestamp_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"datatype\" field.</summary>\n        public const int DatatypeFieldNumber = 4;\n        private uint datatype_;\n        /// <summary>\n        /// DataType of the 
metric/tag value\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public uint Datatype {\n          get { return datatype_; }\n          set {\n            datatype_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"is_historical\" field.</summary>\n        public const int IsHistoricalFieldNumber = 5;\n        private bool isHistorical_;\n        /// <summary>\n        /// If this is historical data and should not update real time tag\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool IsHistorical {\n          get { return isHistorical_; }\n          set {\n            isHistorical_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"is_transient\" field.</summary>\n        public const int IsTransientFieldNumber = 6;\n        private bool isTransient_;\n        /// <summary>\n        /// Tells consuming clients such as MQTT Engine to not store this as a tag\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool IsTransient {\n          get { return isTransient_; }\n          set {\n            isTransient_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"is_null\" field.</summary>\n        public const int IsNullFieldNumber = 7;\n        private bool isNull_;\n        /// <summary>\n        /// If this is null - explicitly say so rather than using -1, false, etc for some datatypes.\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool IsNull {\n          get { return isNull_; }\n          set {\n            isNull_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"metadata\" field.</summary>\n        public const int MetadataFieldNumber = 8;\n        private global::Org.Eclipse.Tahu.Protobuf.Payload.Types.MetaData 
metadata_;\n        /// <summary>\n        /// Metadata for the payload\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public global::Org.Eclipse.Tahu.Protobuf.Payload.Types.MetaData Metadata {\n          get { return metadata_; }\n          set {\n            metadata_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"properties\" field.</summary>\n        public const int PropertiesFieldNumber = 9;\n        private global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet properties_;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet Properties {\n          get { return properties_; }\n          set {\n            properties_ = value;\n          }\n        }\n\n        /// <summary>Field number for the \"int_value\" field.</summary>\n        public const int IntValueFieldNumber = 10;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public uint IntValue {\n          get { return valueCase_ == ValueOneofCase.IntValue ? (uint) value_ : 0; }\n          set {\n            value_ = value;\n            valueCase_ = ValueOneofCase.IntValue;\n          }\n        }\n\n        /// <summary>Field number for the \"long_value\" field.</summary>\n        public const int LongValueFieldNumber = 11;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public ulong LongValue {\n          get { return valueCase_ == ValueOneofCase.LongValue ? 
(ulong) value_ : 0UL; }\n          set {\n            value_ = value;\n            valueCase_ = ValueOneofCase.LongValue;\n          }\n        }\n\n        /// <summary>Field number for the \"float_value\" field.</summary>\n        public const int FloatValueFieldNumber = 12;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public float FloatValue {\n          get { return valueCase_ == ValueOneofCase.FloatValue ? (float) value_ : 0F; }\n          set {\n            value_ = value;\n            valueCase_ = ValueOneofCase.FloatValue;\n          }\n        }\n\n        /// <summary>Field number for the \"double_value\" field.</summary>\n        public const int DoubleValueFieldNumber = 13;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public double DoubleValue {\n          get { return valueCase_ == ValueOneofCase.DoubleValue ? (double) value_ : 0D; }\n          set {\n            value_ = value;\n            valueCase_ = ValueOneofCase.DoubleValue;\n          }\n        }\n\n        /// <summary>Field number for the \"boolean_value\" field.</summary>\n        public const int BooleanValueFieldNumber = 14;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool BooleanValue {\n          get { return valueCase_ == ValueOneofCase.BooleanValue ? (bool) value_ : false; }\n          set {\n            value_ = value;\n            valueCase_ = ValueOneofCase.BooleanValue;\n          }\n        }\n\n        /// <summary>Field number for the \"string_value\" field.</summary>\n        public const int StringValueFieldNumber = 15;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public string StringValue {\n          get { return valueCase_ == ValueOneofCase.StringValue ? 
(string) value_ : \"\"; }\n          set {\n            value_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n            valueCase_ = ValueOneofCase.StringValue;\n          }\n        }\n\n        /// <summary>Field number for the \"bytes_value\" field.</summary>\n        public const int BytesValueFieldNumber = 16;\n        /// <summary>\n        /// Bytes, File\n        /// </summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public pb::ByteString BytesValue {\n          get { return valueCase_ == ValueOneofCase.BytesValue ? (pb::ByteString) value_ : pb::ByteString.Empty; }\n          set {\n            value_ = pb::ProtoPreconditions.CheckNotNull(value, \"value\");\n            valueCase_ = ValueOneofCase.BytesValue;\n          }\n        }\n\n        /// <summary>Field number for the \"dataset_value\" field.</summary>\n        public const int DatasetValueFieldNumber = 17;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet DatasetValue {\n          get { return valueCase_ == ValueOneofCase.DatasetValue ? (global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet) value_ : null; }\n          set {\n            value_ = value;\n            valueCase_ = value == null ? ValueOneofCase.None : ValueOneofCase.DatasetValue;\n          }\n        }\n\n        /// <summary>Field number for the \"template_value\" field.</summary>\n        public const int TemplateValueFieldNumber = 18;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template TemplateValue {\n          get { return valueCase_ == ValueOneofCase.TemplateValue ? (global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template) value_ : null; }\n          set {\n            value_ = value;\n            valueCase_ = value == null ? 
ValueOneofCase.None : ValueOneofCase.TemplateValue;\n          }\n        }\n\n        /// <summary>Field number for the \"extension_value\" field.</summary>\n        public const int ExtensionValueFieldNumber = 19;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric.Types.MetricValueExtension ExtensionValue {\n          get { return valueCase_ == ValueOneofCase.ExtensionValue ? (global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric.Types.MetricValueExtension) value_ : null; }\n          set {\n            value_ = value;\n            valueCase_ = value == null ? ValueOneofCase.None : ValueOneofCase.ExtensionValue;\n          }\n        }\n\n        private object value_;\n        /// <summary>Enum of possible cases for the \"value\" oneof.</summary>\n        public enum ValueOneofCase {\n          None = 0,\n          IntValue = 10,\n          LongValue = 11,\n          FloatValue = 12,\n          DoubleValue = 13,\n          BooleanValue = 14,\n          StringValue = 15,\n          BytesValue = 16,\n          DatasetValue = 17,\n          TemplateValue = 18,\n          ExtensionValue = 19,\n        }\n        private ValueOneofCase valueCase_ = ValueOneofCase.None;\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public ValueOneofCase ValueCase {\n          get { return valueCase_; }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void ClearValue() {\n          valueCase_ = ValueOneofCase.None;\n          value_ = null;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override bool Equals(object other) {\n          return Equals(other as Metric);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public bool Equals(Metric other) {\n          if (ReferenceEquals(other, null)) {\n            return false;\n          }\n   
       if (ReferenceEquals(other, this)) {\n            return true;\n          }\n          if (Name != other.Name) return false;\n          if (Alias != other.Alias) return false;\n          if (Timestamp != other.Timestamp) return false;\n          if (Datatype != other.Datatype) return false;\n          if (IsHistorical != other.IsHistorical) return false;\n          if (IsTransient != other.IsTransient) return false;\n          if (IsNull != other.IsNull) return false;\n          if (!object.Equals(Metadata, other.Metadata)) return false;\n          if (!object.Equals(Properties, other.Properties)) return false;\n          if (IntValue != other.IntValue) return false;\n          if (LongValue != other.LongValue) return false;\n          if (FloatValue != other.FloatValue) return false;\n          if (DoubleValue != other.DoubleValue) return false;\n          if (BooleanValue != other.BooleanValue) return false;\n          if (StringValue != other.StringValue) return false;\n          if (BytesValue != other.BytesValue) return false;\n          if (!object.Equals(DatasetValue, other.DatasetValue)) return false;\n          if (!object.Equals(TemplateValue, other.TemplateValue)) return false;\n          if (!object.Equals(ExtensionValue, other.ExtensionValue)) return false;\n          if (ValueCase != other.ValueCase) return false;\n          return true;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override int GetHashCode() {\n          int hash = 1;\n          if (Name.Length != 0) hash ^= Name.GetHashCode();\n          if (Alias != 0UL) hash ^= Alias.GetHashCode();\n          if (Timestamp != 0UL) hash ^= Timestamp.GetHashCode();\n          if (Datatype != 0) hash ^= Datatype.GetHashCode();\n          if (IsHistorical != false) hash ^= IsHistorical.GetHashCode();\n          if (IsTransient != false) hash ^= IsTransient.GetHashCode();\n          if (IsNull != false) hash ^= IsNull.GetHashCode();\n          if 
(metadata_ != null) hash ^= Metadata.GetHashCode();\n          if (properties_ != null) hash ^= Properties.GetHashCode();\n          if (valueCase_ == ValueOneofCase.IntValue) hash ^= IntValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.LongValue) hash ^= LongValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.FloatValue) hash ^= FloatValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.DoubleValue) hash ^= DoubleValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.BooleanValue) hash ^= BooleanValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.StringValue) hash ^= StringValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.BytesValue) hash ^= BytesValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.DatasetValue) hash ^= DatasetValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.TemplateValue) hash ^= TemplateValue.GetHashCode();\n          if (valueCase_ == ValueOneofCase.ExtensionValue) hash ^= ExtensionValue.GetHashCode();\n          hash ^= (int) valueCase_;\n          return hash;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public override string ToString() {\n          return pb::JsonFormatter.ToDiagnosticString(this);\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void WriteTo(pb::CodedOutputStream output) {\n          if (Name.Length != 0) {\n            output.WriteRawTag(10);\n            output.WriteString(Name);\n          }\n          if (Alias != 0UL) {\n            output.WriteRawTag(16);\n            output.WriteUInt64(Alias);\n          }\n          if (Timestamp != 0UL) {\n            output.WriteRawTag(24);\n            output.WriteUInt64(Timestamp);\n          }\n          if (Datatype != 0) {\n            output.WriteRawTag(32);\n            output.WriteUInt32(Datatype);\n          }\n          if (IsHistorical != false) {\n            
output.WriteRawTag(40);\n            output.WriteBool(IsHistorical);\n          }\n          if (IsTransient != false) {\n            output.WriteRawTag(48);\n            output.WriteBool(IsTransient);\n          }\n          if (IsNull != false) {\n            output.WriteRawTag(56);\n            output.WriteBool(IsNull);\n          }\n          if (metadata_ != null) {\n            output.WriteRawTag(66);\n            output.WriteMessage(Metadata);\n          }\n          if (properties_ != null) {\n            output.WriteRawTag(74);\n            output.WriteMessage(Properties);\n          }\n          if (valueCase_ == ValueOneofCase.IntValue) {\n            output.WriteRawTag(80);\n            output.WriteUInt32(IntValue);\n          }\n          if (valueCase_ == ValueOneofCase.LongValue) {\n            output.WriteRawTag(88);\n            output.WriteUInt64(LongValue);\n          }\n          if (valueCase_ == ValueOneofCase.FloatValue) {\n            output.WriteRawTag(101);\n            output.WriteFloat(FloatValue);\n          }\n          if (valueCase_ == ValueOneofCase.DoubleValue) {\n            output.WriteRawTag(105);\n            output.WriteDouble(DoubleValue);\n          }\n          if (valueCase_ == ValueOneofCase.BooleanValue) {\n            output.WriteRawTag(112);\n            output.WriteBool(BooleanValue);\n          }\n          if (valueCase_ == ValueOneofCase.StringValue) {\n            output.WriteRawTag(122);\n            output.WriteString(StringValue);\n          }\n          if (valueCase_ == ValueOneofCase.BytesValue) {\n            output.WriteRawTag(130, 1);\n            output.WriteBytes(BytesValue);\n          }\n          if (valueCase_ == ValueOneofCase.DatasetValue) {\n            output.WriteRawTag(138, 1);\n            output.WriteMessage(DatasetValue);\n          }\n          if (valueCase_ == ValueOneofCase.TemplateValue) {\n            output.WriteRawTag(146, 1);\n            output.WriteMessage(TemplateValue);\n       
   }\n          if (valueCase_ == ValueOneofCase.ExtensionValue) {\n            output.WriteRawTag(154, 1);\n            output.WriteMessage(ExtensionValue);\n          }\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public int CalculateSize() {\n          int size = 0;\n          if (Name.Length != 0) {\n            size += 1 + pb::CodedOutputStream.ComputeStringSize(Name);\n          }\n          if (Alias != 0UL) {\n            size += 1 + pb::CodedOutputStream.ComputeUInt64Size(Alias);\n          }\n          if (Timestamp != 0UL) {\n            size += 1 + pb::CodedOutputStream.ComputeUInt64Size(Timestamp);\n          }\n          if (Datatype != 0) {\n            size += 1 + pb::CodedOutputStream.ComputeUInt32Size(Datatype);\n          }\n          if (IsHistorical != false) {\n            size += 1 + 1;\n          }\n          if (IsTransient != false) {\n            size += 1 + 1;\n          }\n          if (IsNull != false) {\n            size += 1 + 1;\n          }\n          if (metadata_ != null) {\n            size += 1 + pb::CodedOutputStream.ComputeMessageSize(Metadata);\n          }\n          if (properties_ != null) {\n            size += 1 + pb::CodedOutputStream.ComputeMessageSize(Properties);\n          }\n          if (valueCase_ == ValueOneofCase.IntValue) {\n            size += 1 + pb::CodedOutputStream.ComputeUInt32Size(IntValue);\n          }\n          if (valueCase_ == ValueOneofCase.LongValue) {\n            size += 1 + pb::CodedOutputStream.ComputeUInt64Size(LongValue);\n          }\n          if (valueCase_ == ValueOneofCase.FloatValue) {\n            size += 1 + 4;\n          }\n          if (valueCase_ == ValueOneofCase.DoubleValue) {\n            size += 1 + 8;\n          }\n          if (valueCase_ == ValueOneofCase.BooleanValue) {\n            size += 1 + 1;\n          }\n          if (valueCase_ == ValueOneofCase.StringValue) {\n            size += 1 + 
pb::CodedOutputStream.ComputeStringSize(StringValue);\n          }\n          if (valueCase_ == ValueOneofCase.BytesValue) {\n            size += 2 + pb::CodedOutputStream.ComputeBytesSize(BytesValue);\n          }\n          if (valueCase_ == ValueOneofCase.DatasetValue) {\n            size += 2 + pb::CodedOutputStream.ComputeMessageSize(DatasetValue);\n          }\n          if (valueCase_ == ValueOneofCase.TemplateValue) {\n            size += 2 + pb::CodedOutputStream.ComputeMessageSize(TemplateValue);\n          }\n          if (valueCase_ == ValueOneofCase.ExtensionValue) {\n            size += 2 + pb::CodedOutputStream.ComputeMessageSize(ExtensionValue);\n          }\n          return size;\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(Metric other) {\n          if (other == null) {\n            return;\n          }\n          if (other.Name.Length != 0) {\n            Name = other.Name;\n          }\n          if (other.Alias != 0UL) {\n            Alias = other.Alias;\n          }\n          if (other.Timestamp != 0UL) {\n            Timestamp = other.Timestamp;\n          }\n          if (other.Datatype != 0) {\n            Datatype = other.Datatype;\n          }\n          if (other.IsHistorical != false) {\n            IsHistorical = other.IsHistorical;\n          }\n          if (other.IsTransient != false) {\n            IsTransient = other.IsTransient;\n          }\n          if (other.IsNull != false) {\n            IsNull = other.IsNull;\n          }\n          if (other.metadata_ != null) {\n            if (metadata_ == null) {\n              metadata_ = new global::Org.Eclipse.Tahu.Protobuf.Payload.Types.MetaData();\n            }\n            Metadata.MergeFrom(other.Metadata);\n          }\n          if (other.properties_ != null) {\n            if (properties_ == null) {\n              properties_ = new global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet();\n            
}\n            Properties.MergeFrom(other.Properties);\n          }\n          switch (other.ValueCase) {\n            case ValueOneofCase.IntValue:\n              IntValue = other.IntValue;\n              break;\n            case ValueOneofCase.LongValue:\n              LongValue = other.LongValue;\n              break;\n            case ValueOneofCase.FloatValue:\n              FloatValue = other.FloatValue;\n              break;\n            case ValueOneofCase.DoubleValue:\n              DoubleValue = other.DoubleValue;\n              break;\n            case ValueOneofCase.BooleanValue:\n              BooleanValue = other.BooleanValue;\n              break;\n            case ValueOneofCase.StringValue:\n              StringValue = other.StringValue;\n              break;\n            case ValueOneofCase.BytesValue:\n              BytesValue = other.BytesValue;\n              break;\n            case ValueOneofCase.DatasetValue:\n              DatasetValue = other.DatasetValue;\n              break;\n            case ValueOneofCase.TemplateValue:\n              TemplateValue = other.TemplateValue;\n              break;\n            case ValueOneofCase.ExtensionValue:\n              ExtensionValue = other.ExtensionValue;\n              break;\n          }\n\n        }\n\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public void MergeFrom(pb::CodedInputStream input) {\n          uint tag;\n          while ((tag = input.ReadTag()) != 0) {\n            switch(tag) {\n              default:\n                input.SkipLastField();\n                break;\n              case 10: {\n                Name = input.ReadString();\n                break;\n              }\n              case 16: {\n                Alias = input.ReadUInt64();\n                break;\n              }\n              case 24: {\n                Timestamp = input.ReadUInt64();\n                break;\n              }\n              case 32: {\n                Datatype 
= input.ReadUInt32();\n                break;\n              }\n              case 40: {\n                IsHistorical = input.ReadBool();\n                break;\n              }\n              case 48: {\n                IsTransient = input.ReadBool();\n                break;\n              }\n              case 56: {\n                IsNull = input.ReadBool();\n                break;\n              }\n              case 66: {\n                if (metadata_ == null) {\n                  metadata_ = new global::Org.Eclipse.Tahu.Protobuf.Payload.Types.MetaData();\n                }\n                input.ReadMessage(metadata_);\n                break;\n              }\n              case 74: {\n                if (properties_ == null) {\n                  properties_ = new global::Org.Eclipse.Tahu.Protobuf.Payload.Types.PropertySet();\n                }\n                input.ReadMessage(properties_);\n                break;\n              }\n              case 80: {\n                IntValue = input.ReadUInt32();\n                break;\n              }\n              case 88: {\n                LongValue = input.ReadUInt64();\n                break;\n              }\n              case 101: {\n                FloatValue = input.ReadFloat();\n                break;\n              }\n              case 105: {\n                DoubleValue = input.ReadDouble();\n                break;\n              }\n              case 112: {\n                BooleanValue = input.ReadBool();\n                break;\n              }\n              case 122: {\n                StringValue = input.ReadString();\n                break;\n              }\n              case 130: {\n                BytesValue = input.ReadBytes();\n                break;\n              }\n              case 138: {\n                global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet subBuilder = new global::Org.Eclipse.Tahu.Protobuf.Payload.Types.DataSet();\n                if (valueCase_ == 
ValueOneofCase.DatasetValue) {\n                  subBuilder.MergeFrom(DatasetValue);\n                }\n                input.ReadMessage(subBuilder);\n                DatasetValue = subBuilder;\n                break;\n              }\n              case 146: {\n                global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template subBuilder = new global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Template();\n                if (valueCase_ == ValueOneofCase.TemplateValue) {\n                  subBuilder.MergeFrom(TemplateValue);\n                }\n                input.ReadMessage(subBuilder);\n                TemplateValue = subBuilder;\n                break;\n              }\n              case 154: {\n                global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric.Types.MetricValueExtension subBuilder = new global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric.Types.MetricValueExtension();\n                if (valueCase_ == ValueOneofCase.ExtensionValue) {\n                  subBuilder.MergeFrom(ExtensionValue);\n                }\n                input.ReadMessage(subBuilder);\n                ExtensionValue = subBuilder;\n                break;\n              }\n            }\n          }\n        }\n\n        #region Nested types\n        /// <summary>Container for nested types declared in the Metric message type.</summary>\n        [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n        public static partial class Types {\n          public sealed partial class MetricValueExtension : pb::IMessage<MetricValueExtension> {\n            private static readonly pb::MessageParser<MetricValueExtension> _parser = new pb::MessageParser<MetricValueExtension>(() => new MetricValueExtension());\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public static pb::MessageParser<MetricValueExtension> Parser { get { return _parser; } }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n          
  public static pbr::MessageDescriptor Descriptor {\n              get { return global::Org.Eclipse.Tahu.Protobuf.Payload.Types.Metric.Descriptor.NestedTypes[0]; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            pbr::MessageDescriptor pb::IMessage.Descriptor {\n              get { return Descriptor; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public MetricValueExtension() {\n              OnConstruction();\n            }\n\n            partial void OnConstruction();\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public MetricValueExtension(MetricValueExtension other) : this() {\n              details_ = other.details_.Clone();\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public MetricValueExtension Clone() {\n              return new MetricValueExtension(this);\n            }\n\n            /// <summary>Field number for the \"details\" field.</summary>\n            public const int DetailsFieldNumber = 1;\n            private static readonly pb::FieldCodec<global::Google.Protobuf.WellKnownTypes.Any> _repeated_details_codec\n                = pb::FieldCodec.ForMessage(10, global::Google.Protobuf.WellKnownTypes.Any.Parser);\n            private readonly pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> details_ = new pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any>();\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public pbc::RepeatedField<global::Google.Protobuf.WellKnownTypes.Any> Details {\n              get { return details_; }\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override bool Equals(object other) {\n              return Equals(other as MetricValueExtension);\n            }\n\n            
[global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public bool Equals(MetricValueExtension other) {\n              if (ReferenceEquals(other, null)) {\n                return false;\n              }\n              if (ReferenceEquals(other, this)) {\n                return true;\n              }\n              if(!details_.Equals(other.details_)) return false;\n              return true;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override int GetHashCode() {\n              int hash = 1;\n              hash ^= details_.GetHashCode();\n              return hash;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public override string ToString() {\n              return pb::JsonFormatter.ToDiagnosticString(this);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void WriteTo(pb::CodedOutputStream output) {\n              details_.WriteTo(output, _repeated_details_codec);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public int CalculateSize() {\n              int size = 0;\n              size += details_.CalculateSize(_repeated_details_codec);\n              return size;\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void MergeFrom(MetricValueExtension other) {\n              if (other == null) {\n                return;\n              }\n              details_.Add(other.details_);\n            }\n\n            [global::System.Diagnostics.DebuggerNonUserCodeAttribute]\n            public void MergeFrom(pb::CodedInputStream input) {\n              uint tag;\n              while ((tag = input.ReadTag()) != 0) {\n                switch(tag) {\n                  default:\n                    input.SkipLastField();\n                    break;\n                  case 10: 
{\n                    details_.AddEntriesFrom(input, _repeated_details_codec);\n                    break;\n                  }\n                }\n              }\n            }\n\n          }\n\n        }\n        #endregion\n\n      }\n\n    }\n    #endregion\n\n  }\n\n  #endregion\n\n}\n\n#endregion Designer generated code\n"
  },
  {
    "path": "edl-v10.html",
    "content": "Eclipse Distribution License - v 1.0\n\nCopyright (c) 2007, Eclipse Foundation, Inc. and its licensors.\n\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:\n\n    Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.\n    Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.\n    Neither the name of the Eclipse Foundation, Inc. nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n"
  },
  {
    "path": "epl-v20.html",
    "content": "<!DOCTYPE html PUBLIC \"-//W3C//DTD XHTML 1.0 Strict//EN\" \"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd\">\n<html xmlns=\"http://www.w3.org/1999/xhtml\" xml:lang=\"en\" lang=\"en\">\n  <head>\n    <meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\" />\n    <title>Eclipse Public License - Version 2.0</title>\n    <style type=\"text/css\">\n      body {\n        margin: 1.5em 3em;\n      }\n      h1{\n        font-size:1.5em;\n      }\n      h2{\n        font-size:1em;\n        margin-bottom:0.5em;\n        margin-top:1em;\n      }\n      p {\n        margin-top:  0.5em;\n        margin-bottom: 0.5em;\n      }\n      ul, ol{\n        list-style-type:none;\n      }\n    </style>\n  </head>\n  <body>\n    <h1>Eclipse Public License - v 2.0</h1>\n    <p>THE ACCOMPANYING PROGRAM IS PROVIDED UNDER THE TERMS OF THIS ECLIPSE\n      PUBLIC LICENSE (&ldquo;AGREEMENT&rdquo;). ANY USE, REPRODUCTION OR DISTRIBUTION\n      OF THE PROGRAM CONSTITUTES RECIPIENT&#039;S ACCEPTANCE OF THIS AGREEMENT.\n    </p>\n    <h2 id=\"definitions\">1. DEFINITIONS</h2>\n    <p>&ldquo;Contribution&rdquo; means:</p>\n    <ul>\n      <li>a) in the case of the initial Contributor, the initial content\n        Distributed under this Agreement, and\n      </li>\n      <li>\n        b) in the case of each subsequent Contributor:\n        <ul>\n          <li>i) changes to the Program, and</li>\n          <li>ii) additions to the Program;</li>\n        </ul>\n        where such changes and/or additions to the Program originate from\n        and are Distributed by that particular Contributor. 
A Contribution\n        &ldquo;originates&rdquo; from a Contributor if it was added to the Program by such\n        Contributor itself or anyone acting on such Contributor&#039;s behalf.\n        Contributions do not include changes or additions to the Program that\n        are not Modified Works.\n      </li>\n    </ul>\n    <p>&ldquo;Contributor&rdquo; means any person or entity that Distributes the Program.</p>\n    <p>&ldquo;Licensed Patents&rdquo; mean patent claims licensable by a Contributor which\n      are necessarily infringed by the use or sale of its Contribution alone\n      or when combined with the Program.\n    </p>\n    <p>&ldquo;Program&rdquo; means the Contributions Distributed in accordance with this\n      Agreement.\n    </p>\n    <p>&ldquo;Recipient&rdquo; means anyone who receives the Program under this Agreement\n      or any Secondary License (as applicable), including Contributors.\n    </p>\n    <p>&ldquo;Derivative Works&rdquo; shall mean any work, whether in Source Code or other\n      form, that is based on (or derived from) the Program and for which the\n      editorial revisions, annotations, elaborations, or other modifications\n      represent, as a whole, an original work of authorship.\n    </p>\n    <p>&ldquo;Modified Works&rdquo; shall mean any work in Source Code or other form that\n      results from an addition to, deletion from, or modification of the\n      contents of the Program, including, for purposes of clarity any new file\n      in Source Code form that contains any contents of the Program. 
Modified\n      Works shall not include works that contain only declarations, interfaces,\n      types, classes, structures, or files of the Program solely in each case\n      in order to link to, bind by name, or subclass the Program or Modified\n      Works thereof.\n    </p>\n    <p>&ldquo;Distribute&rdquo; means the acts of a) distributing or b) making available\n      in any manner that enables the transfer of a copy.\n    </p>\n    <p>&ldquo;Source Code&rdquo; means the form of a Program preferred for making\n      modifications, including but not limited to software source code,\n      documentation source, and configuration files.\n    </p>\n    <p>&ldquo;Secondary License&rdquo; means either the GNU General Public License,\n      Version 2.0, or any later versions of that license, including any\n      exceptions or additional permissions as identified by the initial\n      Contributor.\n    </p>\n    <h2 id=\"grant-of-rights\">2. GRANT OF RIGHTS</h2>\n    <ul>\n      <li>a) Subject to the terms of this Agreement, each Contributor hereby\n        grants Recipient a non-exclusive, worldwide, royalty-free copyright\n        license to reproduce, prepare Derivative Works of, publicly display,\n        publicly perform, Distribute and sublicense the Contribution of such\n        Contributor, if any, and such Derivative Works.\n      </li>\n      <li>b) Subject to the terms of this Agreement, each Contributor hereby\n        grants Recipient a non-exclusive, worldwide, royalty-free patent\n        license under Licensed Patents to make, use, sell, offer to sell,\n        import and otherwise transfer the Contribution of such Contributor,\n        if any, in Source Code or other form. This patent license shall\n        apply to the combination of the Contribution and the Program if,\n        at the time the Contribution is added by the Contributor, such\n        addition of the Contribution causes such combination to be covered\n        by the Licensed Patents. 
The patent license shall not apply to any\n        other combinations which include the Contribution. No hardware per\n        se is licensed hereunder.\n      </li>\n      <li>c) Recipient understands that although each Contributor grants the\n        licenses to its Contributions set forth herein, no assurances are\n        provided by any Contributor that the Program does not infringe the\n        patent or other intellectual property rights of any other entity.\n        Each Contributor disclaims any liability to Recipient for claims\n        brought by any other entity based on infringement of intellectual\n        property rights or otherwise. As a condition to exercising the rights\n        and licenses granted hereunder, each Recipient hereby assumes sole\n        responsibility to secure any other intellectual property rights needed,\n        if any. For example, if a third party patent license is required to\n        allow Recipient to Distribute the Program, it is Recipient&#039;s\n        responsibility to acquire that license before distributing the Program.\n      </li>\n      <li>d) Each Contributor represents that to its knowledge it has sufficient\n        copyright rights in its Contribution, if any, to grant the copyright\n        license set forth in this Agreement.\n      </li>\n      <li>e) Notwithstanding the terms of any Secondary License, no Contributor\n        makes additional grants to any Recipient (other than those set forth\n        in this Agreement) as a result of such Recipient&#039;s receipt of the\n        Program under the terms of a Secondary License (if permitted under\n        the terms of Section 3).\n      </li>\n    </ul>\n    <h2 id=\"requirements\">3. 
REQUIREMENTS</h2>\n    <p>3.1 If a Contributor Distributes the Program in any form, then:</p>\n    <ul>\n      <li>a) the Program must also be made available as Source Code, in\n        accordance with section 3.2, and the Contributor must accompany\n        the Program with a statement that the Source Code for the Program\n        is available under this Agreement, and informs Recipients how to\n        obtain it in a reasonable manner on or through a medium customarily\n        used for software exchange; and\n      </li>\n      <li>\n        b) the Contributor may Distribute the Program under a license\n        different than this Agreement, provided that such license:\n        <ul>\n          <li>i) effectively disclaims on behalf of all other Contributors all\n            warranties and conditions, express and implied, including warranties\n            or conditions of title and non-infringement, and implied warranties\n            or conditions of merchantability and fitness for a particular purpose;\n          </li>\n          <li>ii) effectively excludes on behalf of all other Contributors all\n            liability for damages, including direct, indirect, special, incidental\n            and consequential damages, such as lost profits;\n          </li>\n          <li>iii) does not attempt to limit or alter the recipients&#039; rights in the\n            Source Code under section 3.2; and\n          </li>\n          <li>iv) requires any subsequent distribution of the Program by any party\n            to be under a license that satisfies the requirements of this section 3.\n          </li>\n        </ul>\n      </li>\n    </ul>\n    <p>3.2 When the Program is Distributed as Source Code:</p>\n    <ul>\n      <li>a) it must be made available under this Agreement, or if the Program (i)\n        is combined with other material in a separate file or files made available\n        under a Secondary License, and (ii) the initial Contributor attached to\n        the 
Source Code the notice described in Exhibit A of this Agreement,\n        then the Program may be made available under the terms of such\n        Secondary Licenses, and\n      </li>\n      <li>b) a copy of this Agreement must be included with each copy of the Program.</li>\n    </ul>\n    <p>3.3 Contributors may not remove or alter any copyright, patent, trademark,\n      attribution notices, disclaimers of warranty, or limitations of liability\n      (&lsquo;notices&rsquo;) contained within the Program from any copy of the Program which\n      they Distribute, provided that Contributors may add their own appropriate\n      notices.\n    </p>\n    <h2 id=\"commercial-distribution\">4. COMMERCIAL DISTRIBUTION</h2>\n    <p>Commercial distributors of software may accept certain responsibilities\n      with respect to end users, business partners and the like. While this\n      license is intended to facilitate the commercial use of the Program, the\n      Contributor who includes the Program in a commercial product offering should\n      do so in a manner which does not create potential liability for other\n      Contributors. Therefore, if a Contributor includes the Program in a\n      commercial product offering, such Contributor (&ldquo;Commercial Contributor&rdquo;)\n      hereby agrees to defend and indemnify every other Contributor\n      (&ldquo;Indemnified Contributor&rdquo;) against any losses, damages and costs\n      (collectively &ldquo;Losses&rdquo;) arising from claims, lawsuits and other legal actions\n      brought by a third party against the Indemnified Contributor to the extent\n      caused by the acts or omissions of such Commercial Contributor in connection\n      with its distribution of the Program in a commercial product offering.\n      The obligations in this section do not apply to any claims or Losses relating\n      to any actual or alleged intellectual property infringement. 
In order to\n      qualify, an Indemnified Contributor must: a) promptly notify the\n      Commercial Contributor in writing of such claim, and b) allow the Commercial\n      Contributor to control, and cooperate with the Commercial Contributor in,\n      the defense and any related settlement negotiations. The Indemnified\n      Contributor may participate in any such claim at its own expense.\n    </p>\n    <p>For example, a Contributor might include the Program\n      in a commercial product offering, Product X. That Contributor is then a\n      Commercial Contributor. If that Commercial Contributor then makes performance\n      claims, or offers warranties related to Product X, those performance claims\n      and warranties are such Commercial Contributor&#039;s responsibility alone.\n      Under this section, the Commercial Contributor would have to defend claims\n      against the other Contributors related to those performance claims and\n      warranties, and if a court requires any other Contributor to pay any damages\n      as a result, the Commercial Contributor must pay those damages.\n    </p>\n    <h2 id=\"warranty\">5. NO WARRANTY</h2>\n    <p>EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, AND TO THE EXTENT PERMITTED\n      BY APPLICABLE LAW, THE PROGRAM IS PROVIDED ON AN &ldquo;AS IS&rdquo; BASIS, WITHOUT\n      WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED INCLUDING,\n      WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF TITLE, NON-INFRINGEMENT,\n      MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. 
Each Recipient is\n      solely responsible for determining the appropriateness of using and\n      distributing the Program and assumes all risks associated with its\n      exercise of rights under this Agreement, including but not limited to the\n      risks and costs of program errors, compliance with applicable laws, damage\n      to or loss of data, programs or equipment, and unavailability or\n      interruption of operations.\n    </p>\n    <h2 id=\"disclaimer\">6. DISCLAIMER OF LIABILITY</h2>\n    <p>EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, AND TO THE EXTENT PERMITTED\n      BY APPLICABLE LAW, NEITHER RECIPIENT NOR ANY CONTRIBUTORS SHALL HAVE ANY\n      LIABILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY,\n      OR CONSEQUENTIAL DAMAGES (INCLUDING WITHOUT LIMITATION LOST PROFITS),\n      HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT\n      LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY\n      OUT OF THE USE OR DISTRIBUTION OF THE PROGRAM OR THE EXERCISE OF ANY RIGHTS\n      GRANTED HEREUNDER, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.\n    </p>\n    <h2 id=\"general\">7. 
GENERAL</h2>\n    <p>If any provision of this Agreement is invalid or unenforceable under\n      applicable law, it shall not affect the validity or enforceability of the\n      remainder of the terms of this Agreement, and without further action by the\n      parties hereto, such provision shall be reformed to the minimum extent\n      necessary to make such provision valid and enforceable.\n    </p>\n    <p>If Recipient institutes patent litigation against any entity (including a\n      cross-claim or counterclaim in a lawsuit) alleging that the Program itself\n      (excluding combinations of the Program with other software or hardware)\n      infringes such Recipient&#039;s patent(s), then such Recipient&#039;s rights granted\n      under Section 2(b) shall terminate as of the date such litigation is filed.\n    </p>\n    <p>All Recipient&#039;s rights under this Agreement shall terminate if it fails to\n      comply with any of the material terms or conditions of this Agreement and\n      does not cure such failure in a reasonable period of time after becoming\n      aware of such noncompliance. If all Recipient&#039;s rights under this Agreement\n      terminate, Recipient agrees to cease use and distribution of the Program\n      as soon as reasonably practicable. However, Recipient&#039;s obligations under\n      this Agreement and any licenses granted by Recipient relating to the\n      Program shall continue and survive.\n    </p>\n    <p>Everyone is permitted to copy and distribute copies of this Agreement,\n      but in order to avoid inconsistency the Agreement is copyrighted and may\n      only be modified in the following manner. The Agreement Steward reserves\n      the right to publish new versions (including revisions) of this Agreement\n      from time to time. No one other than the Agreement Steward has the right\n      to modify this Agreement. The Eclipse Foundation is the initial Agreement\n      Steward. 
The Eclipse Foundation may assign the responsibility to serve as\n      the Agreement Steward to a suitable separate entity. Each new version of\n      the Agreement will be given a distinguishing version number. The Program\n      (including Contributions) may always be Distributed subject to the version\n      of the Agreement under which it was received. In addition, after a new\n      version of the Agreement is published, Contributor may elect to Distribute\n      the Program (including its Contributions) under the new version.\n    </p>\n    <p>Except as expressly stated in Sections 2(a) and 2(b) above, Recipient\n      receives no rights or licenses to the intellectual property of any\n      Contributor under this Agreement, whether expressly, by implication,\n      estoppel or otherwise. All rights in the Program not expressly granted\n      under this Agreement are reserved. Nothing in this Agreement is intended\n      to be enforceable by any entity that is not a Contributor or Recipient.\n      No third-party beneficiary rights are created under this Agreement.\n    </p>\n    <h2 id=\"exhibit-a\">Exhibit A &ndash; Form of Secondary Licenses Notice</h2>\n    <p>&ldquo;This Source Code may also be made available under the following \n    \tSecondary Licenses when the conditions for such availability set forth \n    \tin the Eclipse Public License, v. 
2.0 are satisfied: {name license(s),\n    \tversion(s), and exceptions or additional permissions here}.&rdquo;\n    </p>\n    <blockquote>\n      <p>Simply including a copy of this Agreement, including this Exhibit A\n        is not sufficient to license the Source Code under Secondary Licenses.\n      </p>\n      <p>If it is not possible or desirable to put the notice in a particular file,\n        then You may include the notice in a location (such as a LICENSE file in a\n        relevant directory) where a recipient would be likely to look for\n        such a notice.\n      </p>\n      <p>You may add additional accurate notices of copyright ownership.</p>\n    </blockquote>\n  </body>\n</html>"
  },
  {
    "path": "java/.gitignore",
    "content": "DEPENDENCIES\nexamples/DEPENDENCIES\n"
  },
  {
    "path": "java/README.md",
    "content": "# Tahu Java Libraries and Implementations\n\nThese are the Java based Eclipse Sparkplug libraries, implementations, and examples.\n\n# Building\n\nFrom the git root directory run the following commands\n\n```\ncd java\nmvn clean install\n```\n\n# Eclipse Tahu Java Libraries\n\nThe Tahu Java implementation provides the following libraries. These can be used for developing custom Java based Sparkplug applications.\n\n* org.eclipse.tahu:tahu-core\n  * This is the core Sparklplug library to use for modeling, encoding, and decoding of Sparkplug topics and payloads\n* org.eclipse.tahu:tahu-edge\n  * This is the core Sparkplug library to use for implementing Sparkplug Edge Node Applications\n* org.eclipse.tahu:tahu-host\n  * This is the core Sparkplug library to use for implementing Sparkplug Host Applications\n\n# Eclipse Tahu Java Applications\n\nThe Tahu Java implementation provides the following Sparkplug compatible implementations. These are complete implementations that fully pass the Eclipse Sparkplug TCK here: https://github.com/eclipse-sparkplug/sparkplug/blob/master/tck/README.md.\n\n* org.eclipse.tahu:tahu-edge-compat\n  * This is a fully compliant Spark plug Edge Node Application that passes the Sparkplug TCK. 
It uses the RandomDataSimulator implementation of the DataSimulator interface to initially publish BIRTH messages and then periodically send DATA messages to an MQTT Server.\n  * To run:\n    ```\n    java -jar compat_impl/edge/target/tahu-edge-compat-1.0.1-SNAPSHOT.jar \n    ```\n  * The following config options exist for the Tahu Edge Node in compat_impl/edge/src/main/java/org/eclipse/tahu/edge/SparkplugEdgeNode.java\n    ```\n    private static final String COMMAND_LISTENER_DIRECTORY = \"/tmp/commands\";\n    private static final long COMMAND_LISTENER_POLL_RATE = 50L;\n\n    private static final String GROUP_ID = \"G1\";\n    private static final String EDGE_NODE_ID = \"E1\";\n    private static final EdgeNodeDescriptor EDGE_NODE_DESCRIPTOR = new EdgeNodeDescriptor(GROUP_ID, EDGE_NODE_ID);\n    private static final List<String> DEVICE_IDS = Arrays.asList(\"D1\");\n    private static final String PRIMARY_HOST_ID = \"IamHost\";\n    private static final boolean USE_ALIASES = true;\n    private static final Long REBIRTH_DEBOUNCE_DELAY = 5000L;\n\n    private static final MqttServerName MQTT_SERVER_NAME_1 = new MqttServerName(\"Mqtt Server One\");\n    private static final String MQTT_CLIENT_ID_1 = \"Sparkplug-Tahu-Compatible-Impl-One\";\n    private static final MqttServerUrl MQTT_SERVER_URL_1 = new MqttServerUrl(\"tcp://localhost:1883\");\n    private static final String USERNAME_1 = \"admin\";\n    private static final String PASSWORD_1 = \"changeme\";\n    private static final MqttServerName MQTT_SERVER_NAME_2 = new MqttServerName(\"Mqtt Server Two\");\n    private static final String MQTT_CLIENT_ID_2 = \"Sparkplug-Tahu-Compatible-Impl-Two\";\n    private static final MqttServerUrl MQTT_SERVER_URL_2 = new MqttServerUrl(\"tcp://localhost:1884\");\n    private static final String USERNAME_2 = \"admin\";\n    private static final String PASSWORD_2 = \"changeme\";\n    private static final int KEEP_ALIVE_TIMEOUT = 30;\n    private static final Topic NDEATH_TOPIC = 
new Topic(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX, GROUP_ID, EDGE_NODE_ID, MessageType.NDEATH);\n    ```\n* org.eclipse.tahu:tahu-host-compat\n  * This is a fully compliant Sparkplug Host Application that passes the Sparkplug TCK. It receives BIRTH and DATA messages and logs them to the console. It also handles message reordering and will send 'Node Control/Rebirth' requests in the event of invalid or out of order messages.\n  * To run:\n    ```\n    java -jar compat_impl/host/target/tahu-host-compat-1.0.1-SNAPSHOT.jar \n    ```\n  * The following config options exist for the Tahu Host Application in compat_impl/host/src/main/java/org/eclipse/tahu/host/SparkplugHostApplication.java\n    ```\n    private static final String COMMAND_LISTENER_DIRECTORY = \"/tmp/commands\";\n    private static final long COMMAND_LISTENER_POLL_RATE = 50L;\n\n    private static final String HOST_ID = \"IamHost\";\n    private static final String MQTT_SERVER_NAME_1 = \"Mqtt Server One\";\n    private static final String MQTT_CLIENT_ID_1 = \"Tahu_Host_Application\";\n    private static final String MQTT_SERVER_URL_1 = \"tcp://localhost:1883\";\n    private static final String USERNAME_1 = \"admin\";\n    private static final String PASSWORD_1 = \"changeme\";\n    private static final String MQTT_SERVER_NAME_2 = \"Mqtt Server Two\";\n    private static final String MQTT_CLIENT_ID_2 = \"Tahu_Host_Application\";\n    private static final String MQTT_SERVER_URL_2 = \"tcp://localhost:1884\";\n    private static final String USERNAME_2 = null;\n    private static final String PASSWORD_2 = null;\n    private static final int KEEP_ALIVE_TIMEOUT = 30;\n    ```\n  * The Sparkplug Tahu Host compatible implementation is capable of sending CMD messages to Edge Nodes. This is done using the filesystem and the configuration. By default, the Tahu Host Application looks in the '/tmp/commands' directory for files that fit a format to convert into a CMD message. 
This file location can be changed using the 'COMMAND_LISTENER_DIRECTORY' static variable. If this directory location is left unchanged, the following Linux scripts can be used to send CMD messages.\n    * Send an Edge Node 'Node Control/Rebirth' request\n      ```console\n      #!/bin/sh\n\n      TIMESTAMP=`date +%s`\n      TIMESTAMP=${TIMESTAMP}000\n      echo ${TIMESTAMP}\n\n      PAYLOAD=\"{\\\"topic\\\":{\\\"namespace\\\":\\\"spBv1.0\\\",\\\"edgeNodeDescriptor\\\":\\\"G1/E1\\\",\\\"groupId\\\":\\\"G1\\\",\\\"edgeNodeId\\\":\\\"E1\\\",\\\"type\\\":\\\"NCMD\\\"},\\\"payload\\\":{\\\"timestamp\\\":\"${TIMESTAMP}\",\\\"metrics\\\":[{\\\"name\\\":\\\"Node Control/Rebirth\\\",\\\"timestamp\\\": \"${TIMESTAMP}\",\\\"dataType\\\":\\\"Boolean\\\",\\\"value\\\":true}]}}\"\n      echo ${PAYLOAD}\n\n      echo ${PAYLOAD} > /tmp/commands/rebirth.json\n      ```\n\n    * Send an Edge Node NCMD message\n      ```console\n      #!/bin/sh\n\n      TIMESTAMP=`date +%s`\n      TIMESTAMP=${TIMESTAMP}000\n      echo ${TIMESTAMP}\n\n      PAYLOAD=\"{\\\"topic\\\":{\\\"namespace\\\":\\\"spBv1.0\\\",\\\"edgeNodeDescriptor\\\":\\\"G1/E1\\\",\\\"groupId\\\":\\\"G1\\\",\\\"edgeNodeId\\\":\\\"E1\\\",\\\"type\\\":\\\"NCMD\\\"},\\\"payload\\\":{\\\"timestamp\\\":\"${TIMESTAMP}\",\\\"metrics\\\":[{\\\"name\\\":\\\"TCK_metric/Boolean\\\",\\\"timestamp\\\": \"${TIMESTAMP}\",\\\"dataType\\\":\\\"Boolean\\\",\\\"value\\\":true}]}}\"\n      echo ${PAYLOAD}\n\n      echo ${PAYLOAD} > /tmp/commands/edge_metric.json\n      ```\n\n    * Send a Device Rebirth request message\n      ```console\n      #!/bin/sh\n\n      TIMESTAMP=`date +%s`\n      TIMESTAMP=${TIMESTAMP}000\n      echo ${TIMESTAMP}\n\n      
PAYLOAD=\"{\\\"topic\\\":{\\\"namespace\\\":\\\"spBv1.0\\\",\\\"edgeNodeDescriptor\\\":\\\"G1/E1\\\",\\\"groupId\\\":\\\"G1\\\",\\\"edgeNodeId\\\":\\\"E1\\\",\\\"deviceId\\\":\\\"D1\\\",\\\"type\\\":\\\"DCMD\\\"},\\\"payload\\\":{\\\"timestamp\\\":\"${TIMESTAMP}\",\\\"metrics\\\":[{\\\"name\\\":\\\"Device Control/Rebirth\\\",\\\"timestamp\\\": \"${TIMESTAMP}\",\\\"dataType\\\":\\\"Boolean\\\",\\\"value\\\":true}]}}\"\n      echo ${PAYLOAD}\n\n      echo ${PAYLOAD} > /tmp/commands/rebirth.json\n      ```\n\n    * Send a Device DCMD message\n      ```console\n      #!/bin/sh\n\n      TIMESTAMP=`date +%s`\n      TIMESTAMP=${TIMESTAMP}000\n      echo ${TIMESTAMP}\n\n      PAYLOAD=\"{\\\"topic\\\":{\\\"namespace\\\":\\\"spBv1.0\\\",\\\"edgeNodeDescriptor\\\":\\\"G1/E1\\\",\\\"groupId\\\":\\\"G1\\\",\\\"edgeNodeId\\\":\\\"E1\\\",\\\"deviceId\\\":\\\"D1\\\",\\\"type\\\":\\\"DCMD\\\"},\\\"payload\\\":{\\\"timestamp\\\":\"${TIMESTAMP}\",\\\"metrics\\\":[{\\\"name\\\":\\\"Inputs/0\\\",\\\"timestamp\\\": \"${TIMESTAMP}\",\\\"dataType\\\":\\\"Boolean\\\",\\\"value\\\":true}]}}\"\n      echo ${PAYLOAD}\n\n      echo ${PAYLOAD} > /tmp/commands/edge_metric.json\n      ```\n"
  },
  {
    "path": "java/compat_impl/edge/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>tahu-edge-compat</artifactId>\n  <packaging>bundle</packaging>\n  <name>Tahu Edge Compatible Implementation</name>\n\n  <dependencies>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-core</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-edge</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>org.apache.commons</groupId>\n      <artifactId>commons-lang3</artifactId>\n      <version>3.12.0</version>\n    </dependency>\n    <dependency>\n      <groupId>javax.xml.bind</groupId>\n      <artifactId>jaxb-api</artifactId>\n      <version>2.3.0</version>\n    </dependency>\n  </dependencies>\n\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          
<skipNexusStagingDeployMojo>false</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            <id>add-third-party</id>\n            <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n            <configuration>\n              <useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups> (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n                  Version 2.0</licenseMerge>\n              </licenseMerges>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.apache.felix</groupId>\n        <artifactId>maven-bundle-plugin</artifactId>\n        <version>${maven.bundle.version}</version>\n        <extensions>true</extensions>\n        <configuration>\n          <instructions>\n            <Export-Package>org.eclipse.tahu.*</Export-Package>\n            <Import-Package>*;resolution:=optional</Import-Package>\n          </instructions>\n        </configuration>\n        <executions>\n          <execution>\n            <id>bundle-manifest</id>\n            <phase>process-classes</phase>\n            <goals>\n              <goal>manifest</goal>\n            </goals>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-shade-plugin</artifactId>\n        
<version>3.2.2</version>\n        <configuration>\n          <createDependencyReducedPom>false</createDependencyReducedPom>\n        </configuration>\n        <executions>\n          <execution>\n            <phase>package</phase>\n            <goals>\n              <goal>shade</goal>\n            </goals>\n            <configuration>\n              <filters>\n                <filter>\n                  <artifact>*:*</artifact>\n                  <excludes>\n                    <exclude>META-INF/*.SF</exclude>\n                    <exclude>META-INF/*.DSA</exclude>\n                    <exclude>META-INF/*.RSA</exclude>\n                  </excludes>\n                </filter>\n              </filters>\n              <transformers>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ManifestResourceTransformer\">\n                  <mainClass>org.eclipse.tahu.edge.SparkplugEdgeNode</mainClass>\n                  <manifestEntries>\n                    <Tahu-Version>${project.version}</Tahu-Version>\n                    <Tahu-Build-Date>${timestamp}</Tahu-Build-Date>\n                  </manifestEntries>\n                </transformer>\n              </transformers>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n    </plugins>\n  </build>\n</project>\n"
  },
  {
    "path": "java/compat_impl/edge/src/main/java/org/eclipse/tahu/edge/CommandCallback.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.edge;\n\npublic interface CommandCallback {\n\n\tpublic void setDeviceOffline(String deviceId);\n\n\tpublic void setDeviceOnline(String deviceId);\n}\n"
  },
  {
    "path": "java/compat_impl/edge/src/main/java/org/eclipse/tahu/edge/CommandListener.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.edge;\n\nimport java.io.File;\nimport java.nio.charset.StandardCharsets;\nimport java.nio.file.FileSystems;\nimport java.util.Set;\nimport java.util.concurrent.Executors;\nimport java.util.concurrent.ScheduledExecutorService;\nimport java.util.concurrent.TimeUnit;\nimport java.util.stream.Collectors;\nimport java.util.stream.Stream;\n\nimport org.apache.commons.io.FileUtils;\nimport org.eclipse.tahu.exception.TahuErrorCode;\nimport org.eclipse.tahu.exception.TahuException;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class CommandListener implements Runnable {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(CommandListener.class.getName());\n\n\tprivate static final String SET_DEVICE_OFFLINE = \"Set device offline \";\n\n\tprivate static final String SET_DEVICE_ONLINE = \"Set device online \";\n\n\tprivate ScheduledExecutorService executor;\n\n\tprivate CommandCallback commandCallback;\n\n\tprivate File fileDirectory;\n\n\tprivate long scanRate;\n\n\tpublic CommandListener(CommandCallback commandCallback, String fileDirectoryPath, long scanRate) {\n\t\tthis.commandCallback = commandCallback;\n\t\tthis.fileDirectory = new File(fileDirectoryPath);\n\t\tthis.scanRate = scanRate;\n\t}\n\n\tpublic void start() throws TahuException {\n\t\tif (!fileDirectory.exists()) {\n\t\t\tlogger.info(\"Creating file command listener directory at {}\", 
fileDirectory.getPath());\n\t\t\tfileDirectory.mkdirs();\n\t\t} else if (!fileDirectory.isDirectory()) {\n\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\"The specified path '\" + fileDirectory.getPath() + \"' is not a directory\");\n\t\t}\n\n\t\texecutor = Executors.newSingleThreadScheduledExecutor();\n\t\texecutor.scheduleWithFixedDelay(this, 0, scanRate, TimeUnit.MILLISECONDS);\n\t}\n\n\tpublic void shutdown() {\n\t\texecutor.shutdownNow();\n\t\texecutor = null;\n\t}\n\n\t@Override\n\tpublic void run() {\n\t\ttry {\n\t\t\tSet<String> fileNames = Stream.of(fileDirectory.listFiles()).filter(file -> !file.isDirectory())\n\t\t\t\t\t.map(File::getName).collect(Collectors.toSet());\n\n\t\t\tif (fileNames != null && !fileNames.isEmpty()) {\n\t\t\t\tfor (String fileName : fileNames) {\n\t\t\t\t\tfileName = fileDirectory.getAbsolutePath() + FileSystems.getDefault().getSeparator() + fileName;\n\t\t\t\t\tlogger.info(\"Found file: {}\", fileName);\n\t\t\t\t\tFile commandFile = new File(fileName);\n\t\t\t\t\tString fileContents = FileUtils.readFileToString(commandFile, StandardCharsets.UTF_8);\n\n\t\t\t\t\tif (fileContents != null && fileContents.startsWith(SET_DEVICE_OFFLINE)) {\n\t\t\t\t\t\tString deviceId = fileContents.replace(SET_DEVICE_OFFLINE, \"\");\n\t\t\t\t\t\tcommandCallback.setDeviceOffline(deviceId);\n\t\t\t\t\t\tcommandFile.delete();\n\t\t\t\t\t} else if (fileContents != null && fileContents.startsWith(SET_DEVICE_ONLINE)) {\n\t\t\t\t\t\tString deviceId = fileContents.replace(SET_DEVICE_ONLINE, \"\");\n\t\t\t\t\t\tcommandCallback.setDeviceOnline(deviceId);\n\t\t\t\t\t\tcommandFile.delete();\n\t\t\t\t\t} else {\n\t\t\t\t\t\tlogger.error(\"Failed to handle input file {}\", fileName);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"File scanning in the Command Worker failed\", e);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/compat_impl/edge/src/main/java/org/eclipse/tahu/edge/PeriodicPublisher.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.edge;\n\nimport java.util.List;\n\nimport org.eclipse.tahu.edge.sim.DataSimulator;\nimport org.eclipse.tahu.message.model.DeviceDescriptor;\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class PeriodicPublisher implements Runnable {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(PeriodicPublisher.class.getName());\n\n\tprivate final long period;\n\tprivate final DataSimulator dataSimulator;\n\tprivate final EdgeClient edgeClient;\n\tprivate final EdgeNodeDescriptor edgeNodeDescriptor;\n\tprivate final List<DeviceDescriptor> deviceDescriptors;\n\n\tprivate volatile boolean stayRunning;\n\n\tpublic PeriodicPublisher(long period, DataSimulator dataSimulator, EdgeClient edgeClient,\n\t\t\tEdgeNodeDescriptor edgeNodeDescriptor, List<DeviceDescriptor> deviceDescriptors) {\n\t\tthis.period = period;\n\t\tthis.dataSimulator = dataSimulator;\n\t\tthis.edgeClient = edgeClient;\n\t\tthis.edgeNodeDescriptor = edgeNodeDescriptor;\n\t\tthis.deviceDescriptors = deviceDescriptors;\n\t\tthis.stayRunning = true;\n\t}\n\n\t@Override\n\tpublic void run() {\n\t\ttry {\n\t\t\twhile (stayRunning) {\n\t\t\t\t// Sleep a bit\n\t\t\t\tThread.sleep(period);\n\n\t\t\t\tSparkplugBPayload nDataPayload = 
dataSimulator.getNodeDataPayload(edgeNodeDescriptor);\n\t\t\t\tedgeClient.publishNodeData(nDataPayload);\n\n\t\t\t\tfor (DeviceDescriptor deviceDescriptor : deviceDescriptors) {\n\t\t\t\t\tSparkplugBPayload dDataPayload = dataSimulator.getDeviceDataPayload(deviceDescriptor);\n\t\t\t\t\tedgeClient.publishDeviceData(deviceDescriptor.getDeviceId(), dDataPayload);\n\t\t\t\t}\n\t\t\t}\n\t\t} catch (InterruptedException e) {\n\t\t\tlogger.error(\"Failed to continue periodic publishing\");\n\t\t}\n\t}\n\n\tpublic void shutdown() {\n\t\tstayRunning = false;\n\t}\n}\n"
  },
  {
    "path": "java/compat_impl/edge/src/main/java/org/eclipse/tahu/edge/SparkplugEdgeNode.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.edge;\n\nimport java.util.ArrayList;\nimport java.util.Arrays;\nimport java.util.Date;\nimport java.util.HashMap;\nimport java.util.List;\n\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\nimport org.eclipse.tahu.SparkplugParsingException;\nimport org.eclipse.tahu.edge.api.MetricHandler;\nimport org.eclipse.tahu.edge.sim.DataSimulator;\nimport org.eclipse.tahu.edge.sim.RandomDataSimulator;\nimport org.eclipse.tahu.message.DefaultBdSeqManager;\nimport org.eclipse.tahu.message.PayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.DeviceDescriptor;\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.MessageType;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.message.model.SparkplugBPayloadMap;\nimport org.eclipse.tahu.message.model.SparkplugBPayloadMap.SparkplugBPayloadMapBuilder;\nimport org.eclipse.tahu.message.model.SparkplugDescriptor;\nimport org.eclipse.tahu.message.model.SparkplugMeta;\nimport 
org.eclipse.tahu.message.model.StatePayload;\nimport org.eclipse.tahu.message.model.Topic;\nimport org.eclipse.tahu.model.MqttServerDefinition;\nimport org.eclipse.tahu.mqtt.ClientCallback;\nimport org.eclipse.tahu.mqtt.MqttClientId;\nimport org.eclipse.tahu.mqtt.MqttServerName;\nimport org.eclipse.tahu.mqtt.MqttServerUrl;\nimport org.eclipse.tahu.util.SparkplugUtil;\nimport org.eclipse.tahu.util.TopicUtil;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\nimport com.fasterxml.jackson.databind.ObjectMapper;\n\npublic class SparkplugEdgeNode implements Runnable, MetricHandler, ClientCallback, CommandCallback {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(SparkplugEdgeNode.class.getName());\n\n\tprivate static final String COMMAND_LISTENER_DIRECTORY = \"/tmp/commands\";\n\tprivate static final long COMMAND_LISTENER_POLL_RATE = 50L;\n\n\tprivate static final String GROUP_ID = \"G2\";\n\tprivate static final String EDGE_NODE_ID = \"E2\";\n\tprivate static final EdgeNodeDescriptor EDGE_NODE_DESCRIPTOR = new EdgeNodeDescriptor(GROUP_ID, EDGE_NODE_ID);\n\tprivate static final List<String> DEVICE_IDS = Arrays.asList(\"D2\");\n\tprivate static final List<DeviceDescriptor> DEVICE_DESCRIPTORS =\n\t\t\tArrays.asList(new DeviceDescriptor(EDGE_NODE_DESCRIPTOR, \"D2\"));\n\tprivate static final String PRIMARY_HOST_ID = \"IamHost\";\n\tprivate static final boolean USE_ALIASES = false;\n\tprivate static final Long REBIRTH_DEBOUNCE_DELAY = 5000L;\n\n\tprivate static final MqttServerName MQTT_SERVER_NAME_1 = new MqttServerName(\"Mqtt Server One\");\n\tprivate static final String MQTT_CLIENT_ID_1 = \"Sparkplug-Tahu-Compatible-Impl-One\";\n\tprivate static final MqttServerUrl MQTT_SERVER_URL_1 = MqttServerUrl.getMqttServerUrlSafe(\"tcp://localhost:1883\");\n\tprivate static final String USERNAME_1 = \"admin\";\n\tprivate static final String PASSWORD_1 = \"changeme\";\n\tprivate static final MqttServerName MQTT_SERVER_NAME_2 = new MqttServerName(\"Mqtt Server 
Two\");\n\tprivate static final String MQTT_CLIENT_ID_2 = \"Sparkplug-Tahu-Compatible-Impl-Two\";\n\tprivate static final MqttServerUrl MQTT_SERVER_URL_2 = MqttServerUrl.getMqttServerUrlSafe(\"tcp://localhost:1884\");\n\tprivate static final String USERNAME_2 = \"admin\";\n\tprivate static final String PASSWORD_2 = \"changeme\";\n\tprivate static final int KEEP_ALIVE_TIMEOUT = 30;\n\tprivate static final Topic NDEATH_TOPIC =\n\t\t\tnew Topic(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX, GROUP_ID, EDGE_NODE_ID, MessageType.NDEATH);\n\n\tprivate static final List<MqttServerDefinition> mqttServerDefinitions = new ArrayList<>();\n\n\tprivate CommandListener commandListener;\n\n\t/*\n\t * Next Birth BD sequence number - same as last deathBdSeq\n\t */\n\tprivate long birthBdSeq;\n\n\t/*\n\t * Next Death BD sequence number\n\t */\n\tprivate long deathBdSeq;\n\n\tprivate final DataSimulator dataSimulator =\n\t\t\tnew RandomDataSimulator(10, new HashMap<SparkplugDescriptor, Integer>() {\n\n\t\t\t\tprivate static final long serialVersionUID = 1L;\n\n\t\t\t\t{\n\t\t\t\t\tfor (DeviceDescriptor deviceDescriptor : DEVICE_DESCRIPTORS) {\n\t\t\t\t\t\tput(deviceDescriptor, 50);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t});\n\n\t/*\n\t * Lock for manipulating the sequence number\n\t */\n\tprivate Object clientLock = new Object();\n\n\tpublic static void main(String[] arg) {\n\t\ttry {\n\t\t\tmqttServerDefinitions\n\t\t\t\t\t.add(new MqttServerDefinition(MQTT_SERVER_NAME_1, new MqttClientId(MQTT_CLIENT_ID_1, false),\n\t\t\t\t\t\t\tMQTT_SERVER_URL_1, USERNAME_1, PASSWORD_1, KEEP_ALIVE_TIMEOUT, NDEATH_TOPIC));\n//\t\t\tmqttServerDefinitions\n//\t\t\t\t\t.add(new MqttServerDefinition(MQTT_SERVER_NAME_2, new MqttClientId(MQTT_CLIENT_ID_2, false),\n//\t\t\t\t\t\t\tMQTT_SERVER_URL_2, USERNAME_2, PASSWORD_2, KEEP_ALIVE_TIMEOUT, NDEATH_TOPIC));\n\n\t\t\tSystem.out.println(\"Starting the Sparkplug Edge Node\");\n\t\t\tSystem.out.println(\"\\tGroup ID: \" + GROUP_ID);\n\t\t\tSystem.out.println(\"\\tEdge Node 
ID: \" + EDGE_NODE_ID);\n\t\t\tSystem.out.println(\"\\tDevice IDs: \" + DEVICE_IDS);\n\t\t\tSystem.out.println(\"\\tPrimary Host ID: \" + PRIMARY_HOST_ID);\n\t\t\tSystem.out.println(\"\\tUsing Aliases: \" + USE_ALIASES);\n\t\t\tSystem.out.println(\"\\tRebirth Debounce Delay: \" + REBIRTH_DEBOUNCE_DELAY);\n\n\t\t\tfor (MqttServerDefinition mqttServerDefinition : mqttServerDefinitions) {\n\t\t\t\tSystem.out.println(\"\\tMQTT Server Name: \" + mqttServerDefinition.getMqttServerName());\n\t\t\t\tSystem.out.println(\"\\tMQTT Client ID: \" + mqttServerDefinition.getMqttClientId());\n\t\t\t\tSystem.out.println(\"\\tMQTT Server URL: \" + mqttServerDefinition.getMqttServerUrl());\n\t\t\t\tSystem.out.println(\"\\tUsername: \" + mqttServerDefinition.getUsername());\n\t\t\t\tSystem.out.println(\"\\tPassword: ********\");\n\t\t\t\tSystem.out.println(\"\\tKeep Alive Timeout: \" + mqttServerDefinition.getKeepAliveTimeout());\n\t\t\t}\n\n\t\t\tSparkplugEdgeNode sparkplugEdgeNode = new SparkplugEdgeNode();\n\t\t\tThread edgeNodeThread = new Thread(sparkplugEdgeNode);\n\t\t\tedgeNodeThread.start();\n\n\t\t\t// Run for a while and shutdown\n\t\t\tThread.sleep(360000);\n\t\t\tsparkplugEdgeNode.shutdown();\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to run the Edge Node\", e);\n\t\t}\n\t}\n\n\tprivate EdgeClient edgeClient;\n\tprivate Thread edgeClientThread;\n\tprivate PeriodicPublisher periodicPublisher;\n\tprivate DefaultBdSeqManager defaultBdSeqManager;\n\tprivate Thread periodicPublisherThread;\n\n\tpublic SparkplugEdgeNode() {\n\t\ttry {\n\t\t\tdefaultBdSeqManager = new DefaultBdSeqManager(\"SparkplugEdgeNode\");\n\t\t\tdeathBdSeq = defaultBdSeqManager.getNextDeathBdSeqNum();\n\t\t\tbirthBdSeq = deathBdSeq;\n\n\t\t\tedgeClient = new EdgeClient(this, EDGE_NODE_DESCRIPTOR, DEVICE_IDS, PRIMARY_HOST_ID, USE_ALIASES,\n\t\t\t\t\tREBIRTH_DEBOUNCE_DELAY, mqttServerDefinitions, this, null);\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to create the Sparkplug 
Edge Client\", e);\n\t\t}\n\t}\n\n\t@Override\n\tpublic void run() {\n\t\ttry {\n\t\t\tcommandListener = new CommandListener(this, COMMAND_LISTENER_DIRECTORY, COMMAND_LISTENER_POLL_RATE);\n\t\t\tcommandListener.start();\n\n\t\t\tedgeClientThread = new Thread(edgeClient);\n\t\t\tedgeClientThread.start();\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to start\", e);\n\t\t}\n\t}\n\n\t// MetricHandler API\n\t@Override\n\tpublic Topic getDeathTopic() {\n\t\treturn NDEATH_TOPIC;\n\t}\n\n\t// MetricHandler API\n\t@Override\n\tpublic byte[] getDeathPayloadBytes() throws Exception {\n\t\tSparkplugBPayload nDeathPayload = new SparkplugBPayloadBuilder().setTimestamp(new Date()).createPayload();\n\t\taddDeathSeqNum(nDeathPayload);\n\t\treturn new SparkplugBPayloadEncoder().getBytes(nDeathPayload, true);\n\t}\n\n\t// MetricHandler API\n\t@Override\n\tpublic void publishBirthSequence() {\n\t\ttry {\n\t\t\tSparkplugBPayloadMap nBirthPayload = dataSimulator.getNodeBirthPayload(EDGE_NODE_DESCRIPTOR);\n\t\t\tnBirthPayload = addBirthSeqNum(nBirthPayload);\n\t\t\tedgeClient.publishNodeBirth(nBirthPayload);\n\n\t\t\tfor (String deviceId : DEVICE_IDS) {\n\t\t\t\tSparkplugBPayload dBirthPayload =\n\t\t\t\t\t\tdataSimulator.getDeviceBirthPayload(new DeviceDescriptor(EDGE_NODE_DESCRIPTOR, deviceId));\n\t\t\t\tedgeClient.publishDeviceBirth(deviceId, dBirthPayload);\n\t\t\t}\n\n\t\t\t// The BIRTH sequence has been published - set up a periodic publisher\n\t\t\tperiodicPublisher =\n\t\t\t\t\tnew PeriodicPublisher(5000, dataSimulator, edgeClient, EDGE_NODE_DESCRIPTOR, DEVICE_DESCRIPTORS);\n\t\t\tperiodicPublisherThread = new Thread(periodicPublisher);\n\t\t\tperiodicPublisherThread.start();\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to publish the BIRTH sequence\", e);\n\t\t}\n\t}\n\n\t// MetricHandler API\n\t@Override\n\tpublic boolean hasMetric(SparkplugDescriptor sparkplugDescriptor, String metricName) {\n\t\treturn dataSimulator.hasMetric(sparkplugDescriptor, 
metricName);\n\t}\n\n\t// ClientCallback API\n\t@Override\n\tpublic void shutdown() {\n\t\tlogger.info(\"ClientCallback shutdown\");\n\n\t\tif (commandListener != null) {\n\t\t\tcommandListener.shutdown();\n\t\t\tcommandListener = null;\n\t\t}\n\t\tif (periodicPublisher != null) {\n\t\t\tperiodicPublisher.shutdown();\n\t\t\tperiodicPublisher = null;\n\t\t}\n\t\tif (periodicPublisherThread != null) {\n\t\t\tperiodicPublisherThread.interrupt();\n\t\t\tperiodicPublisherThread = null;\n\t\t}\n\n\t\tif (edgeClient != null) {\n\t\t\tedgeClient.shutdown();\n\t\t\tedgeClient = null;\n\t\t\tedgeClientThread = null;\n\t\t}\n\t}\n\n\t// ClientCallback API\n\t@Override\n\tpublic void messageArrived(MqttServerName mqttServerName, MqttServerUrl mqttServerUrl, MqttClientId clientId,\n\t\t\tString rawTopic, MqttMessage message) {\n\t\tlogger.info(\"{}: ClientCallback messageArrived on topic={}\", clientId, rawTopic);\n\n\t\tfinal Topic topic;\n\t\ttry {\n\t\t\ttopic = TopicUtil.parseTopic(rawTopic);\n\t\t} catch (SparkplugParsingException e) {\n\t\t\tlogger.error(\"Error parsing Sparkplug topic {}\", rawTopic, e);\n\t\t\treturn;\n\t\t}\n\n\t\tif (rawTopic.startsWith(\"spBv1.0/STATE/\")) {\n\t\t\ttry {\n\t\t\t\tlogger.info(\"Got STATE message: {} :: {}\", rawTopic, new String(message.getPayload()));\n\t\t\t\tObjectMapper mapper = new ObjectMapper();\n\t\t\t\tStatePayload statePayload = mapper.readValue(message.getPayload(), StatePayload.class);\n\t\t\t\tedgeClient.handleStateMessage(topic.getHostApplicationId(), statePayload);\n\t\t\t} catch (Exception e) {\n\t\t\t\tlogger.error(\"Failed to handle STATE message with topic={} and payload={}\", rawTopic,\n\t\t\t\t\t\tnew String(message.getPayload()));\n\t\t\t}\n\t\t\treturn;\n\t\t} else if (!SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX.equals(TopicUtil.getSplitTopic(rawTopic)[0])) {\n\t\t\tlogger.warn(\"Message received on erroneous topic: {}\", rawTopic);\n\t\t\treturn;\n\t\t} else {\n\t\t\t// Sparkplug message!\n\t\t\tfinal 
SparkplugBPayload payload;\n\n\t\t\ttry {\n\t\t\t\t// Handling case where the MQTT Server publishes an LWT on our behalf but we're actually online.\n\t\t\t\tif (MessageType.NDEATH.equals(topic.getType()) && topic.getGroupId().equals(GROUP_ID)\n\t\t\t\t\t\t&& topic.getEdgeNodeId().equals(EDGE_NODE_ID)) {\n\t\t\t\t\tif (!edgeClient.isDisconnectedOrDisconnecting()) {\n\t\t\t\t\t\tif (edgeClient.isConnectedToPrimaryHost()) {\n\t\t\t\t\t\t\t// Parse out the bdSeq number\n\t\t\t\t\t\t\tpayload = new SparkplugBPayloadDecoder().buildFromByteArray(message.getPayload(), null);\n\t\t\t\t\t\t\t// SparkplugUtils.decodePayload(message.getPayload());\n\t\t\t\t\t\t\tlong incomingBdSeq = SparkplugUtil.getBdSequenceNumber(payload);\n\t\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\t\tif (birthBdSeq == incomingBdSeq) {\n\t\t\t\t\t\t\t\t\t// This is an LWT - but we're online and the bdSeq number matched - correct the\n\t\t\t\t\t\t\t\t\t// error by treating it as a rebirth\n\t\t\t\t\t\t\t\t\tlogger.info(\"Got unexpected LWT for {} - publishing BIRTH sequence\",\n\t\t\t\t\t\t\t\t\t\t\tEDGE_NODE_DESCRIPTOR);\n\t\t\t\t\t\t\t\t\tedgeClient.handleRebirthRequest(true);\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\t\t\tlogger.warn(\"Got unexpected LWT but failed to publish a new BIRTH sequence for {}\",\n\t\t\t\t\t\t\t\t\t\tEDGE_NODE_DESCRIPTOR);\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tlogger.debug(\"Got unexpected LWT but not connected to primary host - ignoring\");\n\t\t\t\t\t\t}\n\t\t\t\t\t} else {\n\t\t\t\t\t\tlogger.debug(\"Got expected LWT for {}\", EDGE_NODE_DESCRIPTOR);\n\t\t\t\t\t}\n\n\t\t\t\t\treturn;\n\t\t\t\t}\n\t\t\t} catch (Exception e) {\n\t\t\t\tlogger.error(\"Failed to handle NDEATH when connected on {}\", topic, e);\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\tif (!MessageType.NCMD.equals(topic.getType()) && !MessageType.DCMD.equals(topic.getType())) {\n\t\t\t\tlogger.debug(\"Ignoring unexpected incoming Sparkplug message of type {}\", 
topic.getType());\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\ttry {\n\t\t\t\tlogger.debug(\"Decoding Sparkplug Payload\");\n\t\t\t\tPayloadDecoder<SparkplugBPayload> decoder = new SparkplugBPayloadDecoder();\n\t\t\t\tpayload = decoder.buildFromByteArray(message.getPayload(), null);\n\t\t\t\tlogger.debug(\"Message Timestamp: {}\", payload.getTimestamp());\n\t\t\t} catch (Exception e) {\n\t\t\t\tlogger.error(\"Failed to parse message - not acting on it\", e);\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\tif (MessageType.NCMD.equals(topic.getType())) {\n\t\t\t\ttry {\n\t\t\t\t\tfinal List<Metric> receivedMetrics = payload.getMetrics();\n\t\t\t\t\tfinal List<Metric> responseMetrics = new ArrayList<>();\n\t\t\t\t\tif (receivedMetrics != null && !receivedMetrics.isEmpty()) {\n\t\t\t\t\t\t// Prep the payload\n\t\t\t\t\t\tDate now = new Date();\n\t\t\t\t\t\tSparkplugBPayloadMapBuilder payloadBuilder = new SparkplugBPayloadMapBuilder();\n\t\t\t\t\t\tpayloadBuilder.setTimestamp(now);\n\n\t\t\t\t\t\t// Add the metrics\n\t\t\t\t\t\tfor (Metric metric : receivedMetrics) {\n\t\t\t\t\t\t\tString name = metric.getName();\n\n\t\t\t\t\t\t\tlogger.debug(\"Node Metric Name: {}\", name);\n\t\t\t\t\t\t\tObject value = metric.getValue();\n\t\t\t\t\t\t\tlogger.debug(\"Metric: {} :: {} :: {}\", name, value, metric.getDataType());\n\t\t\t\t\t\t\tif (SparkplugMeta.METRIC_NODE_REBIRTH.equals(name) && value.equals(true)) {\n\t\t\t\t\t\t\t\tedgeClient.handleRebirthRequest(true);\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\tMetric writtenMetric = dataSimulator.handleMetricWrite(EDGE_NODE_DESCRIPTOR, metric);\n\t\t\t\t\t\t\t\tif (writtenMetric != null) {\n\t\t\t\t\t\t\t\t\tresponseMetrics.add(writtenMetric);\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tif (!responseMetrics.isEmpty()) {\n\t\t\t\t\t\t\t// Publish the response NDATA message\n\t\t\t\t\t\t\tlogger.debug(\"Publishing NDATA based on NCMD message for {}\", 
EDGE_NODE_DESCRIPTOR);\n\t\t\t\t\t\t\tpayloadBuilder.addMetrics(responseMetrics);\n\t\t\t\t\t\t\tedgeClient.publishNodeData(payloadBuilder.createPayload());\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tlogger.warn(\"Received NCMD with no valid metrics to write for {}\", EDGE_NODE_DESCRIPTOR);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t} catch (Exception e) {\n\t\t\t\t\tlogger.error(\"Error parsing NCMD\", e);\n\t\t\t\t}\n\t\t\t} else if (MessageType.DCMD.equals(topic.getType())) {\n\t\t\t\ttry {\n\t\t\t\t\tfinal List<Metric> receivedMetrics = payload.getMetrics();\n\t\t\t\t\tfinal List<Metric> responseMetrics = new ArrayList<>();\n\t\t\t\t\tif (receivedMetrics != null && !receivedMetrics.isEmpty()) {\n\t\t\t\t\t\t// Prep the payload\n\t\t\t\t\t\tDate now = new Date();\n\t\t\t\t\t\tSparkplugBPayloadMapBuilder payloadBuilder = new SparkplugBPayloadMapBuilder();\n\t\t\t\t\t\tpayloadBuilder.setTimestamp(now);\n\n\t\t\t\t\t\t// Add the metrics\n\t\t\t\t\t\tfor (Metric metric : receivedMetrics) {\n\t\t\t\t\t\t\tString name = metric.getName();\n\n\t\t\t\t\t\t\tlogger.debug(\"Device Metric Name: {}\", name);\n\t\t\t\t\t\t\tObject value = metric.getValue();\n\t\t\t\t\t\t\tlogger.debug(\"Metric: {} :: {} :: {}\", name, value, metric.getDataType());\n\t\t\t\t\t\t\tMetric writtenMetric = dataSimulator.handleMetricWrite(\n\t\t\t\t\t\t\t\t\tnew DeviceDescriptor(EDGE_NODE_DESCRIPTOR, topic.getDeviceId()), metric);\n\t\t\t\t\t\t\tif (writtenMetric != null) {\n\t\t\t\t\t\t\t\tresponseMetrics.add(writtenMetric);\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tif (!responseMetrics.isEmpty()) {\n\t\t\t\t\t\t\t// Publish the response DDATA message\n\t\t\t\t\t\t\tlogger.debug(\"Publishing DDATA based on DCMD message for {}/{}\", EDGE_NODE_DESCRIPTOR,\n\t\t\t\t\t\t\t\t\ttopic.getDeviceId());\n\t\t\t\t\t\t\tpayloadBuilder.addMetrics(responseMetrics);\n\t\t\t\t\t\t\tedgeClient.publishDeviceData(topic.getDeviceId(), payloadBuilder.createPayload());\n\t\t\t\t\t\t} else 
{\n\t\t\t\t\t\t\tlogger.warn(\"Received DCMD with no valid metrics to write for {}/{}\", EDGE_NODE_DESCRIPTOR,\n\t\t\t\t\t\t\t\t\ttopic.getDeviceId());\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t} catch (Throwable t) {\n\t\t\t\t\tlogger.error(\"Error parsing DCMD\", t);\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\t// ClientCallback API\n\t@Override\n\tpublic void connectionLost(MqttServerName mqttServerName, MqttServerUrl mqttServerUrl, MqttClientId clientId,\n\t\t\tThrowable cause) {\n\t\tlogger.info(\"{}: ClientCallback connectionLost\", clientId);\n\t}\n\n\t// ClientCallback API\n\t@Override\n\tpublic void connectComplete(boolean reconnect, MqttServerName mqttServerName, MqttServerUrl mqttServerUrl,\n\t\t\tMqttClientId clientId) {\n\t\tlogger.info(\"{}: ClientCallback connectComplete\", clientId);\n\t}\n\n\t// CommandCallback API\n\t@Override\n\tpublic void setDeviceOffline(String deviceId) {\n\t\tedgeClient.publishDeviceDeath(deviceId);\n\t}\n\n\t// CommandCallback API\n\t@Override\n\tpublic void setDeviceOnline(String deviceId) {\n\t\tSparkplugBPayload dBirthPayload =\n\t\t\t\tdataSimulator.getDeviceBirthPayload(new DeviceDescriptor(EDGE_NODE_DESCRIPTOR, deviceId));\n\t\tedgeClient.publishDeviceBirth(deviceId, dBirthPayload);\n\t}\n\n\t/*\n\t * Used to add the death sequence number\n\t */\n\tprivate SparkplugBPayload addDeathSeqNum(SparkplugBPayload payload) {\n\t\tsynchronized (clientLock) {\n\t\t\tif (payload == null) {\n\t\t\t\tpayload = new SparkplugBPayloadBuilder().createPayload();\n\t\t\t}\n\t\t\tif (deathBdSeq == 256) {\n\t\t\t\tdeathBdSeq = 0;\n\t\t\t}\n\t\t\tlogger.trace(\"Death bdSeq(before) = {}\", deathBdSeq);\n\t\t\ttry {\n\t\t\t\tlogger.trace(\"Set bdSeq number in NDEATH to {}\", deathBdSeq);\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"bdSeq\", MetricDataType.Int64, deathBdSeq).createMetric());\n\n\t\t\t\t// Increment sequence numbers in preparation for the next new connect\n\t\t\t\tbirthBdSeq = 
deathBdSeq;\n\t\t\t\tdeathBdSeq++;\n\t\t\t\tdefaultBdSeqManager.storeNextDeathBdSeqNum(deathBdSeq);\n\t\t\t} catch (SparkplugInvalidTypeException e) {\n\t\t\t\tlogger.error(\"Failed to create death payload\", e);\n\t\t\t\treturn null;\n\t\t\t}\n\t\t\tlogger.trace(\"Death bdSeq(after) = {}\", deathBdSeq);\n\t\t\treturn payload;\n\t\t}\n\t}\n\n\t/*\n\t * Used to add the birth sequence number\n\t */\n\tprivate SparkplugBPayloadMap addBirthSeqNum(SparkplugBPayloadMap nBirthPayload) {\n\t\tsynchronized (clientLock) {\n\t\t\tif (nBirthPayload == null) {\n\t\t\t\tnBirthPayload = new SparkplugBPayloadMapBuilder().createPayload();\n\t\t\t}\n\t\t\tlogger.trace(\"Birth bdSeq(before) = {}\", birthBdSeq);\n\t\t\ttry {\n\t\t\t\tlogger.trace(\"Set bdSeq number in NBIRTH to {}\", birthBdSeq);\n\t\t\t\tnBirthPayload.addMetric(new MetricBuilder(\"bdSeq\", MetricDataType.Int64, birthBdSeq).createMetric());\n\t\t\t} catch (SparkplugInvalidTypeException e) {\n\t\t\t\tlogger.error(\"Failed to create birth payload\", e);\n\t\t\t\treturn null;\n\t\t\t}\n\t\t\tlogger.trace(\"Birth bdSeq(after) = {}\", birthBdSeq);\n\t\t\treturn nBirthPayload;\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/compat_impl/edge/src/main/java/org/eclipse/tahu/edge/sim/DataSimulator.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.edge.sim;\n\nimport org.eclipse.tahu.message.model.DeviceDescriptor;\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayloadMap;\nimport org.eclipse.tahu.message.model.SparkplugDescriptor;\n\npublic interface DataSimulator {\n\n\t/**\n\t * Getter for fetching an NBIRTH {@link SparkplugBPayloadMap}\n\t * \n\t * @param edgeNodeDescriptor the {@link EdgeNodeDescriptor} to use when fetching the {@link SparkplugBPayloadMap}\n\t * \n\t * @return a {@link SparkplugBPayloadMap} representing an NBIRTH payload\n\t */\n\tpublic SparkplugBPayloadMap getNodeBirthPayload(EdgeNodeDescriptor edgeNodeDescriptor);\n\n\t/**\n\t * Getter for fetching an NDATA {@link SparkplugBPayload}\n\t *\n\t * @param edgeNodeDescriptor the {@link EdgeNodeDescriptor} to use when fetching the {@link SparkplugBPayload}\n\t *\n\t * @return a {@link SparkplugBPayload} representing an NDATA payload\n\t */\n\tpublic SparkplugBPayload getNodeDataPayload(EdgeNodeDescriptor edgeNodeDescriptor);\n\n\t/**\n\t * Getter for fetching a DBIRTH {@link SparkplugBPayload}\n\t * \n\t * @param deviceDescriptor the {@link DeviceDescriptor} to use when fetching the {@link SparkplugBPayload}\n\t * \n\t * @return a {@link SparkplugBPayload} representing a DBIRTH payload\n\t */\n\tpublic 
SparkplugBPayload getDeviceBirthPayload(DeviceDescriptor deviceDescriptor);\n\n\t/**\n\t * Getter for fetching a DDATA {@link SparkplugBPayload}\n\t * \n\t * @param deviceDescriptor the {@link DeviceDescriptor} to use when fetching the {@link SparkplugBPayload}\n\t * \n\t * @return a {@link SparkplugBPayload} representing a DDATA payload\n\t */\n\tpublic SparkplugBPayload getDeviceDataPayload(DeviceDescriptor deviceDescriptor);\n\n\t/**\n\t * Checks whether a metric with the given name exists for the given descriptor\n\t * \n\t * @param sparkplugDescriptor the {@link SparkplugDescriptor} to check against\n\t * @param metricName the name of the {@link Metric} to look for\n\t * \n\t * @return true if the metric exists, otherwise false\n\t */\n\tpublic boolean hasMetric(SparkplugDescriptor sparkplugDescriptor, String metricName);\n\n\t/**\n\t * Handles a request to write a value to a metric\n\t * \n\t * @param sparkplugDescriptor the {@link SparkplugDescriptor} the {@link Metric} is associated with\n\t * @param metric the {@link Metric} to write\n\t * \n\t * @return the written {@link Metric} on success, otherwise null\n\t */\n\tpublic Metric handleMetricWrite(SparkplugDescriptor sparkplugDescriptor, Metric metric);\n}\n"
  },
  {
    "path": "java/compat_impl/edge/src/main/java/org/eclipse/tahu/edge/sim/RandomDataSimulator.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.edge.sim;\n\nimport java.math.BigInteger;\nimport java.security.MessageDigest;\nimport java.util.ArrayList;\nimport java.util.Date;\nimport java.util.HashMap;\nimport java.util.List;\nimport java.util.Map;\nimport java.util.Random;\nimport java.util.UUID;\nimport java.util.concurrent.ThreadLocalRandom;\n\nimport javax.xml.bind.DatatypeConverter;\n\nimport org.apache.commons.lang3.RandomStringUtils;\nimport org.eclipse.tahu.SparkplugException;\nimport org.eclipse.tahu.message.model.DataSet;\nimport org.eclipse.tahu.message.model.DataSet.DataSetBuilder;\nimport org.eclipse.tahu.message.model.DataSetDataType;\nimport org.eclipse.tahu.message.model.DeviceDescriptor;\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.File;\nimport org.eclipse.tahu.message.model.MetaData;\nimport org.eclipse.tahu.message.model.MetaData.MetaDataBuilder;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.Parameter;\nimport org.eclipse.tahu.message.model.ParameterDataType;\nimport org.eclipse.tahu.message.model.Row.RowBuilder;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.message.model.SparkplugBPayloadMap;\nimport 
org.eclipse.tahu.message.model.SparkplugBPayloadMap.SparkplugBPayloadMapBuilder;\nimport org.eclipse.tahu.message.model.SparkplugDescriptor;\nimport org.eclipse.tahu.message.model.Template;\nimport org.eclipse.tahu.message.model.Template.TemplateBuilder;\nimport org.eclipse.tahu.message.model.Value;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class RandomDataSimulator implements DataSimulator {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(RandomDataSimulator.class.getName());\n\n\tprivate final int numNodeMetrics;\n\tprivate final Map<SparkplugDescriptor, Integer> numDeviceMetrics;\n\n\tprivate final Random random = new Random();\n\tprivate final Map<SparkplugDescriptor, Map<String, Metric>> metricMaps = new HashMap<>();\n\tprivate final Map<SparkplugDescriptor, Long> lastUpdateMap = new HashMap<>();\n\n\tpublic RandomDataSimulator(int numNodeMetrics, Map<SparkplugDescriptor, Integer> numDeviceMetrics) {\n\t\tthis.numNodeMetrics = numNodeMetrics;\n\t\tthis.numDeviceMetrics = numDeviceMetrics;\n\t}\n\n\t// DataSimulator API\n\t@Override\n\tpublic SparkplugBPayloadMap getNodeBirthPayload(EdgeNodeDescriptor edgeNodeDescriptor) {\n\t\ttry {\n\t\t\tDate now = new Date();\n\t\t\tMap<String, Metric> metricMap = new HashMap<>();\n\n\t\t\tSparkplugBPayloadMapBuilder payloadBuilder = new SparkplugBPayloadMapBuilder();\n\t\t\tpayloadBuilder.setTimestamp(now);\n\n\t\t\t// Add the Template definitions\n\t\t\tpayloadBuilder\n\t\t\t\t\t.addMetric(new MetricBuilder(\"simpleType\", MetricDataType.Template, newSimpleTemplate(true, null))\n\t\t\t\t\t\t\t.createMetric());\n\t\t\tpayloadBuilder.addMetrics(newComplexTemplateDefs());\n\n\t\t\t// Add some random metrics\n\t\t\tfor (int i = 0; i < numNodeMetrics; i++) {\n\t\t\t\tMetric metric = getRandomMetric(\"NT\", i, true);\n\t\t\t\tmetricMap.put(metric.getName(), metric);\n\t\t\t\tpayloadBuilder.addMetric(metric);\n\t\t\t}\n\n\t\t\tmetricMaps.put(edgeNodeDescriptor, 
metricMap);\n\t\t\tlastUpdateMap.put(edgeNodeDescriptor, now.getTime());\n\t\t\treturn payloadBuilder.createPayload();\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to get the NBIRTH\", e);\n\t\t\treturn null;\n\t\t}\n\t}\n\n\t// DataSimulator API\n\t@Override\n\tpublic SparkplugBPayload getNodeDataPayload(EdgeNodeDescriptor edgeNodeDescriptor) {\n\t\ttry {\n\t\t\tDate now = new Date();\n\t\t\tMap<String, Metric> metricMap = new HashMap<>();\n\n\t\t\tSparkplugBPayloadBuilder payloadBuilder = new SparkplugBPayloadBuilder();\n\t\t\tpayloadBuilder.setTimestamp(now);\n\t\t\tlogger.info(\"Getting number of metrics for {}\", edgeNodeDescriptor);\n\t\t\tfor (int i = 0; i < numNodeMetrics; i++) {\n\t\t\t\tMetric metric = getRandomMetric(\"NT\", i, true);\n\t\t\t\tif (metric != null) {\n\t\t\t\t\tmetricMap.put(metric.getName(), metric);\n\t\t\t\t\tpayloadBuilder.addMetric(metric);\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tmetricMaps.put(edgeNodeDescriptor, metricMap);\n\t\t\tlastUpdateMap.put(edgeNodeDescriptor, now.getTime());\n\t\t\treturn payloadBuilder.createPayload();\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to get the NDATA\", e);\n\t\t\treturn null;\n\t\t}\n\t}\n\n\t// DataSimulator API\n\t@Override\n\tpublic SparkplugBPayload getDeviceBirthPayload(DeviceDescriptor deviceDescriptor) {\n\t\ttry {\n\t\t\tDate now = new Date();\n\t\t\tMap<String, Metric> metricMap = new HashMap<>();\n\n\t\t\tSparkplugBPayloadBuilder payloadBuilder = new SparkplugBPayloadBuilder();\n\t\t\tpayloadBuilder.setTimestamp(now);\n\t\t\tlogger.info(\"Getting number of metrics for {}\", deviceDescriptor);\n\t\t\tfor (int i = 0; i < numDeviceMetrics.get(deviceDescriptor); i++) {\n\t\t\t\tMetric metric = getRandomMetric(\"DT\", i, true);\n\t\t\t\tif (metric != null) {\n\t\t\t\t\tmetricMap.put(metric.getName(), metric);\n\t\t\t\t\tpayloadBuilder.addMetric(metric);\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tmetricMaps.put(deviceDescriptor, metricMap);\n\t\t\tlastUpdateMap.put(deviceDescriptor, 
now.getTime());\n\t\t\treturn payloadBuilder.createPayload();\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to get the DBIRTH\", e);\n\t\t\treturn null;\n\t\t}\n\t}\n\n\t// DataSimulator API\n\t@Override\n\tpublic SparkplugBPayload getDeviceDataPayload(DeviceDescriptor deviceDescriptor) {\n\t\ttry {\n\t\t\tDate now = new Date();\n\t\t\tMap<String, Metric> metricMap = new HashMap<>();\n\n\t\t\tSparkplugBPayloadBuilder payloadBuilder = new SparkplugBPayloadBuilder();\n\t\t\tpayloadBuilder.setTimestamp(now);\n\t\t\tlogger.info(\"Getting number of metrics for {}\", deviceDescriptor);\n\t\t\tfor (int i = 0; i < numDeviceMetrics.get(deviceDescriptor); i++) {\n\t\t\t\tMetric metric = getRandomMetric(\"DT\", i, true);\n\t\t\t\tif (metric != null) {\n\t\t\t\t\tmetricMap.put(metric.getName(), metric);\n\t\t\t\t\tpayloadBuilder.addMetric(metric);\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tmetricMaps.put(deviceDescriptor, metricMap);\n\t\t\tlastUpdateMap.put(deviceDescriptor, now.getTime());\n\t\t\treturn payloadBuilder.createPayload();\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to get the DDATA\", e);\n\t\t\treturn null;\n\t\t}\n\t}\n\n\t// DataSimulator API\n\t@Override\n\tpublic boolean hasMetric(SparkplugDescriptor sparkplugDescriptor, String metricName) {\n\t\tif (metricMaps.containsKey(sparkplugDescriptor)\n\t\t\t\t&& metricMaps.get(sparkplugDescriptor).get(metricName) != null) {\n\t\t\treturn true;\n\t\t} else {\n\t\t\treturn false;\n\t\t}\n\t}\n\n\t// DataSimulator API\n\t@Override\n\tpublic Metric handleMetricWrite(SparkplugDescriptor sparkplugDescriptor, Metric metric) {\n\t\t// No-op for this simulator - just return the metric as though the value was 'written'\n\t\treturn metric;\n\t}\n\n\tprivate Metric getRandomMetric(String namePrefix, int index, boolean isBirth) throws Exception {\n\t\tint remainder = index % 34;\n\t\tint dataType = remainder + 1;\n\n\t\t// These are not valid MetricDataTypes - return a standard Int32\n\t\tif (dataType == 20 || 
dataType == 21) {\n\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Int32, getRandomInt32()).createMetric();\n\t\t}\n\n\t\tswitch (dataType) {\n\t\t\tcase 1:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Int8, getRandomInt8()).createMetric();\n\t\t\tcase 2:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Int16, getRandomInt16())\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 3:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Int32, getRandomInt32())\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 4:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Int64, getRandomInt64())\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 5:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.UInt8, getRandomUInt8())\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 6:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.UInt16, getRandomUInt16())\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 7:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.UInt32, getRandomUInt32())\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 8:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.UInt64, getRandomUInt64())\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 9:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Float, random.nextFloat())\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 10:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Double, random.nextDouble())\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 11:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Boolean, random.nextBoolean())\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 12:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.String, getRandomString(8))\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 13:\n\t\t\t\treturn new 
MetricBuilder(namePrefix + \"-\" + index, MetricDataType.DateTime, new Date(random.nextLong()))\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 14:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Text, getRandomString(8))\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 15:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.UUID, UUID.randomUUID().toString())\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 16:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.DataSet, newDataSet()).createMetric();\n\t\t\tcase 17:\n\t\t\t\tbyte[] byteArray = new byte[10];\n\t\t\t\trandom.nextBytes(byteArray);\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Bytes, byteArray).createMetric();\n\t\t\tcase 18:\n\t\t\t\tbyte[] fileDataArray = new byte[10];\n\t\t\t\trandom.nextBytes(fileDataArray);\n\t\t\t\tbyte[] md5 = MessageDigest.getInstance(\"MD5\").digest(fileDataArray);\n\t\t\t\tString hashString = DatatypeConverter.printHexBinary(md5);\n\t\t\t\tMetaData metaData = new MetaDataBuilder().fileName(\"Fake_File.bin\").md5(hashString).fileType(\"bin\")\n\t\t\t\t\t\t.createMetaData();\n\t\t\t\tFile file = new File(\"Fake_File.bin\", fileDataArray);\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.File, file).metaData(metaData)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 19:\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Template,\n\t\t\t\t\t\tnewComplexTemplateInstance()).createMetric();\n\t\t\tcase 22:\n\t\t\t\tByte[] int8ArrayValue = new Byte[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tint8ArrayValue[i] = getRandomInt8();\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Int8Array, int8ArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 23:\n\t\t\t\tShort[] int16ArrayValue = new Short[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tint16ArrayValue[i] = 
getRandomInt16();\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Int16Array, int16ArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 24:\n\t\t\t\tInteger[] int32ArrayValue = new Integer[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tint32ArrayValue[i] = getRandomInt32();\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Int32Array, int32ArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 25:\n\t\t\t\tLong[] int64ArrayValue = new Long[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tint64ArrayValue[i] = getRandomInt64();\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.Int64Array, int64ArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 26:\n\t\t\t\tShort[] uInt8ArrayValue = new Short[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tuInt8ArrayValue[i] = getRandomUInt8();\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.UInt8Array, uInt8ArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 27:\n\t\t\t\tInteger[] uInt16ArrayValue = new Integer[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tuInt16ArrayValue[i] = getRandomUInt16();\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.UInt16Array, uInt16ArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 28:\n\t\t\t\tLong[] uInt32ArrayValue = new Long[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tuInt32ArrayValue[i] = getRandomUInt32();\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.UInt32Array, uInt32ArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 29:\n\t\t\t\tBigInteger[] uInt64ArrayValue = new BigInteger[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tuInt64ArrayValue[i] = getRandomUInt64();\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.UInt64Array, 
uInt64ArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 30:\n\t\t\t\tFloat[] floatArrayValue = new Float[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tfloatArrayValue[i] = random.nextFloat();\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.FloatArray, floatArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 31:\n\t\t\t\tDouble[] doubleArrayValue = new Double[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tdoubleArrayValue[i] = random.nextDouble();\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.DoubleArray, doubleArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 32:\n\t\t\t\tBoolean[] booleanArrayValue = new Boolean[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tbooleanArrayValue[i] = random.nextBoolean();\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.BooleanArray, booleanArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 33:\n\t\t\t\tString[] stringArrayValue = new String[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tstringArrayValue[i] = getRandomString(8);\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.StringArray, stringArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tcase 34:\n\t\t\t\tDate[] dateTimeArrayValue = new Date[5];\n\t\t\t\tfor (int i = 0; i < 5; i++) {\n\t\t\t\t\tdateTimeArrayValue[i] = new Date(getRandomInt64());\n\t\t\t\t}\n\t\t\t\treturn new MetricBuilder(namePrefix + \"-\" + index, MetricDataType.DateTimeArray, dateTimeArrayValue)\n\t\t\t\t\t\t.createMetric();\n\t\t\tdefault:\n\t\t\t\tlogger.error(\"Failed to get a metric for dataType {}\", dataType);\n\t\t\t\treturn null;\n\t\t}\n\t}\n\n\tprivate DataSet newDataSet() throws SparkplugException {\n\t\treturn new 
DataSetBuilder(14).addColumnName(\"Int8s\").addColumnName(\"Int16s\").addColumnName(\"Int32s\")\n\t\t\t\t.addColumnName(\"Int64s\").addColumnName(\"UInt8s\").addColumnName(\"UInt16s\").addColumnName(\"UInt32s\")\n\t\t\t\t.addColumnName(\"UInt64s\").addColumnName(\"Floats\").addColumnName(\"Doubles\").addColumnName(\"Booleans\")\n\t\t\t\t.addColumnName(\"Strings\").addColumnName(\"Dates\").addColumnName(\"Texts\").addType(DataSetDataType.Int8)\n\t\t\t\t.addType(DataSetDataType.Int16).addType(DataSetDataType.Int32).addType(DataSetDataType.Int64)\n\t\t\t\t.addType(DataSetDataType.UInt8).addType(DataSetDataType.UInt16).addType(DataSetDataType.UInt32)\n\t\t\t\t.addType(DataSetDataType.UInt64).addType(DataSetDataType.Float).addType(DataSetDataType.Double)\n\t\t\t\t.addType(DataSetDataType.Boolean).addType(DataSetDataType.String).addType(DataSetDataType.DateTime)\n\t\t\t\t.addType(DataSetDataType.Text)\n\t\t\t\t.addRow(new RowBuilder().addValue(new Value<Byte>(DataSetDataType.Int8, getRandomInt8()))\n\t\t\t\t\t\t.addValue(new Value<Short>(DataSetDataType.Int16, getRandomInt16()))\n\t\t\t\t\t\t.addValue(new Value<Integer>(DataSetDataType.Int32, getRandomInt32()))\n\t\t\t\t\t\t.addValue(new Value<Long>(DataSetDataType.Int64, getRandomInt64()))\n\t\t\t\t\t\t.addValue(new Value<Short>(DataSetDataType.UInt8, getRandomUInt8()))\n\t\t\t\t\t\t.addValue(new Value<Integer>(DataSetDataType.UInt16, getRandomUInt16()))\n\t\t\t\t\t\t.addValue(new Value<Long>(DataSetDataType.UInt32, getRandomUInt32()))\n\t\t\t\t\t\t.addValue(new Value<BigInteger>(DataSetDataType.UInt64, getRandomUInt64()))\n\t\t\t\t\t\t.addValue(new Value<Float>(DataSetDataType.Float, random.nextFloat()))\n\t\t\t\t\t\t.addValue(new Value<Double>(DataSetDataType.Double, random.nextDouble()))\n\t\t\t\t\t\t.addValue(new Value<Boolean>(DataSetDataType.Boolean, random.nextBoolean()))\n\t\t\t\t\t\t.addValue(new Value<String>(DataSetDataType.String, UUID.randomUUID().toString()))\n\t\t\t\t\t\t.addValue(new 
Value<Date>(DataSetDataType.DateTime, new Date()))\n\t\t\t\t\t\t.addValue(new Value<String>(DataSetDataType.Text, UUID.randomUUID().toString())).createRow())\n\t\t\t\t.addRow(new RowBuilder().addValue(new Value<Byte>(DataSetDataType.Int8, getRandomInt8()))\n\t\t\t\t\t\t.addValue(new Value<Short>(DataSetDataType.Int16, getRandomInt16()))\n\t\t\t\t\t\t.addValue(new Value<Integer>(DataSetDataType.Int32, getRandomInt32()))\n\t\t\t\t\t\t.addValue(new Value<Long>(DataSetDataType.Int64, getRandomInt64()))\n\t\t\t\t\t\t.addValue(new Value<Short>(DataSetDataType.UInt8, getRandomUInt8()))\n\t\t\t\t\t\t.addValue(new Value<Integer>(DataSetDataType.UInt16, getRandomUInt16()))\n\t\t\t\t\t\t.addValue(new Value<Long>(DataSetDataType.UInt32, getRandomUInt32()))\n\t\t\t\t\t\t.addValue(new Value<BigInteger>(DataSetDataType.UInt64, getRandomUInt64()))\n\t\t\t\t\t\t.addValue(new Value<Float>(DataSetDataType.Float, random.nextFloat()))\n\t\t\t\t\t\t.addValue(new Value<Double>(DataSetDataType.Double, random.nextDouble()))\n\t\t\t\t\t\t.addValue(new Value<Boolean>(DataSetDataType.Boolean, random.nextBoolean()))\n\t\t\t\t\t\t.addValue(new Value<String>(DataSetDataType.String, UUID.randomUUID().toString()))\n\t\t\t\t\t\t.addValue(new Value<Date>(DataSetDataType.DateTime, new Date()))\n\t\t\t\t\t\t.addValue(new Value<String>(DataSetDataType.Text, UUID.randomUUID().toString())).createRow())\n\t\t\t\t.createDataSet();\n\t}\n\n\tprivate Template newSimpleTemplate(boolean isDef, String templateRef) throws SparkplugException {\n\t\tList<Metric> metrics = new ArrayList<Metric>();\n\t\tmetrics.add(new MetricBuilder(\"MyInt8\", MetricDataType.Int8, getRandomInt8()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyInt16\", MetricDataType.Int16, getRandomInt16()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyInt32\", MetricDataType.Int32, getRandomInt32()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyInt64\", MetricDataType.Int64, 
getRandomInt64()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUInt8\", MetricDataType.UInt8, getRandomUInt8()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUInt16\", MetricDataType.UInt16, getRandomUInt16()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUInt32\", MetricDataType.UInt32, getRandomUInt32()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUInt64\", MetricDataType.UInt64, getRandomUInt64()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyFloat\", MetricDataType.Float, random.nextFloat()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyDouble\", MetricDataType.Double, random.nextDouble()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyBoolean\", MetricDataType.Boolean, random.nextBoolean()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyString\", MetricDataType.String, getRandomString(10)).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyDateTime\", MetricDataType.DateTime, new Date()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyText\", MetricDataType.Text, getRandomString(10)).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUUID\", MetricDataType.UUID, UUID.randomUUID().toString()).createMetric());\n\n\t\treturn new TemplateBuilder().version(\"v1.0\").templateRef(templateRef).definition(isDef)\n\t\t\t\t.addParameters(newParams()).addMetrics(metrics).createTemplate();\n\t}\n\n\tprivate List<Parameter> newParams() throws SparkplugException {\n\t\tRandom random = new Random();\n\t\tList<Parameter> params = new ArrayList<Parameter>();\n\t\tparams.add(new Parameter(\"ParamInt32\", ParameterDataType.Int32, random.nextInt()));\n\t\tparams.add(new Parameter(\"ParamFloat\", ParameterDataType.Float, random.nextFloat()));\n\t\tparams.add(new Parameter(\"ParamDouble\", ParameterDataType.Double, random.nextDouble()));\n\t\tparams.add(new Parameter(\"ParamBoolean\", ParameterDataType.Boolean, random.nextBoolean()));\n\t\tparams.add(new Parameter(\"ParamString\", 
ParameterDataType.String, UUID.randomUUID().toString()));\n\t\treturn params;\n\t}\n\n\tprivate List<Metric> newComplexTemplateDefs() throws Exception {\n\t\tArrayList<Metric> metrics = new ArrayList<Metric>();\n\n\t\t// Add a new template \"subType\" definition with two primitive members\n\t\tmetrics.add(new MetricBuilder(\"subType\", MetricDataType.Template,\n\t\t\t\tnew TemplateBuilder().definition(true).addParameters(newParams())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"StringMember\", MetricDataType.String, \"value\").createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"IntegerMember\", MetricDataType.Int32, 0).createMetric())\n\t\t\t\t\t\t.createTemplate()).createMetric());\n\t\t// Add new template \"complexType\" definition that contains an instance of \"subType\" as a member\n\t\tmetrics.add(new MetricBuilder(\"complexType\", MetricDataType.Template, new TemplateBuilder().definition(true)\n\t\t\t\t.addParameters(newParams())\n\t\t\t\t.addMetric(new MetricBuilder(\"subType\", MetricDataType.Template, new TemplateBuilder().definition(false)\n\t\t\t\t\t\t.templateRef(\"subType\")\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"StringMember\", MetricDataType.String, \"value\").createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"IntegerMember\", MetricDataType.Int32, 0).createMetric())\n\t\t\t\t\t\t.createTemplate()).createMetric())\n\t\t\t\t.createTemplate()).createMetric());\n\n\t\treturn metrics;\n\t}\n\n\tprivate Template newComplexTemplateInstance() throws Exception {\n\t\t// Create and return the template\n\t\treturn new TemplateBuilder().definition(false).templateRef(\"complexType\").addParameters(newParams())\n\t\t\t\t.addMetric(new MetricBuilder(\"subType\", MetricDataType.Template, new TemplateBuilder().definition(false)\n\t\t\t\t\t\t.templateRef(\"subType\").addParameters(newParams())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"StringMember\", MetricDataType.String, \"myValue\").createMetric())\n\t\t\t\t\t\t.addMetric(new 
MetricBuilder(\"IntegerMember\", MetricDataType.Int32, 1).createMetric())\n\t\t\t\t\t\t.createTemplate()).createMetric())\n\t\t\t\t.createTemplate();\n\t}\n\n\tprivate byte getRandomInt8() {\n\t\tbyte[] bytes = new byte[1];\n\t\trandom.nextBytes(bytes);\n\t\treturn bytes[0];\n\t}\n\n\tprivate short getRandomInt16() {\n\t\treturn (short) random.nextInt(1 << 16);\n\t}\n\n\tprivate int getRandomInt32() {\n\t\treturn random.nextInt();\n\t}\n\n\tprivate long getRandomInt64() {\n\t\treturn random.nextLong();\n\t}\n\n\tprivate short getRandomUInt8() {\n\t\t// UInt8 range is 0 to 255\n\t\treturn (short) random.nextInt(256);\n\t}\n\n\tprivate int getRandomUInt16() {\n\t\t// UInt16 range is 0 to 65535\n\t\treturn ThreadLocalRandom.current().nextInt(0, 65536);\n\t}\n\n\tprivate long getRandomUInt32() {\n\t\treturn ThreadLocalRandom.current().nextLong(4294967296L);\n\t}\n\n\tprivate BigInteger getRandomUInt64() {\n\t\t// BigInteger(64, random) is uniformly distributed over [0, 2^64 - 1] - the full UInt64 range\n\t\treturn new BigInteger(64, random);\n\t}\n\n\tprivate String getRandomString(int length) {\n\t\treturn RandomStringUtils.randomAlphanumeric(length).toUpperCase();\n\t}\n}\n"
  },
  {
    "path": "java/compat_impl/edge/src/main/resources/logback.xml",
    "content": "<configuration>\n\n  <appender name=\"STDOUT\" class=\"ch.qos.logback.core.ConsoleAppender\">\n    <!-- encoders are assigned the type\n         ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->\n    <encoder>\n      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>\n    </encoder>\n  </appender>\n\n<!--\n  <logger name=\"org.eclipse.tahu.host.HostApplication\" level=\"TRACE\"/>\n-->\n  <root level=\"TRACE\">\n    <appender-ref ref=\"STDOUT\" />\n  </root>\n</configuration>\n"
  },
  {
    "path": "java/compat_impl/host/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>tahu-host-compat</artifactId>\n  <packaging>bundle</packaging>\n  <name>Tahu Host Compatible Implementation</name>\n\n  <dependencies>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-core</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-host</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n  </dependencies>\n\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          <skipNexusStagingDeployMojo>false</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            
<id>add-third-party</id>\n            <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n            <configuration>\n              <useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups> (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n                  Version 2.0</licenseMerge>\n              </licenseMerges>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.apache.felix</groupId>\n        <artifactId>maven-bundle-plugin</artifactId>\n        <version>${maven.bundle.version}</version>\n        <extensions>true</extensions>\n        <configuration>\n          <instructions>\n            <Export-Package>org.eclipse.tahu.*</Export-Package>\n            <Import-Package>*;resolution:=optional</Import-Package>\n          </instructions>\n        </configuration>\n        <executions>\n          <execution>\n            <id>bundle-manifest</id>\n            <phase>process-classes</phase>\n            <goals>\n              <goal>manifest</goal>\n            </goals>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-shade-plugin</artifactId>\n        <version>3.2.2</version>\n        <configuration>\n          <createDependencyReducedPom>false</createDependencyReducedPom>\n        </configuration>\n        <executions>\n          <execution>\n            <phase>package</phase>\n            <goals>\n              <goal>shade</goal>\n            </goals>\n            
<configuration>\n              <filters>\n                <filter>\n                  <artifact>*:*</artifact>\n                  <excludes>\n                    <exclude>META-INF/*.SF</exclude>\n                    <exclude>META-INF/*.DSA</exclude>\n                    <exclude>META-INF/*.RSA</exclude>\n                  </excludes>\n                </filter>\n              </filters>\n              <transformers>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ManifestResourceTransformer\">\n                  <mainClass>org.eclipse.tahu.host.SparkplugHostApplication</mainClass>\n                  <manifestEntries>\n                    <Tahu-Version>${project.version}</Tahu-Version>\n                    <Tahu-Build-Date>${timestamp}</Tahu-Build-Date>\n                  </manifestEntries>\n                </transformer>\n              </transformers>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n    </plugins>\n  </build>\n</project>\n"
  },
  {
    "path": "java/compat_impl/host/src/main/java/org/eclipse/tahu/host/CommandListener.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host;\n\nimport java.io.File;\nimport java.nio.charset.StandardCharsets;\nimport java.nio.file.FileSystems;\nimport java.util.Set;\nimport java.util.concurrent.Executors;\nimport java.util.concurrent.ScheduledExecutorService;\nimport java.util.concurrent.TimeUnit;\nimport java.util.stream.Collectors;\nimport java.util.stream.Stream;\n\nimport org.apache.commons.io.FileUtils;\nimport org.eclipse.tahu.exception.TahuErrorCode;\nimport org.eclipse.tahu.exception.TahuException;\nimport org.eclipse.tahu.message.model.Message;\nimport org.eclipse.tahu.util.MessageUtil;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class CommandListener implements Runnable {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(CommandListener.class.getName());\n\n\tprivate ScheduledExecutorService executor;\n\n\tprivate CommandPublisher commandPublisher;\n\n\tprivate File fileDirectory;\n\n\tprivate long scanRate;\n\n\tpublic CommandListener(CommandPublisher commandPublisher, String fileDirectoryPath, long scanRate) {\n\t\tthis.commandPublisher = commandPublisher;\n\t\tthis.fileDirectory = new File(fileDirectoryPath);\n\t\tthis.scanRate = scanRate;\n\t}\n\n\tpublic void start() throws TahuException {\n\t\tif (!fileDirectory.exists()) {\n\t\t\tlogger.info(\"Creating file command listener directory at {}\", fileDirectory.getPath());\n\t\t\tfileDirectory.mkdirs();\n\t\t} else if 
(!fileDirectory.isDirectory()) {\n\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\"The specified directory '\" + fileDirectory.getPath() + \"' is not a directory\");\n\t\t}\n\n\t\texecutor = Executors.newSingleThreadScheduledExecutor();\n\t\texecutor.scheduleWithFixedDelay(this, 0, scanRate, TimeUnit.MILLISECONDS);\n\t}\n\n\tpublic void shutdown() {\n\t\texecutor.shutdownNow();\n\t\texecutor = null;\n\t}\n\n\t@Override\n\tpublic void run() {\n\t\ttry {\n\t\t\tSet<String> fileNames = Stream.of(fileDirectory.listFiles()).filter(file -> !file.isDirectory())\n\t\t\t\t\t.map(File::getName).collect(Collectors.toSet());\n\n\t\t\tif (!fileNames.isEmpty()) {\n\t\t\t\tfor (String fileName : fileNames) {\n\t\t\t\t\tfileName = fileDirectory.getAbsolutePath() + FileSystems.getDefault().getSeparator() + fileName;\n\t\t\t\t\tlogger.info(\"Found file: {}\", fileName);\n\t\t\t\t\tFile commandFile = new File(fileName);\n\t\t\t\t\tString jsonContents = FileUtils.readFileToString(commandFile, StandardCharsets.UTF_8);\n\n\t\t\t\t\ttry {\n\t\t\t\t\t\tMessage message = MessageUtil.fromJsonString(jsonContents, true);\n\t\t\t\t\t\tlogger.info(\"Message: {}\", message);\n\n\t\t\t\t\t\tif (message != null) {\n\t\t\t\t\t\t\tcommandPublisher.publishCommand(message.getTopic(), message.getPayload());\n\n\t\t\t\t\t\t\tlogger.info(\"Deleting the file {}\", commandFile);\n\t\t\t\t\t\t\tcommandFile.delete();\n\t\t\t\t\t\t}\n\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\tlogger.error(\"Failed to handle input file {}\", fileName, e);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"File scanning in the Command Worker failed\", e);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/compat_impl/host/src/main/java/org/eclipse/tahu/host/SparkplugHostApplication.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host;\n\nimport java.util.ArrayList;\nimport java.util.Arrays;\nimport java.util.List;\n\nimport org.eclipse.tahu.exception.TahuException;\nimport org.eclipse.tahu.host.api.HostApplicationEventHandler;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.model.DeviceDescriptor;\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.Message;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.SparkplugDescriptor;\nimport org.eclipse.tahu.message.model.SparkplugMeta;\nimport org.eclipse.tahu.model.MqttServerDefinition;\nimport org.eclipse.tahu.mqtt.MqttClientId;\nimport org.eclipse.tahu.mqtt.MqttServerName;\nimport org.eclipse.tahu.mqtt.MqttServerUrl;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class SparkplugHostApplication implements HostApplicationEventHandler {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(SparkplugHostApplication.class.getName());\n\n\tprivate static final String COMMAND_LISTENER_DIRECTORY = \"/tmp/commands\";\n\tprivate static final long COMMAND_LISTENER_POLL_RATE = 50L;\n\n\tprivate static final String HOST_ID = \"IamHost\";\n\tprivate static final String MQTT_SERVER_NAME_1 = \"Mqtt Server One\";\n\tprivate static final String MQTT_CLIENT_ID_1 = \"Tahu_Host_Application\";\n\tprivate static final String MQTT_SERVER_URL_1 = 
\"tcp://localhost:1883\";\n\tprivate static final String USERNAME_1 = \"admin\";\n\tprivate static final String PASSWORD_1 = \"changeme\";\n\tprivate static final String MQTT_SERVER_NAME_2 = \"Mqtt Server Two\";\n\tprivate static final String MQTT_CLIENT_ID_2 = \"Tahu_Host_Application\";\n\tprivate static final String MQTT_SERVER_URL_2 = \"tcp://localhost:1884\";\n\tprivate static final String USERNAME_2 = null;\n\tprivate static final String PASSWORD_2 = null;\n\tprivate static final int KEEP_ALIVE_TIMEOUT = 30;\n\n\tprivate CommandListener commandListener;\n\tprivate HostApplication hostApplication;\n\n\tprivate static final List<MqttServerDefinition> mqttServerDefinitions = new ArrayList<>();\n\n\tpublic static void main(String[] arg) {\n\t\ttry {\n\t\t\tmqttServerDefinitions.add(new MqttServerDefinition(new MqttServerName(MQTT_SERVER_NAME_1),\n\t\t\t\t\tnew MqttClientId(MQTT_CLIENT_ID_1, false), new MqttServerUrl(MQTT_SERVER_URL_1), USERNAME_1,\n\t\t\t\t\tPASSWORD_1, KEEP_ALIVE_TIMEOUT, null));\n//\t\t\tmqttServerDefinitions.add(new MqttServerDefinition(new MqttServerName(MQTT_SERVER_NAME_2),\n//\t\t\t\t\tnew MqttClientId(MQTT_CLIENT_ID_2, false), new MqttServerUrl(MQTT_SERVER_URL_2), USERNAME_2,\n//\t\t\t\t\tPASSWORD_2, KEEP_ALIVE_TIMEOUT, null));\n\n\t\t\tSystem.out.println(\"Starting the Sparkplug Host Application\");\n\t\t\tSystem.out.println(\"\\tSparkplug Host Application ID: \" + HOST_ID);\n\t\t\tSystem.out.println(\"\\tKeep Alive Timeout: \" + KEEP_ALIVE_TIMEOUT);\n\t\t\tSystem.out.println(\"\\tCommand Listener Directory: \" + COMMAND_LISTENER_DIRECTORY);\n\t\t\tSystem.out.println(\"\\tCommand Listener Poll Rate: \" + COMMAND_LISTENER_POLL_RATE);\n\n\t\t\tfor (MqttServerDefinition mqttServerDefinition : mqttServerDefinitions) {\n\t\t\t\tSystem.out.println(\"\\tMQTT Server Name: \" + mqttServerDefinition.getMqttServerName());\n\t\t\t\tSystem.out.println(\"\\tMQTT Client ID: \" + 
mqttServerDefinition.getMqttClientId());\n\t\t\t\tSystem.out.println(\"\\tMQTT Server URL: \" + mqttServerDefinition.getMqttServerUrl());\n\t\t\t\tSystem.out.println(\"\\tUsername: \" + mqttServerDefinition.getUsername());\n\t\t\t\tSystem.out.println(\"\\tPassword: ********\");\n\t\t\t\tSystem.out.println(\"\\tKeep Alive Timeout: \" + mqttServerDefinition.getKeepAliveTimeout());\n\t\t\t}\n\n\t\t\t// Start the Host Application\n\t\t\tSparkplugHostApplication sparkplugHostApplication = new SparkplugHostApplication();\n\t\t\tsparkplugHostApplication.start();\n\n\t\t\t// Sleep a while\n\t\t\tThread.sleep(360000);\n\n\t\t\t// Shutdown\n\t\t\tsparkplugHostApplication.shutdown();\n\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to run the Host Application\", e);\n\t\t}\n\t}\n\n\tpublic SparkplugHostApplication() {\n\t\ttry {\n\t\t\thostApplication = new HostApplication(this, HOST_ID,\n\t\t\t\t\tnew ArrayList<>(Arrays.asList(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX + \"/#\")),\n\t\t\t\t\tmqttServerDefinitions, null, new SparkplugBPayloadDecoder());\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to create the HostApplication\", e);\n\t\t}\n\t}\n\n\tpublic void start() throws TahuException {\n\t\tcommandListener = new CommandListener(hostApplication, COMMAND_LISTENER_DIRECTORY, COMMAND_LISTENER_POLL_RATE);\n\t\tcommandListener.start();\n\t\thostApplication.start();\n\t}\n\n\tpublic void shutdown() {\n\t\tcommandListener.shutdown();\n\t\tcommandListener = null;\n\t\thostApplication.shutdown();\n\t}\n\n\t@Override\n\tpublic void onConnect() {\n\t\tlogger.info(\"onConnect...\");\n\t}\n\n\t@Override\n\tpublic void onDisconnect() {\n\t\tlogger.info(\"onDisconnect...\");\n\t}\n\n\t@Override\n\tpublic void onNodeBirthArrived(EdgeNodeDescriptor edgeNodeDescriptor, Message message) {\n\t\tlogger.info(\"onNodeBirthArrived from {}...\", edgeNodeDescriptor);\n\t}\n\n\t@Override\n\tpublic void onNodeBirthComplete(EdgeNodeDescriptor edgeNodeDescriptor) 
{\n\t\tlogger.info(\"onNodeBirthComplete from {}...\", edgeNodeDescriptor);\n\t}\n\n\t@Override\n\tpublic void onNodeDataArrived(EdgeNodeDescriptor edgeNodeDescriptor, Message message) {\n\t\tlogger.info(\"onNodeDataArrived from {}...\", edgeNodeDescriptor);\n\t}\n\n\t@Override\n\tpublic void onNodeDataComplete(EdgeNodeDescriptor edgeNodeDescriptor) {\n\t\tlogger.info(\"onNodeDataComplete from {}...\", edgeNodeDescriptor);\n\t}\n\n\t@Override\n\tpublic void onNodeDeath(EdgeNodeDescriptor edgeNodeDescriptor, Message message) {\n\t\tlogger.info(\"onNodeDeath from {}...\", edgeNodeDescriptor);\n\t}\n\n\t@Override\n\tpublic void onNodeDeathComplete(EdgeNodeDescriptor edgeNodeDescriptor) {\n\t\tlogger.info(\"onNodeDeathComplete from {}...\", edgeNodeDescriptor);\n\t}\n\n\t@Override\n\tpublic void onDeviceBirthArrived(DeviceDescriptor deviceDescriptor, Message message) {\n\t\tlogger.info(\"onDeviceBirthArrived from {}...\", deviceDescriptor);\n\t}\n\n\t@Override\n\tpublic void onDeviceBirthComplete(DeviceDescriptor deviceDescriptor) {\n\t\tlogger.info(\"onDeviceBirthComplete from {}...\", deviceDescriptor);\n\t}\n\n\t@Override\n\tpublic void onDeviceDataArrived(DeviceDescriptor deviceDescriptor, Message message) {\n\t\tlogger.info(\"onDeviceDataArrived from {}...\", deviceDescriptor);\n\t}\n\n\t@Override\n\tpublic void onDeviceDataComplete(DeviceDescriptor deviceDescriptor) {\n\t\tlogger.info(\"onDeviceDataComplete from {}...\", deviceDescriptor);\n\t}\n\n\t@Override\n\tpublic void onDeviceDeath(DeviceDescriptor deviceDescriptor, Message message) {\n\t\tlogger.info(\"onDeviceDeath from {}...\", deviceDescriptor);\n\t}\n\n\t@Override\n\tpublic void onDeviceDeathComplete(DeviceDescriptor deviceDescriptor) {\n\t\tlogger.info(\"onDeviceDeathComplete from {}...\", deviceDescriptor);\n\t}\n\n\t@Override\n\tpublic void onBirthMetric(SparkplugDescriptor sparkplugDescriptor, Metric metric) {\n\t\tlogger.info(\"onBirthMetric from {} with metric={}...\", sparkplugDescriptor, 
metric);\n\t}\n\n\t@Override\n\tpublic void onDataMetric(SparkplugDescriptor sparkplugDescriptor, Metric metric) {\n\t\tlogger.info(\"onDataMetric from {} with metric={}...\", sparkplugDescriptor, metric);\n\t}\n\n\tpublic void onStale(SparkplugDescriptor sparkplugDescriptor, Metric metric) {\n\t\tlogger.info(\"onStale from {} for {}...\", sparkplugDescriptor, metric.getName());\n\t}\n\n\t@Override\n\tpublic void onMessage(SparkplugDescriptor sparkplugDescriptor, Message message) {\n\t\tlogger.info(\"onMessage from {} with message={}...\", sparkplugDescriptor, message);\n\t}\n}\n"
  },
  {
    "path": "java/compat_impl/host/src/main/resources/logback.out.xml",
    "content": ""
  },
  {
    "path": "java/compat_impl/host/src/main/resources/logback.xml",
    "content": "<configuration>\n\n  <appender name=\"STDOUT\" class=\"ch.qos.logback.core.ConsoleAppender\">\n    <!-- encoders are assigned the type\n         ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->\n    <encoder>\n      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>\n    </encoder>\n  </appender>\n\n<!--\n  <logger name=\"org.eclipse.tahu.host.HostApplication\" level=\"TRACE\"/>\n-->\n  <root level=\"TRACE\">\n    <appender-ref ref=\"STDOUT\" />\n  </root>\n</configuration>\n"
  },
  {
    "path": "java/examples/device_timestamp/THIRD-PARTY.txt",
    "content": "\nLists of 5 third-party dependencies.\n     (New BSD license) Protocol Buffer Java API (com.google.protobuf:protobuf-java:2.6.1 - https://developers.google.com/protocol-buffers/)\n     (The Apache Software License, Version 2.0) Apache Log4j (log4j:log4j:1.2.17 - http://logging.apache.org/log4j/1.2/)\n     (Eclipse Public License - Version 1.0) org.eclipse.paho.client.mqttv3 (org.eclipse.paho:org.eclipse.paho.client.mqttv3:1.0.2 - http://www.eclipse.org/paho/org.eclipse.paho.client.mqttv3)\n     (MIT License) SLF4J API Module (org.slf4j:slf4j-api:1.7.5 - http://www.slf4j.org)\n     (MIT License) SLF4J LOG4J-12 Binding (org.slf4j:slf4j-log4j12:1.7.5 - http://www.slf4j.org)\n"
  },
  {
    "path": "java/examples/device_timestamp/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu-examples</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>example_device_timestamp</artifactId>\n  <packaging>jar</packaging>\n  <name>Sparkplug B Device Timestamp Example</name>\n\n  <dependencies>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-core</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n  </dependencies>\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          <skipNexusStagingDeployMojo>true</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n\n      <!-- New Build mechanism - replaces maven-assembly-plugin -->\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-shade-plugin</artifactId>\n        <version>2.4.1</version>\n        <executions>\n          <execution>\n            <phase>package</phase>\n            <goals>\n              <goal>shade</goal>\n 
           </goals>\n            <configuration>\n              <filters>\n                <filter>\n                  <artifact>*:*</artifact>\n                  <excludes>\n                    <exclude>META-INF/*.SF</exclude>\n                    <exclude>META-INF/*.DSA</exclude>\n                    <exclude>META-INF/*.RSA</exclude>\n                  </excludes>\n                </filter>\n              </filters>\n              <transformers>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ManifestResourceTransformer\">\n                  <mainClass>org.eclipse.tahu.SparkplugExample</mainClass>\n                </transformer>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ServicesResourceTransformer\" />\n              </transformers>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            <id>add-third-party</id>\n            <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n            <configuration>\n              <useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups> (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n                  Version 2.0</licenseMerge>\n              </licenseMerges>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n    
</plugins>\n  </build>\n</project>\n"
  },
  {
    "path": "java/examples/device_timestamp/src/main/java/org/eclipse/tahu/SparkplugExample.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu;\n\nimport static org.eclipse.tahu.message.model.MetricDataType.Boolean;\nimport static org.eclipse.tahu.message.model.MetricDataType.Int32;\nimport static org.eclipse.tahu.message.model.MetricDataType.Int64;\n\nimport java.util.ArrayList;\nimport java.util.Calendar;\nimport java.util.Date;\nimport java.util.List;\nimport java.util.concurrent.ExecutorService;\nimport java.util.concurrent.Executors;\n\nimport javax.net.SocketFactory;\nimport javax.net.ssl.SSLSocketFactory;\n\nimport org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;\nimport org.eclipse.paho.client.mqttv3.MqttCallbackExtended;\nimport org.eclipse.paho.client.mqttv3.MqttClient;\nimport org.eclipse.paho.client.mqttv3.MqttConnectOptions;\nimport org.eclipse.paho.client.mqttv3.MqttException;\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.paho.client.mqttv3.MqttPersistenceException;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\n\n/**\n * An example Sparkplug B application.\n */\npublic class SparkplugExample implements MqttCallbackExtended {\n\n\tprivate static final String NAMESPACE = 
\"spBv1.0\";\n\n\t// Configuration\n\tprivate static final boolean USING_REAL_TLS = false;\n\tprivate String serverUrl = \"tcp://192.168.1.53:1883\";\n\tprivate String groupId = \"Sparkplug B Devices\";\n\tprivate String edgeNode = \"Java Sparkplug B Example\";\n\tprivate String deviceId = \"SparkplugBExample\";\n\tprivate String clientId = \"SparkplugBExampleEdgeNode\";\n\tprivate String username = \"admin\";\n\tprivate String password = \"changeme\";\n\tprivate long PUBLISH_PERIOD = 1; // Publish period in milliseconds\n\tprivate ExecutorService executor;\n\tprivate MqttClient client;\n\n\tprivate int index = 0;\n\tprivate Calendar calendar = Calendar.getInstance();\n\n\tprivate int bdSeq = 0;\n\tprivate int seq = 0;\n\n\tprivate Object seqLock = new Object();\n\n\tpublic static void main(String[] args) {\n\t\tSparkplugExample example = new SparkplugExample();\n\t\texample.run();\n\t}\n\n\tpublic void run() {\n\t\ttry {\n\t\t\t// Random generator and thread pool for outgoing published messages\n\t\t\texecutor = Executors.newFixedThreadPool(1);\n\n\t\t\t// Build up DEATH payload - note DEATH payloads don't have a regular sequence number\n\t\t\tSparkplugBPayloadBuilder deathPayload = new SparkplugBPayloadBuilder().setTimestamp(new Date());\n\t\t\tdeathPayload = addBdSeqNum(deathPayload);\n\t\t\tbyte[] deathBytes = new SparkplugBPayloadEncoder().getBytes(deathPayload.createPayload(), false);\n\n\t\t\tMqttConnectOptions options = new MqttConnectOptions();\n\n\t\t\tif (USING_REAL_TLS) {\n\t\t\t\tSocketFactory sf = SSLSocketFactory.getDefault();\n\t\t\t\toptions.setSocketFactory(sf);\n\t\t\t}\n\n\t\t\t// Connect to the MQTT Server\n\t\t\toptions.setAutomaticReconnect(true);\n\t\t\toptions.setCleanSession(true);\n\t\t\toptions.setConnectionTimeout(30);\n\t\t\toptions.setKeepAliveInterval(30);\n\t\t\toptions.setUserName(username);\n\t\t\toptions.setPassword(password.toCharArray());\n\t\t\toptions.setWill(NAMESPACE + \"/\" + groupId + \"/NDEATH/\" + edgeNode, deathBytes, 
0, false);\n\t\t\tclient = new MqttClient(serverUrl, clientId);\n\t\t\tclient.setTimeToWait(30000); // short timeout on failure to connect\n\t\t\tclient.setCallback(this);\n\t\t\tclient.connect(options);\n\n\t\t\t// Subscribe to control/command messages for both the edge of network node and the attached devices\n\t\t\tclient.subscribe(NAMESPACE + \"/\" + groupId + \"/NCMD/\" + edgeNode + \"/#\", 0);\n\t\t\tclient.subscribe(NAMESPACE + \"/\" + groupId + \"/DCMD/\" + edgeNode + \"/#\", 0);\n\n\t\t\tList<Metric> nodeMetrics = new ArrayList<Metric>();\n\t\t\tList<Metric> deviceMetrics = new ArrayList<Metric>();\n\n\t\t\t// Loop forever publishing data every PUBLISH_PERIOD\n\t\t\twhile (true) {\n\t\t\t\tThread.sleep(PUBLISH_PERIOD);\n\n\t\t\t\tsynchronized (seqLock) {\n\t\t\t\t\tif (client.isConnected()) {\n\n\t\t\t\t\t\tSystem.out.println(\"Time: \" + calendar.getTimeInMillis() + \"  Index: \" + index);\n\n\t\t\t\t\t\t// Add a 'real time' metric\n\t\t\t\t\t\tnodeMetrics.add(new MetricBuilder(\"MyNodeMetric\", Int32, index).timestamp(calendar.getTime())\n\t\t\t\t\t\t\t\t.createMetric());\n\n\t\t\t\t\t\t// Add a 'real time' metric\n\t\t\t\t\t\tdeviceMetrics.add(new MetricBuilder(\"MyDeviceMetric\", Int32, index + 50)\n\t\t\t\t\t\t\t\t.timestamp(calendar.getTime()).createMetric());\n\n\t\t\t\t\t\t// Publish, increment the calendar and index and reset\n\t\t\t\t\t\tcalendar.add(Calendar.MILLISECOND, 1);\n\t\t\t\t\t\tif (index == 50) {\n\t\t\t\t\t\t\tindex = 0;\n\n\t\t\t\t\t\t\tSystem.out.println(\"nodeMetrics: \" + nodeMetrics.size());\n\t\t\t\t\t\t\tSystem.out.println(\"deviceMetrics: \" + deviceMetrics.size());\n\n\t\t\t\t\t\t\tSparkplugBPayload nodePayload =\n\t\t\t\t\t\t\t\t\tnew SparkplugBPayload(new Date(), nodeMetrics, getSeqNum(), null, null);\n\n\t\t\t\t\t\t\tclient.publish(NAMESPACE + \"/\" + groupId + \"/NDATA/\" + edgeNode,\n\t\t\t\t\t\t\t\t\tnew SparkplugBPayloadEncoder().getBytes(nodePayload, false), 0, false);\n\n\t\t\t\t\t\t\tSparkplugBPayload devicePayload 
=\n\t\t\t\t\t\t\t\t\tnew SparkplugBPayload(new Date(), deviceMetrics, getSeqNum(), null, null);\n\n\t\t\t\t\t\t\tclient.publish(NAMESPACE + \"/\" + groupId + \"/DDATA/\" + edgeNode + \"/\" + deviceId,\n\t\t\t\t\t\t\t\t\tnew SparkplugBPayloadEncoder().getBytes(devicePayload, false), 0, false);\n\n\t\t\t\t\t\t\tnodeMetrics = new ArrayList<Metric>();\n\t\t\t\t\t\t\tdeviceMetrics = new ArrayList<Metric>();\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tindex++;\n\t\t\t\t\t\t}\n\t\t\t\t\t} else {\n\t\t\t\t\t\tSystem.out.println(\"Not connected - not publishing data\");\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\tprivate void publishBirth() {\n\t\ttry {\n\t\t\tsynchronized (seqLock) {\n\t\t\t\t// Reset the sequence number\n\t\t\t\tseq = 0;\n\n\t\t\t\t// Reset the index and time\n\t\t\t\tindex = 0;\n\t\t\t\tcalendar = Calendar.getInstance();\n\n\t\t\t\t// Create the BIRTH payload and set the position and other metrics\n\t\t\t\tSparkplugBPayload payload =\n\t\t\t\t\t\tnew SparkplugBPayload(calendar.getTime(), new ArrayList<Metric>(), getSeqNum(), null, null);\n\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"bdSeq\", Int64, (long) bdSeq).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Node Control/Rebirth\", Boolean, false).createMetric());\n\n\t\t\t\tpayload.addMetric(\n\t\t\t\t\t\tnew MetricBuilder(\"MyNodeMetric\", Int32, index).timestamp(calendar.getTime()).createMetric());\n\n\t\t\t\tSystem.out.println(\"Publishing Edge Node Birth\");\n\t\t\t\texecutor.execute(new Publisher(NAMESPACE + \"/\" + groupId + \"/NBIRTH/\" + edgeNode, payload));\n\n\t\t\t\t// Create the payload and add a metric\n\t\t\t\tpayload = new SparkplugBPayload(calendar.getTime(), new ArrayList<Metric>(), getSeqNum(), null, null);\n\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"MyDeviceMetric\", Int32, index + 50).timestamp(calendar.getTime())\n\t\t\t\t\t\t.createMetric());\n\n\t\t\t\tSystem.out.println(\"Publishing Device 
Birth\");\n\t\t\t\texecutor.execute(\n\t\t\t\t\t\tnew Publisher(NAMESPACE + \"/\" + groupId + \"/DBIRTH/\" + edgeNode + \"/\" + deviceId, payload));\n\n\t\t\t\t// Increment the global vars\n\t\t\t\tcalendar.add(Calendar.MILLISECOND, 1);\n\t\t\t\tindex++;\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\t// Used to add the birth/death sequence number\n\tprivate SparkplugBPayloadBuilder addBdSeqNum(SparkplugBPayloadBuilder payload) throws Exception {\n\t\tif (payload == null) {\n\t\t\tpayload = new SparkplugBPayloadBuilder();\n\t\t}\n\t\tif (bdSeq == 256) {\n\t\t\tbdSeq = 0;\n\t\t}\n\t\tpayload.addMetric(new MetricBuilder(\"bdSeq\", Int64, (long) bdSeq).createMetric());\n\t\tbdSeq++;\n\t\treturn payload;\n\t}\n\n\t// Used to add the sequence number\n\tprivate long getSeqNum() throws Exception {\n\t\tSystem.out.println(\"seq: \" + seq);\n\t\tif (seq == 256) {\n\t\t\tseq = 0;\n\t\t}\n\t\treturn seq++;\n\t}\n\n\t@Override\n\tpublic void connectComplete(boolean reconnect, String serverURI) {\n\t\tSystem.out.println(\"Connected! - publishing birth\");\n\t\tpublishBirth();\n\t}\n\n\tpublic void connectionLost(Throwable cause) {\n\t\tcause.printStackTrace();\n\t\tSystem.out.println(\"The MQTT Connection was lost! 
- will auto-reconnect\");\n\t}\n\n\tpublic void messageArrived(String topic, MqttMessage message) throws Exception {\n\t\tSystem.out.println(\"Message Arrived on topic \" + topic);\n\n\t\tSparkplugBPayloadDecoder decoder = new SparkplugBPayloadDecoder();\n\t\tSparkplugBPayload inboundPayload = decoder.buildFromByteArray(message.getPayload(), null);\n\n\t\t// Debug\n\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\tSystem.out.println(\"Metric \" + metric.getName() + \"=\" + metric.getValue());\n\t\t}\n\n\t\tString[] splitTopic = topic.split(\"/\");\n\t\tif (splitTopic[0].equals(NAMESPACE) && splitTopic[1].equals(groupId) && splitTopic[2].equals(\"NCMD\")\n\t\t\t\t&& splitTopic[3].equals(edgeNode)) {\n\t\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\t\tif (\"Node Control/Rebirth\".equals(metric.getName()) && ((Boolean) metric.getValue())) {\n\t\t\t\t\tpublishBirth();\n\t\t\t\t} else {\n\t\t\t\t\tSystem.out.println(\"Unknown Node Command NCMD: \" + metric.getName());\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\tpublic void deliveryComplete(IMqttDeliveryToken token) {\n\t\tSystem.out.println(\"Published message: \" + token);\n\t}\n\n\tprivate class Publisher implements Runnable {\n\n\t\tprivate String topic;\n\t\tprivate SparkplugBPayload outboundPayload;\n\n\t\tpublic Publisher(String topic, SparkplugBPayload outboundPayload) {\n\t\t\tthis.topic = topic;\n\t\t\tthis.outboundPayload = outboundPayload;\n\t\t}\n\n\t\tpublic void run() {\n\t\t\ttry {\n\t\t\t\toutboundPayload.setTimestamp(new Date());\n\t\t\t\tSparkplugBPayloadEncoder encoder = new SparkplugBPayloadEncoder();\n\t\t\t\tclient.publish(topic, encoder.getBytes(outboundPayload, false), 0, false);\n\t\t\t} catch (MqttPersistenceException e) {\n\t\t\t\te.printStackTrace();\n\t\t\t} catch (MqttException e) {\n\t\t\t\te.printStackTrace();\n\t\t\t} catch (Exception e) {\n\t\t\t\te.printStackTrace();\n\t\t\t}\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/examples/edge_node_control/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu-examples</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>example_edge_node_control</artifactId>\n  <packaging>jar</packaging>\n  <name>Sparkplug B Edge Node Control Example</name>\n\n  <dependencies>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-core</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n  </dependencies>\n\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          <skipNexusStagingDeployMojo>true</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n\n      <!-- New Build mechanism - replaces maven-assembly-plugin -->\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-shade-plugin</artifactId>\n        <version>2.4.1</version>\n        <executions>\n          <execution>\n            <phase>package</phase>\n            <goals>\n              
<goal>shade</goal>\n            </goals>\n            <configuration>\n              <filters>\n                <filter>\n                  <artifact>*:*</artifact>\n                  <excludes>\n                    <exclude>META-INF/*.SF</exclude>\n                    <exclude>META-INF/*.DSA</exclude>\n                    <exclude>META-INF/*.RSA</exclude>\n                  </excludes>\n                </filter>\n              </filters>\n              <transformers>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ManifestResourceTransformer\">\n                  <mainClass>org.eclipse.tahu.SparkplugExample</mainClass>\n                </transformer>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ServicesResourceTransformer\" />\n              </transformers>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            <id>add-third-party</id>\n            <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n            <configuration>\n              <useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups>\n                (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n                  Version 2.0</licenseMerge>\n              </licenseMerges>\n            </configuration>\n          </execution>\n       
 </executions>\n      </plugin>\n    </plugins>\n  </build>\n\n</project>\n"
  },
  {
    "path": "java/examples/edge_node_control/src/main/java/org/eclipse/tahu/SparkplugExample.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu;\n\nimport java.io.BufferedReader;\nimport java.io.IOException;\nimport java.io.InputStreamReader;\nimport java.util.UUID;\n\nimport org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;\nimport org.eclipse.paho.client.mqttv3.MqttCallbackExtended;\nimport org.eclipse.paho.client.mqttv3.MqttClient;\nimport org.eclipse.paho.client.mqttv3.MqttConnectOptions;\nimport org.eclipse.paho.client.mqttv3.MqttException;\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.paho.client.mqttv3.MqttPersistenceException;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.MessageType;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.message.model.Topic;\nimport org.eclipse.tahu.util.TopicUtil;\nimport org.slf4j.LoggerFactory;\n\nimport com.fasterxml.jackson.annotation.JsonInclude.Include;\nimport com.fasterxml.jackson.databind.ObjectMapper;\n\nimport ch.qos.logback.classic.Level;\nimport ch.qos.logback.classic.Logger;\n\npublic class SparkplugExample implements MqttCallbackExtended {\n\n\tprivate static final String NAMESPACE = \"spBv1.0\";\n\n\tstatic {\n\t\t((Logger) 
LoggerFactory.getLogger(Logger.ROOT_LOGGER_NAME)).setLevel(Level.OFF);\n\t}\n\n\t// Configuration\n\tprivate String serverUrl = \"tcp://localhost:1883\";\n\tprivate String groupId;\n\tprivate String edgeNode;\n\tprivate String clientId = UUID.randomUUID().toString();\n\tprivate String username = \"admin\";\n\tprivate String password = \"changeme\";\n\tprivate MqttClient client;\n\n\tpublic SparkplugExample(String groupId, String edgeNodeId) {\n\t\tthis.groupId = groupId;\n\t\tthis.edgeNode = edgeNodeId;\n\t}\n\n\tpublic static void main(String[] args) {\n\t\tif (args.length < 2) {\n\t\t\tSystem.out.println(\"Usage: SparkplugExample <groupId> <edgeNodeId>\");\n\t\t\treturn;\n\t\t}\n\t\tSparkplugExample example = new SparkplugExample(args[0], args[1]);\n\t\texample.run();\n\t}\n\n\tpublic void run() {\n\t\ttry {\n\t\t\t// Connect to the MQTT Server\n\t\t\tMqttConnectOptions options = new MqttConnectOptions();\n\t\t\toptions.setAutomaticReconnect(true);\n\t\t\toptions.setCleanSession(true);\n\t\t\toptions.setConnectionTimeout(30);\n\t\t\toptions.setKeepAliveInterval(30);\n\t\t\toptions.setUserName(username);\n\t\t\toptions.setPassword(password.toCharArray());\n\t\t\tclient = new MqttClient(serverUrl, clientId);\n\t\t\tclient.setTimeToWait(2000);\n\t\t\tclient.setCallback(this);\n\t\t\tclient.connect(options);\n\n\t\t\t// Subscribe to control/command messages for both the edge of network node and the attached devices\n\t\t\tclient.subscribe(NAMESPACE + \"/\" + groupId + \"/+/\" + edgeNode, 0);\n\t\t\tclient.subscribe(NAMESPACE + \"/\" + groupId + \"/+/\" + edgeNode + \"/#\", 0);\n\n\t\t\t// Loop to receive input commands\n\t\t\twhile (true) {\n\t\t\t\tSystem.out.print(\"\\n> \");\n\n\t\t\t\tBufferedReader br = new BufferedReader(new InputStreamReader(System.in));\n\t\t\t\tString line = br.readLine();\n\n\t\t\t\thandleCommand(line);\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\tprivate void handleCommand(String command)\n\t\t\tthrows SparkplugException, MqttPersistenceException, MqttException, IOException {\n\t\tString[] tokens = command.split(\" 
\");\n\n\t\tif (tokens.length > 0) {\n\t\t\tString cmd = tokens[0];\n\t\t\tif (cmd.isEmpty()) {\n\t\t\t\treturn;\n\t\t\t}\n\t\t\tif (cmd.equals(\"?\") || cmd.equalsIgnoreCase(\"help\")) {\n\t\t\t\t// Help with commands\n\t\t\t\tSystem.out.println(\"\\nCOMMANDS\");\n\t\t\t\tSystem.out.println(\" - rebirth: Publishes a rebirth command to the Edge Node\");\n\t\t\t\tSystem.out.println(\"     usage: rebirth\");\n\t\t\t\treturn;\n\t\t\t} else if (cmd.equalsIgnoreCase(\"rebirth\")) {\n\t\t\t\t// Issue a rebirth\n\t\t\t\tclient.publish(NAMESPACE + \"/\" + groupId + \"/NCMD/\" + edgeNode,\n\t\t\t\t\t\tnew SparkplugBPayloadEncoder().getBytes(new SparkplugBPayloadBuilder().addMetric(\n\t\t\t\t\t\t\t\tnew MetricBuilder(\"Node Control/Rebirth\", MetricDataType.Boolean, true).createMetric())\n\t\t\t\t\t\t\t\t.createPayload(), false),\n\t\t\t\t\t\t0, false);\n\t\t\t\treturn;\n\t\t\t}\n\t\t}\n\n\t\tSystem.out.println(\"\\nInvalid command: \" + command);\n\t}\n\n\t@Override\n\tpublic void connectComplete(boolean reconnect, String serverURI) {\n\t\tSystem.out.println(\"Connected!\");\n\t}\n\n\t@Override\n\tpublic void connectionLost(Throwable cause) {\n\t\tSystem.out.println(\"The MQTT Connection was lost! 
- will auto-reconnect\");\n\t}\n\n\t@Override\n\tpublic void messageArrived(String topic, MqttMessage message) throws Exception {\n\t\tTopic sparkplugTopic = TopicUtil.parseTopic(topic);\n\t\tObjectMapper mapper = new ObjectMapper();\n\t\tmapper.setSerializationInclusion(Include.NON_NULL);\n\n\t\tSparkplugBPayloadDecoder decoder = new SparkplugBPayloadDecoder();\n\t\tSparkplugBPayload inboundPayload = decoder.buildFromByteArray(message.getPayload(), null);\n\n\t\tif (sparkplugTopic.isType(MessageType.NBIRTH)) {\n\t\t\ttry {\n\t\t\t\tSystem.out.println(\"\\n\\nReceived Node Birth\");\n\t\t\t\tSystem.out.println(mapper.writerWithDefaultPrettyPrinter().writeValueAsString(inboundPayload));\n\t\t\t\tSystem.out.print(\"\\n\\n> \");\n\t\t\t} catch (Exception e) {\n\t\t\t\te.printStackTrace();\n\t\t\t}\n\t\t}\n\t}\n\n\t@Override\n\tpublic void deliveryComplete(IMqttDeliveryToken token) {\n\t\t// System.out.println(\"Published message: \" + token);\n\t}\n}\n"
  },
  {
    "path": "java/examples/host_file/THIRD-PARTY.txt",
    "content": "\nLists of 5 third-party dependencies.\n     (New BSD license) Protocol Buffer Java API (com.google.protobuf:protobuf-java:2.6.1 - https://developers.google.com/protocol-buffers/)\n     (The Apache Software License, Version 2.0) Apache Log4j (log4j:log4j:1.2.17 - http://logging.apache.org/log4j/1.2/)\n     (Eclipse Public License - Version 1.0) org.eclipse.paho.client.mqttv3 (org.eclipse.paho:org.eclipse.paho.client.mqttv3:1.0.2 - http://www.eclipse.org/paho/org.eclipse.paho.client.mqttv3)\n     (MIT License) SLF4J API Module (org.slf4j:slf4j-api:1.7.5 - http://www.slf4j.org)\n     (MIT License) SLF4J LOG4J-12 Binding (org.slf4j:slf4j-log4j12:1.7.5 - http://www.slf4j.org)\n"
  },
  {
    "path": "java/examples/host_file/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu-examples</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>example_host_file</artifactId>\n  <packaging>jar</packaging>\n  <name>Sparkplug B Host File Example</name>\n\n  <dependencies>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-core</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>javax.xml.bind</groupId>\n      <artifactId>jaxb-api</artifactId>\n      <version>2.3.0</version>\n    </dependency>\n  </dependencies>\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          <skipNexusStagingDeployMojo>true</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n\n      <!-- New Build mechanism - replaces maven-assembly-plugin -->\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-shade-plugin</artifactId>\n        
<version>2.4.1</version>\n        <executions>\n          <execution>\n            <phase>package</phase>\n            <goals>\n              <goal>shade</goal>\n            </goals>\n            <configuration>\n              <filters>\n                <filter>\n                  <artifact>*:*</artifact>\n                  <excludes>\n                    <exclude>META-INF/*.SF</exclude>\n                    <exclude>META-INF/*.DSA</exclude>\n                    <exclude>META-INF/*.RSA</exclude>\n                  </excludes>\n                </filter>\n              </filters>\n              <transformers>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ManifestResourceTransformer\">\n                  <mainClass>org.eclipse.tahu.SparkplugExample</mainClass>\n                </transformer>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ServicesResourceTransformer\" />\n              </transformers>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            <id>add-third-party</id>\n            <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n            <configuration>\n              <useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups> (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n              
    Version 2.0</licenseMerge>\n              </licenseMerges>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n    </plugins>\n  </build>\n</project>\n"
  },
  {
    "path": "java/examples/host_file/src/main/java/org/eclipse/tahu/example/host/file/FileAssembler.java",
    "content": "/********************************************************************************\n * Copyright (c) 2020-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.example.host.file;\n\nimport java.io.File;\nimport java.io.IOException;\nimport java.util.Date;\nimport java.util.concurrent.ExecutorService;\n\nimport org.apache.commons.io.FileUtils;\nimport org.eclipse.paho.client.mqttv3.MqttClient;\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\nimport org.eclipse.tahu.example.host.file.model.EdgeNode;\nimport org.eclipse.tahu.example.host.file.model.FilePublishStatus;\nimport org.eclipse.tahu.example.host.file.util.FileValidationUtils;\nimport org.eclipse.tahu.message.model.MessageType;\nimport org.eclipse.tahu.message.model.MetaData;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.PropertySet;\nimport org.eclipse.tahu.message.model.PropertyValue;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.message.model.Topic;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\n/**\n * Defines file assembler\n */\npublic class FileAssembler {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(FileAssembler.class.getName());\n\n\t// Configurable constants\n\tprivate static final String FOLDER_PATH = \"/tmp/receiver/\";\n\tprivate static final boolean REPLACE_EXISTING_FILE = true; 
// Use false for 'keep existing file'\n\tprivate static final int MAX_NUMBER_RETRIES = 3;\n\n\t// Non-configurable constants\n\tprivate static final String TAG_PROVIDER_PROP_NAME = \"filePublishingTagProvider\";\n\tprivate static final String TAG_FOLDER_PATH_PROP_NAME = \"filePublishingTagFolderPath\";\n\tprivate static final String LAST_SEQ_NUM_PUBLISHED = \"Last Published Sequence Number\";\n\tprivate static final String PUBLISH_FILE_STATUS_CODE = \"Publish Operation Status Code\";\n\tprivate static final String FILE_SEPARATOR = System.getProperty(\"file.separator\");\n\n\tprivate final ExecutorService executor;\n\tprivate final MqttClient client;\n\n\tprivate final String name;\n\tprivate String filename;\n\tprivate final EdgeNode edgeNode;\n\tprivate String deviceId;\n\tprivate long lastSeqNumProcessed = -1L;\n\tprivate boolean multipart;\n\tprivate String expectedMd5;\n\tprivate long numberOfFileChunks;\n\tprivate int retryCnt;\n\n\t/**\n\t * FileAssembler constructor\n\t * \n\t * @param name - file assembler's name as {@link String}\n\t * @param edgeNode - Edge Node as {@link EdgeNode}\n\t */\n\tpublic FileAssembler(ExecutorService executor, MqttClient client, String name, EdgeNode edgeNode) {\n\t\tthis.executor = executor;\n\t\tthis.client = client;\n\t\tthis.name = name;\n\t\tthis.edgeNode = edgeNode;\n\t}\n\n\t/**\n\t * FileAssembler constructor\n\t * \n\t * @param name - file assembler's name as {@link String}\n\t * @param edgeNode - Edge Node as {@link EdgeNode}\n\t * @param deviceId - Device Id as {@link String}\n\t */\n\tpublic FileAssembler(ExecutorService executor, MqttClient client, String name, EdgeNode edgeNode, String deviceId) {\n\t\tthis(executor, client, name, edgeNode);\n\t\tthis.deviceId = deviceId;\n\t}\n\n\t/*\n\t * Process supplied metric\n\t */\n\tpublic FilePublishStatus processMetric(Metric metric) {\n\t\tif (!isValidMetric(metric)) {\n\t\t\treturn FilePublishStatus.INVALID_METRICS;\n\t\t}\n\t\tMetaData metaData = 
metric.getMetaData();\n\t\tif (metaData.getSeq() == 0 && metaData.isMultiPart()) {\n\t\t\tmultipart = true;\n\t\t}\n\t\tif (multipart && !metaData.isMultiPart()) {\n\t\t\treturn FilePublishStatus.INVALID_METRICS;\n\t\t}\n\t\tFilePublishStatus fileAssemblerStatus = null;\n\t\tif (multipart) {\n\t\t\tfileAssemblerStatus = processFileMetricMultipart(metric);\n\t\t\tif (metaData.getSeq() > 0) {\n\t\t\t\tpublishAckCommand(metric, fileAssemblerStatus);\n\t\t\t}\n\t\t\tlogger.trace(\"The FileAssemblerStatus is {}\", fileAssemblerStatus);\n\t\t\tif (!(fileAssemblerStatus == FilePublishStatus.CONTINUE\n\t\t\t\t\t|| fileAssemblerStatus == FilePublishStatus.SUCCESS)) {\n\t\t\t\tString fullDstFilePath = formAbsoluteDstFilePath(formDstFolderPath(), true);\n\t\t\t\tlogger.trace(\"Deleting partial {} file\", fullDstFilePath);\n\t\t\t\tFileUtils.deleteQuietly(new File(fullDstFilePath));\n\t\t\t}\n\t\t} else {\n\t\t\tlogger.debug(\"About to process non-multipart file metric {}\", metaData.getFileName());\n\t\t\tfileAssemblerStatus = processFileMetric(metric);\n\t\t\tpublishAckCommand(metric, fileAssemblerStatus);\n\t\t}\n\t\treturn fileAssemblerStatus;\n\t}\n\n\t/*\n\t * Reports file assembler name\n\t */\n\tString getName() {\n\t\treturn name;\n\t}\n\n\t/*\n\t * Processes metric of non-multipart file transfer\n\t */\n\tprivate FilePublishStatus processFileMetric(Metric metric) {\n\t\tMetaData metaData = metric.getMetaData();\n\t\tfilename = formDstFilename(metaData.getFileName());\n\t\texpectedMd5 = metaData.getMd5();\n\n\t\torg.eclipse.tahu.message.model.File fileValue = (org.eclipse.tahu.message.model.File) metric.getValue();\n\t\tbyte[] fileData = fileValue.getBytes();\n\n\t\tif (!isValidMd5Sum(fileData, metaData.getMd5())) {\n\t\t\tlogger.debug(\"MD5_ERR on {}\", filename);\n\t\t\treturn FilePublishStatus.MD5_ERR;\n\t\t}\n\t\tif (!writeToDstFile(fileData, false, false)) {\n\t\t\tlogger.debug(\"FILE_WRITE_ERR on {}\", filename);\n\t\t\treturn 
FilePublishStatus.FILE_WRITE_ERR;\n\t\t}\n\n\t\tlogger.trace(\"SUCCESS in processing file metric for {}\", filename);\n\t\treturn FilePublishStatus.SUCCESS;\n\t}\n\n\t/*\n\t * Process supplied multipart metrics\n\t */\n\tprivate FilePublishStatus processFileMetricMultipart(Metric metric) {\n\t\tMetaData metaData = metric.getMetaData();\n\n\t\tif (metaData.getSeq() == 0) {\n\t\t\tfilename = formDstFilename(metaData.getFileName());\n\t\t\texpectedMd5 = metaData.getMd5();\n\t\t\tnumberOfFileChunks = metaData.getSize();\n\t\t\tlastSeqNumProcessed = 0;\n\t\t\tretryCnt = 0;\n\t\t\tlogger.debug(\"Processing multipart file metrics :: Sequence Number: 0, Total number of file chunks: {}\",\n\t\t\t\t\tnumberOfFileChunks);\n\t\t\treturn FilePublishStatus.CONTINUE;\n\t\t}\n\n\t\tlogger.debug(\"Processing multipart file metrics :: Sequence Number: {}, Total number of file chunks: {}\",\n\t\t\t\tmetaData.getSeq(), numberOfFileChunks);\n\n\t\tif (!REPLACE_EXISTING_FILE) {\n\t\t\tint extInd = metaData.getFileName().lastIndexOf('.');\n\t\t\tString metaDataFilenameNoExt =\n\t\t\t\t\textInd > 0 ? 
metaData.getFileName().substring(0, extInd) : metaData.getFileName();\n\t\t\tif (!filename.startsWith(metaDataFilenameNoExt)) {\n\t\t\t\treturn FilePublishStatus.INVALID_METRICS;\n\t\t\t}\n\t\t} else {\n\t\t\tif (!filename.equals(metaData.getFileName())) {\n\t\t\t\treturn FilePublishStatus.INVALID_METRICS;\n\t\t\t}\n\t\t}\n\n\t\tif (metaData.getSeq() == lastSeqNumProcessed) {\n\t\t\t// Duplicate chunk - allow a limited number of retries before giving up\n\t\t\tif (retryCnt < MAX_NUMBER_RETRIES) {\n\t\t\t\tretryCnt++;\n\t\t\t\treturn FilePublishStatus.CONTINUE;\n\t\t\t} else {\n\t\t\t\treturn FilePublishStatus.SEQ_NUM_ERR_ENGINE;\n\t\t\t}\n\t\t} else if (metaData.getSeq() != lastSeqNumProcessed + 1) {\n\t\t\treturn FilePublishStatus.SEQ_NUM_ERR_ENGINE;\n\t\t}\n\n\t\torg.eclipse.tahu.message.model.File fileValue = (org.eclipse.tahu.message.model.File) metric.getValue();\n\t\tbyte[] fileData = fileValue.getBytes();\n\t\tif (!isValidMd5Sum(fileData, metaData.getMd5())) {\n\t\t\treturn FilePublishStatus.PARTIAL_MD5_ERR;\n\t\t}\n\t\tif (!writeToDstFile(fileData, true, metaData.getSeq() > 1)) {\n\t\t\treturn FilePublishStatus.FILE_WRITE_ERR;\n\t\t}\n\t\tFilePublishStatus filePublishStatus = null;\n\t\tif (metaData.getSeq() == numberOfFileChunks) {\n\t\t\tFile partialDstFile = new File(formAbsoluteDstFilePath(true));\n\t\t\tString fullDstFilePath = formAbsoluteDstFilePath(false);\n\t\t\tif (isValidMd5Sum(partialDstFile, expectedMd5)) {\n\t\t\t\tFile dstFile = new File(fullDstFilePath);\n\t\t\t\tif (partialDstFile.renameTo(dstFile)) {\n\t\t\t\t\tfilePublishStatus = FilePublishStatus.SUCCESS;\n\t\t\t\t} else {\n\t\t\t\t\tfilePublishStatus = FilePublishStatus.RENAME_ERR;\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tlogger.error(\"MD5 sum error after reassembly of the {} file\", fullDstFilePath);\n\t\t\t\tfilePublishStatus = FilePublishStatus.MD5_ERR;\n\t\t\t}\n\t\t} else if (metaData.getSeq() < numberOfFileChunks) {\n\t\t\tfilePublishStatus = FilePublishStatus.CONTINUE;\n\t\t} else {\n\t\t\tfilePublishStatus = 
FilePublishStatus.SEQ_NUM_ERR_ENGINE;\n\t\t}\n\t\tlastSeqNumProcessed = metaData.getSeq();\n\t\treturn filePublishStatus;\n\t}\n\n\t/*\n\t * Publishes ACK command\n\t */\n\tprivate boolean publishAckCommand(Metric metric, FilePublishStatus filePublishStatus) {\n\n\t\tlong seqNo = metric.getMetaData().getSeq();\n\t\tboolean ret = false;\n\t\tString cmdTopic = deviceId != null\n\t\t\t\t? new Topic(SparkplugExample.NAMESPACE, edgeNode.getGroupName(), edgeNode.getEdgeNodeName(), deviceId,\n\t\t\t\t\t\tMessageType.DCMD).toString()\n\t\t\t\t: new Topic(SparkplugExample.NAMESPACE, edgeNode.getGroupName(), edgeNode.getEdgeNodeName(),\n\t\t\t\t\t\tMessageType.NCMD).toString();\n\n\t\t// form payload\n\t\tSparkplugBPayload cmdPayload = new SparkplugBPayloadBuilder().setTimestamp(new Date()).createPayload();\n\n\t\ttry {\n\t\t\tPropertySet propertySet = new PropertySet();\n\t\t\tPropertySet fileMetricProperties = metric.getProperties();\n\t\t\tif (fileMetricProperties != null && fileMetricProperties.containsKey(TAG_PROVIDER_PROP_NAME)) {\n\t\t\t\tPropertyValue fileMetricProperty = fileMetricProperties.get(TAG_PROVIDER_PROP_NAME);\n\t\t\t\tpropertySet.put(TAG_PROVIDER_PROP_NAME, fileMetricProperty);\n\t\t\t}\n\t\t\tif (fileMetricProperties != null && fileMetricProperties.containsKey(TAG_FOLDER_PATH_PROP_NAME)) {\n\t\t\t\tPropertyValue fileMetricProperty = fileMetricProperties.get(TAG_FOLDER_PATH_PROP_NAME);\n\t\t\t\tpropertySet.put(TAG_FOLDER_PATH_PROP_NAME, fileMetricProperty);\n\t\t\t}\n\n\t\t\tMetric cmdMetricSeqNum =\n\t\t\t\t\tnew MetricBuilder(LAST_SEQ_NUM_PUBLISHED, MetricDataType.Int64, seqNo).createMetric();\n\t\t\tcmdMetricSeqNum.setProperties(propertySet);\n\n\t\t\tMetric cmdMetricStatusCode =\n\t\t\t\t\tnew MetricBuilder(PUBLISH_FILE_STATUS_CODE, MetricDataType.Int32, 
filePublishStatus.getCode())\n\t\t\t\t\t\t\t.createMetric();\n\t\t\tcmdMetricStatusCode.setProperties(propertySet);\n\n\t\t\tcmdPayload.addMetric(cmdMetricSeqNum);\n\t\t\tcmdPayload.addMetric(cmdMetricStatusCode);\n\t\t\tlogger.debug(\"Publishing file ACK to {}\", cmdTopic);\n\t\t\texecutor.execute(new Publisher(client, cmdTopic, cmdPayload, 0, false));\n\t\t\tret = true;\n\t\t} catch (SparkplugInvalidTypeException e) {\n\t\t\tlogger.error(\"Failed to publish ACK command\", e);\n\t\t}\n\t\treturn ret;\n\t}\n\n\t/*\n\t * Writes supplied data to the destination file\n\t */\n\tprivate boolean writeToDstFile(byte[] fileData, boolean isPartial, boolean append) {\n\t\tboolean ret = false;\n\n\t\t// Write the file to the file system\n\t\tString fullDstFilePath = formAbsoluteDstFilePath(formDstFolderPath(), isPartial);\n\t\tif (!append) {\n\t\t\tif (REPLACE_EXISTING_FILE) {\n\t\t\t\tFileUtils.deleteQuietly(new File(fullDstFilePath));\n\t\t\t}\n\t\t}\n\t\ttry {\n\t\t\tFileUtils.writeByteArrayToFile(new File(fullDstFilePath), fileData, append);\n\t\t\tret = true;\n\t\t} catch (IOException e) {\n\t\t\tlogger.error(\"Error writing file, {}, to file system\", fullDstFilePath, e);\n\t\t}\n\t\treturn ret;\n\t}\n\n\t/*\n\t * Forms a path to the destination folder \n\t */\n\tprivate String formDstFolderPath() {\n\t\tif (new File(FOLDER_PATH).mkdirs()) {\n\t\t\tlogger.trace(\"Created parent directories: {}\", FOLDER_PATH);\n\t\t}\n\t\treturn FOLDER_PATH;\n\t}\n\n\t/*\n\t * Forms destination filename from filename supplied by the file metrics based on the file storing policy\n\t */\n\tprivate String formDstFilename(String metaDataFilename) {\n\t\tif (REPLACE_EXISTING_FILE) {\n\t\t\treturn metaDataFilename;\n\t\t}\n\t\tString ret = null;\n\t\tint fileExtInd = metaDataFilename.lastIndexOf('.');\n\t\tString fileNameNoExt = fileExtInd > 0 ? metaDataFilename.substring(0, fileExtInd) : metaDataFilename;\n\t\tString fileExt = fileExtInd > 0 ? 
metaDataFilename.substring(fileExtInd + 1) : \"\";\n\n\t\tString folderPath = formDstFolderPath();\n\t\tint fileNumber = -1;\n\n\t\tfor (File f : new File(folderPath).listFiles((dir, fname) -> {\n\t\t\tint ind = fname.lastIndexOf('.');\n\t\t\tString ext = ind > 0 ? fname.substring(ind + 1) : \"\";\n\t\t\treturn fname.startsWith(fileNameNoExt) && fileExt.equals(ext);\n\t\t})) {\n\t\t\tint fn = getFileNumber(f.getName());\n\t\t\tif (fn > fileNumber) {\n\t\t\t\tfileNumber = fn;\n\t\t\t}\n\t\t}\n\n\t\tif (fileNumber > 0) {\n\t\t\tret = String.format(\"%s (%d).%s\", fileNameNoExt, fileNumber + 1, fileExt);\n\t\t} else if (fileNumber == 0) {\n\t\t\tret = String.format(\"%s (%d).%s\", fileNameNoExt, 1, fileExt);\n\t\t} else {\n\t\t\tret = metaDataFilename;\n\t\t}\n\t\treturn ret;\n\t}\n\n\t/*\n\t * Reports a number (i.e. copy number) for supplied file\n\t */\n\tprivate int getFileNumber(String fname) {\n\t\tint ret = 0;\n\t\tint ind1 = fname.lastIndexOf('(');\n\t\tint ind2 = fname.lastIndexOf(')');\n\t\tif (ind1 > 0 && ind2 > ind1) {\n\t\t\ttry {\n\t\t\t\tret = Integer.parseInt(fname.substring(ind1 + 1, ind2));\n\t\t\t} catch (NumberFormatException e) {\n\t\t\t\t// Parenthesized text is not a copy number - treat as 0\n\t\t\t}\n\t\t}\n\t\treturn ret;\n\t}\n\n\t/*\n\t * Forms an absolute path to the destination file\n\t */\n\tprivate String formAbsoluteDstFilePath(boolean isPartial) {\n\t\treturn formAbsoluteDstFilePath(formDstFolderPath(), isPartial);\n\t}\n\n\t/*\n\t * Forms an absolute path to the destination file for supplied folder path\n\t */\n\tprivate String formAbsoluteDstFilePath(String folderPath, boolean isPartial) {\n\t\treturn isPartial\n\t\t\t\t? 
new StringBuilder().append(folderPath).append(FILE_SEPARATOR).append(filename).append(\".part\")\n\t\t\t\t\t\t.toString()\n\t\t\t\t: new StringBuilder().append(folderPath).append(FILE_SEPARATOR).append(filename).toString();\n\t}\n\n\t/*\n\t * Reports if MD5 sum of supplied buffer containing file data matches expected one \n\t */\n\tprivate boolean isValidMd5Sum(byte[] fileData, String expectedMd5Sum) {\n\t\tboolean ret = true;\n\t\tif (expectedMd5Sum != null && !expectedMd5Sum.equalsIgnoreCase(FileValidationUtils.calculateMd5Sum(fileData))) {\n\t\t\tlogger.error(\"Invalid MD5 sum\");\n\t\t\tret = false;\n\t\t}\n\t\treturn ret;\n\t}\n\n\t/*\n\t * Reports if MD5 sum of supplied file matches expected one \n\t */\n\tprivate boolean isValidMd5Sum(File file, String expectedMd5Sum) {\n\t\tboolean ret = true;\n\t\tif (expectedMd5Sum != null) {\n\t\t\tString calculatedMd5Sum = FileValidationUtils.calculateMd5Sum(file);\n\t\t\tif (!expectedMd5Sum.equalsIgnoreCase(calculatedMd5Sum)) {\n\t\t\t\tlogger.error(\"Invalid MD5 sum: filename={}. Calculated MD5 Sum: {}. Expected MD5 Sum: {}\",\n\t\t\t\t\t\tfile.getAbsolutePath(), calculatedMd5Sum, expectedMd5Sum);\n\t\t\t\tret = false;\n\t\t\t} else {\n\t\t\t\tlogger.debug(\"MD5 sum match :: Calculated MD5 Sum: {}. Expected MD5 Sum: {}\", calculatedMd5Sum,\n\t\t\t\t\t\texpectedMd5Sum);\n\t\t\t}\n\t\t}\n\t\treturn ret;\n\t}\n\n\t/*\n\t * Checks if supplied file metric is valid\n\t */\n\tprivate boolean isValidMetric(Metric metric) {\n\t\tif (metric == null) {\n\t\t\tlogger.error(\"Invalid 'File' metric: null\");\n\t\t\treturn false;\n\t\t}\n\t\tMetaData metaData = metric.getMetaData();\n\t\tif (metaData == null || metaData.isMultiPart() == null || metaData.getFileName() == null\n\t\t\t\t|| metaData.getSeq() < 0 || metaData.getSize() < 0) {\n\t\t\tlogger.error(\"Invalid 'File' metric: {}\", metric);\n\t\t\treturn false;\n\t\t}\n\t\tlogger.trace(\"Valid file metric for {}\", metaData.getFileName());\n\t\treturn true;\n\t}\n}\n"
  },
  {
    "path": "java/examples/host_file/src/main/java/org/eclipse/tahu/example/host/file/Publisher.java",
    "content": "/********************************************************************************\n * Copyright (c) 2020-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.example.host.file;\n\nimport java.util.Date;\n\nimport org.eclipse.paho.client.mqttv3.MqttClient;\nimport org.eclipse.paho.client.mqttv3.MqttException;\nimport org.eclipse.paho.client.mqttv3.MqttPersistenceException;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\n\npublic class Publisher implements Runnable {\n\n\tprivate final MqttClient client;\n\tprivate final String topic;\n\tprivate final byte[] bytePayload;\n\tprivate final SparkplugBPayload sparkplugPayload;\n\tprivate final int qos;\n\tprivate final boolean retained;\n\n\tpublic Publisher(MqttClient client, String topic, byte[] bytePayload, int qos, boolean retained) {\n\t\tthis.client = client;\n\t\tthis.topic = topic;\n\t\tthis.bytePayload = bytePayload;\n\t\tthis.sparkplugPayload = null;\n\t\tthis.qos = qos;\n\t\tthis.retained = retained;\n\t}\n\n\tpublic Publisher(MqttClient client, String topic, SparkplugBPayload sparkplugPayload, int qos, boolean retained) {\n\t\tthis.client = client;\n\t\tthis.topic = topic;\n\t\tthis.bytePayload = null;\n\t\tthis.sparkplugPayload = sparkplugPayload;\n\t\tthis.qos = qos;\n\t\tthis.retained = retained;\n\t}\n\n\tpublic void run() {\n\t\ttry {\n\t\t\tif (bytePayload != null) {\n\t\t\t\tclient.publish(topic, bytePayload, qos, retained);\n\t\t\t} else if (sparkplugPayload != null) {\n\t\t\t\tsparkplugPayload.setTimestamp(new 
Date());\n\t\t\t\tSparkplugBPayloadEncoder encoder = new SparkplugBPayloadEncoder();\n\t\t\t\tclient.publish(topic, encoder.getBytes(sparkplugPayload, false), qos, retained);\n\t\t\t} else {\n\t\t\t\tclient.publish(topic, null, 0, false);\n\t\t\t}\n\t\t} catch (MqttPersistenceException e) {\n\t\t\te.printStackTrace();\n\t\t} catch (MqttException e) {\n\t\t\te.printStackTrace();\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/examples/host_file/src/main/java/org/eclipse/tahu/example/host/file/SparkplugExample.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.example.host.file;\n\nimport java.util.Date;\nimport java.util.Map;\nimport java.util.Timer;\nimport java.util.TimerTask;\nimport java.util.concurrent.ConcurrentHashMap;\nimport java.util.concurrent.ExecutorService;\nimport java.util.concurrent.Executors;\n\nimport javax.net.SocketFactory;\nimport javax.net.ssl.SSLSocketFactory;\n\nimport org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;\nimport org.eclipse.paho.client.mqttv3.MqttCallbackExtended;\nimport org.eclipse.paho.client.mqttv3.MqttClient;\nimport org.eclipse.paho.client.mqttv3.MqttConnectOptions;\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.tahu.SparkplugParsingException;\nimport org.eclipse.tahu.example.host.file.model.EdgeNode;\nimport org.eclipse.tahu.example.host.file.model.FilePublishStatus;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.MessageType;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.message.model.Topic;\nimport org.eclipse.tahu.util.TopicUtil;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\n/**\n 
* An example Sparkplug B application.\n */\npublic class SparkplugExample implements MqttCallbackExtended {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(SparkplugExample.class.getName());\n\n\tpublic static final String NAMESPACE = \"spBv1.0\";\n\tprivate static final String HOST_NAMESPACE = \"STATE\";\n\n\t// Configuration\n\tprivate static final boolean USING_REAL_TLS = false;\n\tprivate String serverUrl = \"tcp://localhost:1883\";\n\tprivate String primaryHostId = \"IamHost\";\n\tprivate String clientId = \"HostFileExample\";\n\tprivate String username = \"admin\";\n\tprivate String password = \"changeme\";\n\tprivate ExecutorService executor;\n\tprivate MqttClient client;\n\n\tprivate final Map<EdgeNodeDescriptor, EdgeNode> edgeNodeMap;\n\tprivate final Map<EdgeNodeDescriptor, Timer> rebirthTimers;\n\tprivate final Map<String, FileAssembler> fileAssemblers;\n\n\tpublic static void main(String[] args) {\n\t\tSparkplugExample example = new SparkplugExample();\n\t\texample.run();\n\t}\n\n\tpublic SparkplugExample() {\n\t\tedgeNodeMap = new ConcurrentHashMap<>();\n\t\trebirthTimers = new ConcurrentHashMap<>();\n\t\tfileAssemblers = new ConcurrentHashMap<>();\n\t}\n\n\tpublic void run() {\n\t\ttry {\n\t\t\t// Thread pool for outgoing published messages\n\t\t\texecutor = Executors.newFixedThreadPool(1);\n\n\t\t\t// Build up Host Will payload\n\t\t\tbyte[] willPayload = \"OFFLINE\".getBytes();\n\n\t\t\tMqttConnectOptions options = new MqttConnectOptions();\n\n\t\t\tif (USING_REAL_TLS) {\n\t\t\t\tSocketFactory sf = SSLSocketFactory.getDefault();\n\t\t\t\toptions.setSocketFactory(sf);\n\t\t\t}\n\n\t\t\t// Connect to the MQTT Server\n\t\t\toptions.setAutomaticReconnect(true);\n\t\t\toptions.setCleanSession(true);\n\t\t\toptions.setConnectionTimeout(30);\n\t\t\toptions.setKeepAliveInterval(30);\n\t\t\toptions.setUserName(username);\n\t\t\toptions.setPassword(password.toCharArray());\n\t\t\tif (primaryHostId != null && !primaryHostId.isEmpty()) 
{\n\t\t\t\toptions.setWill(HOST_NAMESPACE + \"/\" + primaryHostId, willPayload, 1, true);\n\t\t\t}\n\t\t\tclient = new MqttClient(serverUrl, clientId);\n\t\t\tclient.setTimeToWait(2000); // short timeout on failure to connect\n\t\t\tclient.setCallback(this);\n\t\t\tclient.connect(options);\n\n\t\t\t// Subscribe to all Sparkplug messages in the namespace along with the primary Host Application STATE topic\n\t\t\tclient.subscribe(NAMESPACE + \"/#\", 0);\n\t\t\tif (primaryHostId != null && !primaryHostId.isEmpty()) {\n\t\t\t\tclient.subscribe(HOST_NAMESPACE + \"/\" + primaryHostId, 0);\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\tprivate void publishHostBirth() {\n\t\ttry {\n\t\t\tif (primaryHostId != null && !primaryHostId.isEmpty()) {\n\t\t\t\tlogger.info(\"Publishing Host Birth\");\n\t\t\t\texecutor.execute(\n\t\t\t\t\t\tnew Publisher(client, HOST_NAMESPACE + \"/\" + primaryHostId, \"ONLINE\".getBytes(), 1, true));\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\t@Override\n\tpublic void connectComplete(boolean reconnect, String serverURI) {\n\t\tlogger.info(\"Connected! - publishing birth\");\n\t\tpublishHostBirth();\n\t}\n\n\tpublic void connectionLost(Throwable cause) {\n\t\tcause.printStackTrace();\n\t\tlogger.info(\"The MQTT Connection was lost! 
- will auto-reconnect\");\n\t}\n\n\tpublic void messageArrived(String stringTopic, MqttMessage message) throws Exception {\n\t\tif (stringTopic != null && stringTopic.startsWith(NAMESPACE)) {\n\t\t\t// Get the topic tokens\n\t\t\tString[] sparkplugTokens = stringTopic.split(\"/\");\n\n\t\t\t// Parse the Topic\n\t\t\tTopic topic;\n\t\t\ttry {\n\t\t\t\ttopic = TopicUtil.parseTopic(sparkplugTokens);\n\t\t\t} catch (SparkplugParsingException e) {\n\t\t\t\tlogger.error(\"Error parsing topic\", e);\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\tif (topic.isType(MessageType.NCMD) || topic.isType(MessageType.DCMD)) {\n\t\t\t\tlogger.trace(\"Ignoring CMD message\");\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\t// Get the payload\n\t\t\tSparkplugBPayloadDecoder decoder = new SparkplugBPayloadDecoder();\n\t\t\tSparkplugBPayload inboundPayload = decoder.buildFromByteArray(message.getPayload(), null);\n\n\t\t\t// Get the EdgeNodeDescriptor\n\t\t\tEdgeNodeDescriptor edgeNodeDescriptor = new EdgeNodeDescriptor(topic.getGroupId(), topic.getEdgeNodeId());\n\n\t\t\t// Special case for NBIRTH\n\t\t\tEdgeNode edgeNode = edgeNodeMap.get(edgeNodeDescriptor);\n\t\t\tif (topic.getType().equals(MessageType.NBIRTH)) {\n\t\t\t\tedgeNode = new EdgeNode(topic.getGroupId(), topic.getEdgeNodeId());\n\t\t\t\tedgeNodeMap.put(edgeNodeDescriptor, edgeNode);\n\t\t\t}\n\n\t\t\t// Failed to handle the message\n\t\t\tif (edgeNode == null) {\n\t\t\t\tlogger.warn(\"Unexpected message on topic {} - requesting Rebirth\", topic);\n\t\t\t\trequestRebirth(edgeNodeDescriptor);\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\t// Check the sequence number\n\t\t\tif (handleSeqNumberCheck(edgeNode, inboundPayload.getSeq())) {\n\t\t\t\tlogger.info(\"Validated sequence number on topic: {}\", topic);\n\n\t\t\t\t// Iterate over the metrics looking only for file metrics\n\t\t\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\t\t\tif (MetricDataType.File.equals(metric.getDataType())) {\n\t\t\t\t\t\thandleFileMetric(edgeNode, topic.getDeviceId(), 
metric);\n\t\t\t\t\t} else {\n\t\t\t\t\t\tlogger.debug(\"Ignoring non-file metric: {}\", metric.getName());\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tlogger.error(\"Failed sequence number check for {}/{}\", topic.getGroupId(), topic.getEdgeNodeId());\n\t\t\t}\n\t\t} else if (stringTopic != null && stringTopic.startsWith(HOST_NAMESPACE)) {\n\t\t\tif (\"OFFLINE\".equals(new String(message.getPayload()))) {\n\t\t\t\tlogger.warn(\"The MQTT Server incorrectly reported the primary host is offline - correcting\");\n\t\t\t\tpublishHostBirth();\n\t\t\t}\n\t\t} else {\n\t\t\tlogger.debug(\"Ignoring non-Sparkplug messages\");\n\t\t}\n\t}\n\n\tpublic void deliveryComplete(IMqttDeliveryToken token) {\n\t\tlogger.info(\"Published message: \" + token);\n\t}\n\n\tprivate boolean handleSeqNumberCheck(EdgeNode edgeNode, long incomingSeqNum) {\n\t\t// Get the last stored sequence number\n\t\tLong storedSeqNum = edgeNode.getLastSeqNumber();\n\t\t// Conditionally wrap to 0\n\t\tlong expectedSeqNum = storedSeqNum + 1 == 256 ? 
0 : storedSeqNum + 1;\n\t\t// Check if current sequence number is valid\n\t\tif (incomingSeqNum != expectedSeqNum) {\n\t\t\t// Sequence number is INVALID, set Edge Node offline\n\t\t\tedgeNode.setOnline(false);\n\t\t\t// Request a rebirth\n\t\t\trequestRebirth(edgeNode.getEdgeNodeId());\n\t\t\treturn false;\n\t\t} else {\n\t\t\tedgeNode.setLastSeqNumber(incomingSeqNum);\n\t\t\treturn true;\n\t\t}\n\t}\n\n\tprivate void requestRebirth(EdgeNodeDescriptor edgeNodeDescriptor) {\n\t\ttry {\n\t\t\tTimer rebirthDelayTimer = rebirthTimers.get(edgeNodeDescriptor);\n\t\t\tif (rebirthDelayTimer == null) {\n\t\t\t\tlogger.info(\"Requesting Rebirth from {}\", edgeNodeDescriptor);\n\t\t\t\trebirthDelayTimer = new Timer();\n\t\t\t\trebirthTimers.put(edgeNodeDescriptor, rebirthDelayTimer);\n\t\t\t\trebirthDelayTimer.schedule(new RebirthDelayTask(edgeNodeDescriptor), 5000);\n\n\t\t\t\tEdgeNode edgeNode = edgeNodeMap.get(edgeNodeDescriptor);\n\t\t\t\tif (edgeNode != null) {\n\t\t\t\t\t// Set the Edge Node offline\n\t\t\t\t\tedgeNode.setOnline(false);\n\t\t\t\t}\n\n\t\t\t\t// Request a device rebirth\n\t\t\t\tString rebirthTopic = new Topic(NAMESPACE, edgeNodeDescriptor.getGroupId(),\n\t\t\t\t\t\tedgeNodeDescriptor.getEdgeNodeId(), MessageType.NCMD).toString();\n\t\t\t\tSparkplugBPayload rebirthPayload = new SparkplugBPayloadBuilder().setTimestamp(new Date())\n\t\t\t\t\t\t.addMetric(\n\t\t\t\t\t\t\t\tnew MetricBuilder(\"Node Control/Rebirth\", MetricDataType.Boolean, true).createMetric())\n\t\t\t\t\t\t.createPayload();\n\n\t\t\t\texecutor.execute(new Publisher(client, rebirthTopic, rebirthPayload, 0, false));\n\t\t\t} else {\n\t\t\t\tlogger.debug(\"Not requesting Rebirth since we have already requested one in the last 5 seconds\");\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to create Rebirth request\", e);\n\t\t\treturn;\n\t\t}\n\t}\n\n\tprivate class RebirthDelayTask extends TimerTask {\n\t\tprivate EdgeNodeDescriptor edgeNodeDescriptor;\n\n\t\tpublic 
RebirthDelayTask(EdgeNodeDescriptor edgeNodeDescriptor) {\n\t\t\tthis.edgeNodeDescriptor = edgeNodeDescriptor;\n\t\t}\n\n\t\tpublic void run() {\n\t\t\tif (rebirthTimers.get(edgeNodeDescriptor) != null) {\n\t\t\t\trebirthTimers.get(edgeNodeDescriptor).cancel();\n\t\t\t\trebirthTimers.remove(edgeNodeDescriptor);\n\t\t\t}\n\t\t}\n\t}\n\n\tprivate void handleFileMetric(EdgeNode edgeNode, String deviceName, Metric metric) {\n\n\t\tString fileAssemblerName = null;\n\t\tif (deviceName == null || deviceName.trim().isEmpty()) {\n\t\t\tfileAssemblerName = new StringBuilder().append(edgeNode.getEdgeNodeId().getDescriptorString()).append(\"/\")\n\t\t\t\t\t.append(metric.getName()).toString();\n\t\t} else {\n\t\t\tfileAssemblerName = new StringBuilder().append(edgeNode.getEdgeNodeId().getDescriptorString()).append(\"/\")\n\t\t\t\t\t.append(deviceName).append(\"/\").append(metric.getName()).toString();\n\t\t}\n\t\tFileAssembler fileAssembler = fileAssemblers.containsKey(fileAssemblerName)\n\t\t\t\t? fileAssemblers.get(fileAssemblerName)\n\t\t\t\t: new FileAssembler(executor, client, fileAssemblerName, edgeNode);\n\t\thandleFileMetric(fileAssembler, metric);\n\t}\n\n\t/*\n\t * Handles supplied metrics for the file assembler \n\t */\n\tprivate void handleFileMetric(FileAssembler fileAssembler, Metric metric) {\n\t\tFilePublishStatus filePublishStatus = fileAssembler.processMetric(metric);\n\t\tif (filePublishStatus == FilePublishStatus.CONTINUE) {\n\t\t\tif (!fileAssemblers.containsKey(fileAssembler.getName())) {\n\t\t\t\tfileAssemblers.put(fileAssembler.getName(), fileAssembler);\n\t\t\t}\n\t\t} else {\n\t\t\tif (fileAssemblers.containsKey(fileAssembler.getName())) {\n\t\t\t\tfileAssemblers.remove(fileAssembler.getName());\n\t\t\t}\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/examples/host_file/src/main/java/org/eclipse/tahu/example/host/file/model/EdgeNode.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.example.host.file.model;\n\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\n\npublic class EdgeNode {\n\n\tprivate final String groupName;\n\tprivate final String edgeNodeName;\n\tprivate final EdgeNodeDescriptor edgeNodeDescriptor;\n\n\tprivate boolean online;\n\tprivate long lastSeqNumber;\n\n\tpublic EdgeNode(String groupName, String edgeNodeName) {\n\t\tthis.groupName = groupName;\n\t\tthis.edgeNodeName = edgeNodeName;\n\t\tthis.edgeNodeDescriptor = new EdgeNodeDescriptor(groupName, edgeNodeName);\n\t\tthis.online = false;\n\t\tthis.lastSeqNumber = 255;\n\t}\n\n\tpublic String getGroupName() {\n\t\treturn groupName;\n\t}\n\n\tpublic String getEdgeNodeName() {\n\t\treturn edgeNodeName;\n\t}\n\n\tpublic EdgeNodeDescriptor getEdgeNodeId() {\n\t\treturn edgeNodeDescriptor;\n\t}\n\n\tpublic boolean isOnline() {\n\t\treturn online;\n\t}\n\n\tpublic void setOnline(boolean online) {\n\t\tthis.online = online;\n\t}\n\n\tpublic long getLastSeqNumber() {\n\t\treturn lastSeqNumber;\n\t}\n\n\tpublic void setLastSeqNumber(long lastSeqNumber) {\n\t\tthis.lastSeqNumber = lastSeqNumber;\n\t}\n\n\t@Override\n\tpublic int hashCode() {\n\t\tfinal int prime = 31;\n\t\tint result = 1;\n\t\tresult = prime * result + ((edgeNodeDescriptor == null) ? 
0 : edgeNodeDescriptor.hashCode());\n\t\treturn result;\n\t}\n\n\t@Override\n\tpublic boolean equals(Object obj) {\n\t\tif (this == obj)\n\t\t\treturn true;\n\t\tif (obj == null)\n\t\t\treturn false;\n\t\tif (getClass() != obj.getClass())\n\t\t\treturn false;\n\t\tEdgeNode other = (EdgeNode) obj;\n\t\tif (edgeNodeDescriptor == null) {\n\t\t\tif (other.edgeNodeDescriptor != null)\n\t\t\t\treturn false;\n\t\t} else if (!edgeNodeDescriptor.equals(other.edgeNodeDescriptor))\n\t\t\treturn false;\n\t\treturn true;\n\t}\n}\n"
  },
  {
    "path": "java/examples/host_file/src/main/java/org/eclipse/tahu/example/host/file/model/FilePublishStatus.java",
    "content": "/********************************************************************************\n * Copyright (c) 2020-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.example.host.file.model;\n\n/**\n * Defines File Publish status\n */\npublic enum FilePublishStatus {\n\n\tNOT_SET(0, \"Not Set\"),\n\n\t// 1xx Informational\n\tCONTINUE(100, \"Continue File Transfer\"),\n\tIN_PROGRESS(101, \"File Transfer In Progress\"),\n\tTERMINATED(102, \"File Transfer Terminated\"),\n\n\t// 2xx Success\n\tSUCCESS(200, \"Success\"),\n\n\t// 4xx Transmission Side Error\n\tPUBLISH_FAILED(400, \"File Transfer Failed\"),\n\tSEQ_NUM_ERR_TRANSMISSION(401, \"Sequence Number Error (Transmission)\"),\n\tSPARKPLUG_PARSING_ERR(402, \"Sparkplug Parsing Error\"),\n\tRESOURCE_NOT_FOUND(404, \"Resource Not Found\"),\n\tTCLIENT_NOT_CONNECTED(405, \"Transmission Client Is Not Connected\"),\n\tREQUEST_TOUT(408, \"Request Timeout\"),\n\n\t// 5xx Engine Side Error\n\tSEQ_NUM_ERR_ENGINE(500, \"Sequence Number Error (Engine)\"),\n\tINVALID_METRICS(501, \"Invalid Metrics\"),\n\n\tMD5_ERR(502, \"File MD5 Sum Error\"),\n\tPARTIAL_MD5_ERR(503, \"Message MD5 Sum Error\"),\n\tFILE_WRITE_ERR(504, \"Error Writing File\"),\n\tRENAME_ERR(505, \"Error Renaming Partial File\");\n\n\tprivate int code;\n\tprivate String description;\n\n\t/**\n\t * FilePublishStatus constructor\n\t * \n\t * @param code - status code as {@link int}\n\t * @param description - status code description as {@link String}\n\t */\n\tprivate FilePublishStatus(int code, String description) {\n\t\tthis.code = code;\n\t\tthis.description = 
description;\n\t}\n\n\t/**\n\t * Reports status code\n\t * \n\t * @return status code as {@link int}\n\t */\n\tpublic int getCode() {\n\t\treturn code;\n\t}\n\n\t/**\n\t * Returns status description\n\t * \n\t * @return status description as {@link String}\n\t */\n\tpublic String getDescription() {\n\t\treturn description;\n\t}\n\n\t/**\n\t * Returns an instance of the {@link FilePublishStatus} per supplied status code\n\t * \n\t * @param statusCode - status code as {@link int}\n\t * @return an instance of {@link FilePublishStatus}\n\t */\n\tpublic static FilePublishStatus getInstance(int statusCode) {\n\t\tFilePublishStatus ret = null;\n\t\tfor (FilePublishStatus filePublishStatus : FilePublishStatus.values()) {\n\t\t\tif (filePublishStatus.getCode() == statusCode) {\n\t\t\t\tret = filePublishStatus;\n\t\t\t\tbreak;\n\t\t\t}\n\t\t}\n\t\tif (ret == null) {\n\t\t\tret = FilePublishStatus.NOT_SET;\n\t\t}\n\t\treturn ret;\n\t}\n}\n"
  },
  {
    "path": "java/examples/host_file/src/main/java/org/eclipse/tahu/example/host/file/util/FileValidationUtils.java",
    "content": "/********************************************************************************\n * Copyright (c) 2020-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.example.host.file.util;\n\nimport java.io.File;\nimport java.io.FileInputStream;\nimport java.io.InputStream;\nimport java.security.MessageDigest;\n\nimport javax.xml.bind.DatatypeConverter;\n\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\n/**\n * Defines file utilities\n */\npublic class FileValidationUtils {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(FileValidationUtils.class.getName());\n\n\t/**\n\t * Default FileValidationUtils constructor\n\t */\n\tprivate FileValidationUtils() {\n\t\t// no-op\n\t}\n\n\t/**\n\t * Calculates MD5 sum of supplied byte array\n\t * \n\t * @param bytes - data buffer as {@link byte[]}\n\t * @return MD5 sum as {@link String}\n\t */\n\tpublic static String calculateMd5Sum(byte[] bytes) {\n\t\tString hashString = null;\n\t\ttry {\n\t\t\tMessageDigest md = MessageDigest.getInstance(\"MD5\");\n\t\t\thashString = DatatypeConverter.printHexBinary(md.digest(bytes));\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Error checking MD5 sum\", e);\n\t\t}\n\t\treturn hashString != null ? 
hashString.toLowerCase() : null;\n\t}\n\n\t/**\n\t * Calculates MD5 sum of supplied file\n\t * \n\t * @param filename - file name as {@link String}\n\t * @return MD5 sum as {@link String}\n\t */\n\tpublic static String calculateMd5Sum(String filename) {\n\t\treturn calculateMd5Sum(new File(filename));\n\t}\n\n\t/**\n\t * Calculates MD5 sum of supplied file\n\t * \n\t * @param file - file object as {@link File}\n\t * @return MD5 sum as {@link String}\n\t */\n\tpublic static String calculateMd5Sum(File file) {\n\t\tString hashString = null;\n\t\tbyte[] buffer = new byte[1024];\n\t\ttry (InputStream fis = new FileInputStream(file)) {\n\t\t\tMessageDigest md = MessageDigest.getInstance(\"MD5\");\n\t\t\tint numRead = 0;\n\t\t\twhile (numRead != -1) {\n\t\t\t\tnumRead = fis.read(buffer);\n\t\t\t\tif (numRead > 0) {\n\t\t\t\t\tmd.update(buffer, 0, numRead);\n\t\t\t\t}\n\t\t\t}\n\t\t\thashString = DatatypeConverter.printHexBinary(md.digest()).toLowerCase();\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Error checking MD5 sum of a file: {}\", file, e);\n\t\t}\n\t\treturn hashString != null ? hashString.toLowerCase() : null;\n\t}\n}\n"
  },
  {
    "path": "java/examples/listener/THIRD-PARTY.txt",
    "content": "\nLists of 5 third-party dependencies.\n     (New BSD license) Protocol Buffer Java API (com.google.protobuf:protobuf-java:2.6.1 - https://developers.google.com/protocol-buffers/)\n     (The Apache Software License, Version 2.0) Apache Log4j (log4j:log4j:1.2.17 - http://logging.apache.org/log4j/1.2/)\n     (Eclipse Public License - Version 1.0) org.eclipse.paho.client.mqttv3 (org.eclipse.paho:org.eclipse.paho.client.mqttv3:1.0.2 - http://www.eclipse.org/paho/org.eclipse.paho.client.mqttv3)\n     (MIT License) SLF4J API Module (org.slf4j:slf4j-api:1.7.5 - http://www.slf4j.org)\n     (MIT License) SLF4J LOG4J-12 Binding (org.slf4j:slf4j-log4j12:1.7.5 - http://www.slf4j.org)\n"
  },
  {
    "path": "java/examples/listener/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu-examples</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>example_listener</artifactId>\n  <packaging>jar</packaging>\n  <name>Sparkplug B Listener Example</name>\n\n  <url>http://maven.apache.org</url>\n  <dependencies>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-core</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n  </dependencies>\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          <skipNexusStagingDeployMojo>true</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n\n      <!-- New Build mechanism - replaces maven-assembly-plugin -->\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-shade-plugin</artifactId>\n        <version>2.4.1</version>\n        <executions>\n          <execution>\n            <phase>package</phase>\n            <goals>\n             
 <goal>shade</goal>\n            </goals>\n            <configuration>\n              <createDependencyReducedPom>false</createDependencyReducedPom>\n              <filters>\n                <filter>\n                  <artifact>*:*</artifact>\n                  <excludes>\n                    <exclude>META-INF/*.SF</exclude>\n                    <exclude>META-INF/*.DSA</exclude>\n                    <exclude>META-INF/*.RSA</exclude>\n                  </excludes>\n                </filter>\n              </filters>\n              <transformers>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ManifestResourceTransformer\">\n                  <mainClass>org.eclipse.tahu.SparkplugListener</mainClass>\n                </transformer>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ServicesResourceTransformer\" />\n              </transformers>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            <id>add-third-party</id>\n            <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n            <configuration>\n              <useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups> (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n                  Version 2.0</licenseMerge>\n              
</licenseMerges>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n    </plugins>\n  </build>\n</project>\n"
  },
  {
    "path": "java/examples/listener/src/main/java/org/eclipse/tahu/SparkplugListener.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu;\n\nimport org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;\nimport org.eclipse.paho.client.mqttv3.MqttCallbackExtended;\nimport org.eclipse.paho.client.mqttv3.MqttClient;\nimport org.eclipse.paho.client.mqttv3.MqttConnectOptions;\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.Topic;\nimport org.eclipse.tahu.util.TopicUtil;\n\nimport com.fasterxml.jackson.annotation.JsonInclude.Include;\nimport com.fasterxml.jackson.databind.ObjectMapper;\n\npublic class SparkplugListener implements MqttCallbackExtended {\n\n\t// Configuration\n\tprivate String serverUrl = \"tcp://localhost:1883\";\n\tprivate String clientId = \"SparkplugBListenerEdgeNode\";\n\tprivate String username = \"admin\";\n\tprivate String password = \"changeme\";\n\tprivate MqttClient client;\n\n\tpublic static void main(String[] args) {\n\t\tSparkplugListener listener = new SparkplugListener();\n\t\tlistener.run();\n\t}\n\n\tpublic void run() {\n\t\ttry {\n\t\t\t// Connect to the MQTT Server\n\t\t\tMqttConnectOptions options = new 
MqttConnectOptions();\n\t\t\toptions.setAutomaticReconnect(true);\n\t\t\toptions.setCleanSession(true);\n\t\t\toptions.setConnectionTimeout(30);\n\t\t\toptions.setKeepAliveInterval(30);\n\t\t\toptions.setUserName(username);\n\t\t\toptions.setPassword(password.toCharArray());\n\t\t\tclient = new MqttClient(serverUrl, clientId);\n\t\t\tclient.setTimeToWait(5000); // short timeout on failure to connect\n\t\t\tclient.setCallback(this); // set the callback before connecting so no events are missed\n\t\t\tclient.connect(options);\n\n\t\t\t// Just listen to all Sparkplug B messages on spBv1.0 topics and wait for inbound messages\n\t\t\tclient.subscribe(\"spBv1.0/#\", 0);\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\t@Override\n\tpublic void connectComplete(boolean reconnect, String serverURI) {\n\t\tSystem.out.println(\"Connected!\");\n\t}\n\n\t@Override\n\tpublic void connectionLost(Throwable cause) {\n\t\tSystem.out.println(\"The MQTT Connection was lost! - will auto-reconnect\");\n\t}\n\n\t@Override\n\tpublic void messageArrived(String topic, MqttMessage message) throws Exception {\n\t\tTopic sparkplugTopic = TopicUtil.parseTopic(topic);\n\t\tObjectMapper mapper = new ObjectMapper();\n\t\tmapper.setSerializationInclusion(Include.NON_NULL);\n\n\t\tSystem.out.println(\"Message Arrived on Sparkplug topic \" + sparkplugTopic.toString());\n\n\t\tSparkplugBPayloadDecoder decoder = new SparkplugBPayloadDecoder();\n\t\tSparkplugBPayload inboundPayload = decoder.buildFromByteArray(message.getPayload(), null);\n\n\t\t// Convert the message to JSON and print to System.out\n\t\ttry {\n\t\t\tSystem.out.println(mapper.writerWithDefaultPrettyPrinter().writeValueAsString(inboundPayload));\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\t@Override\n\tpublic void deliveryComplete(IMqttDeliveryToken token) {\n\t\tSystem.out.println(\"Published message: \" + token);\n\t}\n}\n"
  },
  {
    "path": "java/examples/listener/src/main/resources/log4j.properties",
    "content": "#/********************************************************************************\n# * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n# *\n# * This program and the accompanying materials are made available under the\n# * terms of the Eclipse Public License 2.0 which is available at\n# * http://www.eclipse.org/legal/epl-2.0.\n# *\n# * SPDX-License-Identifier: EPL-2.0\n# *\n# * Contributors:\n# *   Cirrus Link Solutions - initial implementation\n# ********************************************************************************/\n \n# Root logger option\nlog4j.rootLogger=ERROR, stdout\n\n# Direct log messages to stdout\nlog4j.appender.stdout=org.apache.log4j.ConsoleAppender\nlog4j.appender.stdout.Target=System.out\nlog4j.appender.stdout.layout=org.apache.log4j.PatternLayout\nlog4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n"
  },
  {
    "path": "java/examples/pom.xml",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<!--/********************************************************************************\n * Copyright (c) 2014-2020 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <properties>\n    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>\n    <main.basedir>${project.basedir}</main.basedir>\n    <paho.version>1.2.5</paho.version>\n    <jackson.version>2.13.4</jackson.version>\n    <jackson.databind.version>2.13.4.2</jackson.databind.version>\n    <logback.version>1.2.9</logback.version>\n    <protobuf.version>3.16.3</protobuf.version>\n    <slf4j.version>1.7.32</slf4j.version>\n  </properties>\n\n  <groupId>org.eclipse.tahu</groupId>\n  <artifactId>tahu-examples</artifactId>\n  <version>1.0.7</version>\n  <packaging>pom</packaging>\n\n  <name>Eclipse Tahu</name>\n  <url>http://www.eclipse.org/tahu</url>\n  <description>\n    The Tahu project provides open-source implementations of Eclipse Sparkplug\n  </description>\n\n  <organization>\n    <name>Eclipse Tahu</name>\n    <url>http://www.eclipse.org/tahu</url>\n  </organization>\n\n  <developers>\n    <developer>\n      <id>ibinshtok</id>\n      <name>Ilya Binshtok</name>\n      <organization>Cirrus Link Solutions</organization>\n      <organizationUrl>http://www.cirrus-link.com</organizationUrl>\n      <roles>\n        
<role>Developer</role>\n      </roles>\n    </developer>\n    <developer>\n      <id>nathandavenport</id>\n      <name>Nathan Davenport</name>\n      <organization>Cirrus Link Solutions</organization>\n      <organizationUrl>http://www.cirrus-link.com</organizationUrl>\n      <roles>\n        <role>Developer</role>\n      </roles>\n    </developer>\n    <developer>\n      <id>wes-johnson</id>\n      <name>Wes Johnson</name>\n      <organization>Cirrus Link Solutions</organization>\n      <organizationUrl>http://www.cirrus-link.com</organizationUrl>\n      <roles>\n        <role>Developer</role>\n      </roles>\n    </developer>\n    <developer>\n      <id>ckienle</id>\n      <name>Chad Kienle</name>\n      <organization>Cirrus Link Solutions</organization>\n      <organizationUrl>http://www.cirrus-link.com</organizationUrl>\n      <roles>\n        <role>Developer</role>\n      </roles>\n    </developer>\n  </developers>\n  <licenses>\n    <license>\n      <name>Eclipse Public License - Version 2.0</name>\n      <url>https://www.eclipse.org/legal/epl-2.0</url>\n    </license>\n  </licenses>\n\n  <scm>\n    <url>https://github.com/eclipse/tahu.git</url>\n    <connection>scm:git:git@github.com:eclipse/tahu.git</connection>\n  </scm>\n\n  <distributionManagement>\n    <snapshotRepository>\n      <id>ossrh</id>\n      <url>https://oss.sonatype.org/content/repositories/snapshots</url>\n    </snapshotRepository>\n    <repository>\n      <id>ossrh</id>\n      <url>https://oss.sonatype.org/service/local/staging/deploy/maven2/</url>\n    </repository>\n  </distributionManagement>\n\n  <modules>\n    <module>records/pom.xml</module>\n    <module>listener/pom.xml</module>\n    <module>edge_node_control/pom.xml</module>\n    <module>simple/pom.xml</module>\n    <module>device_timestamp/pom.xml</module>\n    <module>udt/pom.xml</module>\n<!--\n    <module>raspberry_pi/pom.xml</module>\n-->\n    <module>host_file/pom.xml</module>\n  </modules>\n\n  <dependencies>\n    
<dependency>\n      <groupId>junit</groupId>\n      <artifactId>junit</artifactId>\n      <version>4.13.2</version>\n      <scope>test</scope>\n    </dependency>\n    <dependency>\n      <groupId>org.testng</groupId>\n      <artifactId>testng</artifactId>\n      <version>6.9.10</version>\n      <scope>test</scope>\n    </dependency>\n    <dependency>\n      <groupId>org.assertj</groupId>\n      <artifactId>assertj-core</artifactId>\n      <version>3.5.1</version>\n      <scope>test</scope>\n    </dependency>\n    <dependency>\n      <groupId>org.eclipse.paho</groupId>\n      <artifactId>org.eclipse.paho.client.mqttv3</artifactId>\n      <version>${paho.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>com.google.protobuf</groupId>\n      <artifactId>protobuf-java</artifactId>\n      <version>${protobuf.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>commons-io</groupId>\n      <artifactId>commons-io</artifactId>\n      <version>2.11.0</version>\n    </dependency>\n    <dependency>\n      <groupId>com.fasterxml.jackson.core</groupId>\n      <artifactId>jackson-core</artifactId>\n      <version>${jackson.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>com.fasterxml.jackson.core</groupId>\n      <artifactId>jackson-annotations</artifactId>\n      <version>${jackson.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>com.fasterxml.jackson.core</groupId>\n      <artifactId>jackson-databind</artifactId>\n      <version>${jackson.databind.version}</version>\n    </dependency>\n\n    <!-- Logging -->\n    <dependency>\n      <groupId>org.slf4j</groupId>\n      <artifactId>slf4j-api</artifactId>\n      <version>${slf4j.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>ch.qos.logback</groupId>\n      <artifactId>logback-classic</artifactId>\n      <version>${logback.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>ch.qos.logback</groupId>\n   
   <artifactId>logback-core</artifactId>\n      <version>${logback.version}</version>\n    </dependency>\n  </dependencies>\n\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-compiler-plugin</artifactId>\n        <version>3.7.0</version>\n        <configuration>\n          <source>1.8</source>\n          <target>1.8</target>\n          <encoding>UTF-8</encoding>\n        </configuration>\n      </plugin>\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-gpg-plugin</artifactId>\n        <version>1.6</version>\n        <executions>\n          <execution>\n            <id>sign-artifacts</id>\n            <phase>deploy</phase>\n            <goals>\n              <goal>sign</goal>\n            </goals>\n            <configuration>\n                <gpgArguments>\n                    <arg>--pinentry-mode</arg>\n                    <arg>loopback</arg>\n                </gpgArguments>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <!--\n          Use the Nexus Staging plugin as a full replacement for the standard\n          Maven Deploy plugin.\n          See https://github.com/sonatype/nexus-maven-plugins/tree/master/staging/maven-plugin\n          for why this makes sense.\n          We can control whether we want to deploy to the Eclipse repo or Maven Central\n          by a combination of the version being a SNAPSHOT or release version and the property\n          skipStaging=true/false.\n          In any case we can take advantage of the plugin's \"deferred deploy\" feature, which\n          makes sure that all artifacts of a multi-module project are deployed as a whole\n          at the end of the build process instead of deploying each module's artifacts\n          individually as part of building the module.\n        -->\n        <groupId>org.sonatype.plugins</groupId>\n        
<artifactId>nexus-staging-maven-plugin</artifactId>\n        <version>1.6.8</version>\n        <extensions>true</extensions>\n        <configuration>\n          <serverId>ossrh</serverId>\n          <nexusUrl>https://oss.sonatype.org/</nexusUrl>\n          <autoReleaseAfterClose>true</autoReleaseAfterClose>\n          <skipNexusStagingDeployMojo>false</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n    </plugins>\n    <extensions>\n      <extension>\n        <groupId>org.kuali.maven.wagons</groupId>\n        <artifactId>maven-s3-wagon</artifactId>\n        <version>1.2.1</version>\n      </extension>\n    </extensions>\n  </build>\n</project>\n"
  },
  {
    "path": "java/examples/raspberry_pi/THIRD-PARTY.txt",
    "content": "\nLists of 5 third-party dependencies.\n     (New BSD license) Protocol Buffer Java API (com.google.protobuf:protobuf-java:2.6.1 - https://developers.google.com/protocol-buffers/)\n     (The Apache Software License, Version 2.0) Apache Log4j (log4j:log4j:1.2.17 - http://logging.apache.org/log4j/1.2/)\n     (Eclipse Public License - Version 1.0) org.eclipse.paho.client.mqttv3 (org.eclipse.paho:org.eclipse.paho.client.mqttv3:1.0.2 - http://www.eclipse.org/paho/org.eclipse.paho.client.mqttv3)\n     (MIT License) SLF4J API Module (org.slf4j:slf4j-api:1.7.5 - http://www.slf4j.org)\n     (MIT License) SLF4J LOG4J-12 Binding (org.slf4j:slf4j-log4j12:1.7.5 - http://www.slf4j.org)\n"
  },
  {
    "path": "java/examples/raspberry_pi/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu-examples</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>example_raspberry_pi</artifactId>\n  <packaging>jar</packaging>\n  <name>Sparkplug B Raspberry Pi Example</name>\n\n  <repositories>\n  \t<repository>\n  \t  <id>Kura Releases</id>\n  \t  <name>Kura Repository - Releases</name>\n  \t  <url>https://repo.eclipse.org/content/repositories/kura-releases/</url>\n  \t</repository>\n  </repositories>\n  <dependencies>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-core</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>org.eclipse.kura</groupId>\n      <artifactId>jdk.dio</artifactId>\n      <version>1.0.400</version>\n    </dependency>\n    <dependency>\n      <groupId>javax.xml.bind</groupId>\n      <artifactId>jaxb-api</artifactId>\n      <version>2.3.0</version>\n    </dependency>\n  </dependencies>\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        
<artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          <skipNexusStagingDeployMojo>true</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n\n      <!-- New Build mechanism - replaces maven-assembly-plugin -->\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-shade-plugin</artifactId>\n        <version>2.4.1</version>\n        <executions>\n          <execution>\n            <phase>package</phase>\n            <goals>\n              <goal>shade</goal>\n            </goals>\n            <configuration>\n              <filters>\n                <filter>\n                  <artifact>*:*</artifact>\n                  <excludes>\n                    <exclude>META-INF/*.SF</exclude>\n                    <exclude>META-INF/*.DSA</exclude>\n                    <exclude>META-INF/*.RSA</exclude>\n                  </excludes>\n                </filter>\n              </filters>\n              <transformers>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ManifestResourceTransformer\">\n                  <mainClass>org.eclipse.tahu.SparkplugRaspberryPiExample</mainClass>\n                </transformer>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ServicesResourceTransformer\" />\n              </transformers>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            <id>add-third-party</id>\n            <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n            <configuration>\n              
<useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups> (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n                  Version 2.0</licenseMerge>\n              </licenseMerges>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n    </plugins>\n  </build>\n</project>\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/SparkplugRaspberryPiExample.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu;\n\nimport java.net.Inet4Address;\nimport java.net.Inet6Address;\nimport java.net.InetAddress;\nimport java.net.NetworkInterface;\nimport java.util.Date;\nimport java.util.Enumeration;\nimport java.util.concurrent.ExecutorService;\nimport java.util.concurrent.Executors;\n\nimport javax.net.SocketFactory;\nimport javax.net.ssl.SSLSocketFactory;\n\nimport org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;\nimport org.eclipse.paho.client.mqttv3.MqttCallbackExtended;\nimport org.eclipse.paho.client.mqttv3.MqttClient;\nimport org.eclipse.paho.client.mqttv3.MqttConnectOptions;\nimport org.eclipse.paho.client.mqttv3.MqttException;\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.paho.client.mqttv3.MqttPersistenceException;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.pi.dio.DioException;\nimport org.eclipse.tahu.pi.dio.DioLibrary;\nimport org.eclipse.tahu.pi.system.SystemInfo;\nimport org.eclipse.tahu.pibrella.Pibrella;\nimport org.eclipse.tahu.pibrella.PibrellaInputPin;\nimport 
org.eclipse.tahu.pibrella.PibrellaInputPins;\nimport org.eclipse.tahu.pibrella.PibrellaLED;\nimport org.eclipse.tahu.pibrella.PibrellaLEDs;\nimport org.eclipse.tahu.pibrella.PibrellaOutputPin;\nimport org.eclipse.tahu.pibrella.PibrellaOutputPins;\nimport org.eclipse.tahu.pibrella.PibrellaPins;\n\nimport jdk.dio.gpio.PinEvent;\nimport jdk.dio.gpio.PinListener;\n\n/**\n * An example Sparkplug B application.\n */\npublic class SparkplugRaspberryPiExample implements MqttCallbackExtended {\n\n\tprivate Pibrella pibrella = Pibrella.getInstance();\n\n\tprivate static final String NAMESPACE = \"spBv1.0\";\n\n\tprivate static final String BUTTON_CNT_SETPOINT_METRICS_NAME = \"button count setpoint\";\n\tprivate static final String DFLT_MQTT_SERVER_HOST_NAME = \"192.168.1.53\";\n\tprivate static final int DFLT_MQTT_PORT = 1883;\n\n\t// Configuration\n\tprivate static final boolean USING_REAL_TLS = false;\n\tprivate static String mqttServerHostName;\n\tprivate static int mqttServerPort;\n\n\tprivate String groupId = \"Sparkplug B Devices\";\n\tprivate String edgeNode = \"Java Raspberry Pi Example\";\n\tprivate String deviceId = \"Pibrella\";\n\tprivate String clientId = \"SparkplugRaspberryPiExampleEdgeNode\";\n\tprivate String username = \"admin\";\n\tprivate String password = \"changeme\";\n\tprivate ExecutorService executor;\n\tprivate MqttClient client;\n\n\t// Some control and parameter points for this demo\n\tprivate int configChangeCount = 1;\n\tprivate int scanRateMs = 1000;\n\tprivate long upTimeStart = System.currentTimeMillis();\n\tprivate int buttonCounter = 0;\n\tprivate int buttonCounterSetpoint = 10;\n\n\tprivate long bdSeq = 0;\n\tprivate long seq = 0;\n\n\tprivate Object lock = new Object();\n\n\tpublic static void main(String[] args) {\n\n\t\tparseCommandLineArguments(args);\n\n\t\tRuntime.getRuntime().addShutdownHook(new Thread() {\n\t\t\t@Override\n\t\t\tpublic void run() {\n\t\t\t\tshutdownPibrella();\n\t\t\t}\n\t\t});\n\t\ttry {\n\t\t\tDioLibrary diolib = 
DioLibrary.getInstance();\n\t\t\tdiolib.setDioLibrary();\n\t\t\tdiolib.setJavaLibraryPath();\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\n\t\tSparkplugRaspberryPiExample example = new SparkplugRaspberryPiExample();\n\t\texample.run();\n\t}\n\n\tprivate static String formServerUrl(String mqttServerhostname, int port) {\n\t\tStringBuilder sb = new StringBuilder(\"tcp://\");\n\t\tif (mqttServerhostname != null) {\n\t\t\tsb.append(mqttServerhostname);\n\t\t} else {\n\t\t\tsb.append(DFLT_MQTT_SERVER_HOST_NAME);\n\t\t}\n\t\tsb.append(':');\n\t\tif (port > 0) {\n\t\t\tsb.append(port);\n\t\t} else {\n\t\t\tsb.append(DFLT_MQTT_PORT);\n\t\t}\n\t\treturn sb.toString();\n\t}\n\n\tpublic void run() {\n\t\ttry {\n\t\t\t// Thread pool for outgoing published messages\n\t\t\texecutor = Executors.newFixedThreadPool(1);\n\n\t\t\t// Establish the session with autoreconnect = true;\n\t\t\testablishMqttSession();\n\n\t\t\t// Create the Raspberry Pi Pibrella board listeners\n\t\t\tcreatePibrellaListeners();\n\n\t\t\t// Wait for 'ctrl c' to exit\n\t\t\twhile (true) {\n\t\t\t\t//\n\t\t\t\t// This is a very simple loop for the demo that keeps the MQTT Session\n\t\t\t\t// up, and publishes the Up Time metric based on the current value of\n\t\t\t\t// the scanRateMs process variable.\n\t\t\t\t//\n\t\t\t\tif (client.isConnected()) {\n\t\t\t\t\tsynchronized (lock) {\n\t\t\t\t\t\tSparkplugBPayload payload = new SparkplugBPayloadBuilder(getNextSeqNum())\n\t\t\t\t\t\t\t\t.setTimestamp(new Date()).addMetric(new MetricBuilder(\"Up Time ms\",\n\t\t\t\t\t\t\t\t\t\tMetricDataType.Int64, System.currentTimeMillis() - upTimeStart).createMetric())\n\t\t\t\t\t\t\t\t.createPayload();\n\n\t\t\t\t\t\t// Publish current Up Time\n\t\t\t\t\t\texecutor.execute(new Publisher(NAMESPACE + \"/\" + groupId + \"/NDATA/\" + edgeNode, payload));\n\t\t\t\t\t}\n\t\t\t\t} else {\n\t\t\t\t\tSystem.out.println(\"Connection is not established - not sending 
data\");\n\t\t\t\t}\n\t\t\t\tThread.sleep(scanRateMs);\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\t/**\n\t * Establish an MQTT Session with Sparkplug defined Death Certificate. It may not be Immediately intuitive that the\n\t * Death Certificate is created prior to publishing the Birth Certificate, but the Death Certificate is actually\n\t * part of the MQTT Session establishment. For complete details of the actual MQTT wire protocol refer to the latest\n\t * OASyS MQTT V3.1.1 standards at: http://docs.oasis-open.org/mqtt/mqtt/v3.1.1/mqtt-v3.1.1.html\n\t * \n\t * @return true = MQTT Session Established\n\t */\n\tpublic boolean establishMqttSession() {\n\t\ttry {\n\n\t\t\t//\n\t\t\t// Setup the MQTT connection parameters using the Paho MQTT Client.\n\t\t\t//\n\t\t\tMqttConnectOptions options = new MqttConnectOptions();\n\n\t\t\tif (USING_REAL_TLS) {\n\t\t\t\tSocketFactory sf = SSLSocketFactory.getDefault();\n\t\t\t\toptions.setSocketFactory(sf);\n\t\t\t}\n\n\t\t\t// Autoreconnect enable\n\t\t\toptions.setAutomaticReconnect(true);\n\t\t\t// MQTT session parameters Clean Start = true\n\t\t\toptions.setCleanSession(true);\n\t\t\t// Session connection attempt timeout period in seconds\n\t\t\toptions.setConnectionTimeout(10);\n\t\t\t// MQTT session parameter Keep Alive Period in Seconds\n\t\t\toptions.setKeepAliveInterval(30);\n\t\t\t// MQTT Client Username\n\t\t\toptions.setUserName(username);\n\t\t\t// MQTT Client Password\n\t\t\toptions.setPassword(password.toCharArray());\n\t\t\t//\n\t\t\t// Build up the Death Certificate MQTT Payload. 
Note that the Death\n\t\t\t// Certificate payload sequence number\n\t\t\t// is not tied to the normal message sequence numbers.\n\t\t\t//\n\t\t\tSparkplugBPayload payload = new SparkplugBPayloadBuilder(getNextSeqNum()).setTimestamp(new Date())\n\t\t\t\t\t.addMetric(new MetricBuilder(\"bdSeq\", MetricDataType.Int64, bdSeq).createMetric()).createPayload();\n\t\t\tbyte[] bytes = new SparkplugBPayloadEncoder().getBytes(payload, false);\n\t\t\t//\n\t\t\t// Setup the Death Certificate Topic/Payload into the MQTT session\n\t\t\t// parameters\n\t\t\t//\n\t\t\toptions.setWill(NAMESPACE + \"/\" + groupId + \"/NDEATH/\" + edgeNode, bytes, 0, false);\n\n\t\t\t//\n\t\t\t// Create a new Paho MQTT Client\n\t\t\t//\n\t\t\tString serverUrl = formServerUrl(mqttServerHostName, mqttServerPort);\n\t\t\tclient = new MqttClient(serverUrl, clientId);\n\t\t\t// Register the callback before connecting so that the initial\n\t\t\t// connectComplete() notification (which publishes the birth) is not missed\n\t\t\tclient.setCallback(this);\n\t\t\t//\n\t\t\t// Using the parameters set above, try to connect to the defined MQTT\n\t\t\t// server now.\n\t\t\t//\n\t\t\tSystem.out.println(\"Trying to establish an MQTT Session to the MQTT Server @ \" + serverUrl);\n\t\t\tclient.connect(options);\n\t\t\tSystem.out.println(\"MQTT Session Established\");\n\t\t\t//\n\t\t\t// With a successful MQTT Session in place, now issue subscriptions\n\t\t\t// for the EoN Node and Device \"Command\" Topics of 'NCMD' and 'DCMD'\n\t\t\t// defined in Sparkplug\n\t\t\t//\n\t\t\tclient.subscribe(NAMESPACE + \"/\" + groupId + \"/NCMD/\" + edgeNode + \"/#\", 0);\n\t\t\tclient.subscribe(NAMESPACE + \"/\" + groupId + \"/DCMD/\" + edgeNode + \"/#\", 0);\n\t\t} catch (Exception e) {\n\t\t\tSystem.out.println(\"Error Establishing an MQTT Session:\");\n\t\t\te.printStackTrace();\n\t\t\treturn false;\n\t\t}\n\t\treturn true;\n\t}\n\n\t/**\n\t * Publish the EoN Node Birth Certificate and the Device Birth Certificate per the Sparkplug Specification\n\t */\n\tpublic void publishBirth() {\n\t\ttry {\n\t\t\tsynchronized (lock) {\n\t\t\t\t// Since this is a birth - reset the seq 
number\n\t\t\t\t// Note that message sequence numbers will appear in\n\t\t\t\t// the \"Node Metrics\" folder in Ignition.\n\t\t\t\tseq = 0;\n\n\t\t\t\t//\n\t\t\t\t// Create the NBIRTH Certificate per the Sparkplug\n\t\t\t\t// specification\n\t\t\t\t//\n\n\t\t\t\t//\n\t\t\t\t// Create the EoN Node BIRTH payload with any number of\n\t\t\t\t// read/write properties for this node. These parameters will\n\t\t\t\t// appear in\n\t\t\t\t// folders under this Node in the Ignition tag structure.\n\t\t\t\t//\n\t\t\t\tSparkplugBPayloadBuilder payloadBuilder = new SparkplugBPayloadBuilder(getNextSeqNum())\n\t\t\t\t\t\t.setTimestamp(new Date())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"bdSeq\", MetricDataType.Int64, bdSeq).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"Up Time ms\", MetricDataType.Int64,\n\t\t\t\t\t\t\t\tSystem.currentTimeMillis() - upTimeStart).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"Node Control/Next Server\", MetricDataType.Boolean, false)\n\t\t\t\t\t\t\t\t.createMetric())\n\t\t\t\t\t\t.addMetric(\n\t\t\t\t\t\t\t\tnew MetricBuilder(\"Node Control/Rebirth\", MetricDataType.Boolean, false).createMetric())\n\t\t\t\t\t\t.addMetric(\n\t\t\t\t\t\t\t\tnew MetricBuilder(\"Node Control/Reboot\", MetricDataType.Boolean, false).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"Node Control/Scan Rate ms\", MetricDataType.Int32, scanRateMs)\n\t\t\t\t\t\t\t\t.createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"Properties/Board Model\", MetricDataType.String,\n\t\t\t\t\t\t\t\tSystemInfo.getInstance().getModel()).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"Properties/Board Manufacturer\", MetricDataType.String,\n\t\t\t\t\t\t\t\tSystemInfo.getInstance().getManufacturer()).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"Properties/Hardware\", MetricDataType.String,\n\t\t\t\t\t\t\t\tSystemInfo.getInstance().getHardware()).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"Properties/OS FW 
Build\", MetricDataType.String,\n\t\t\t\t\t\t\t\tSystemInfo.getInstance().getOsFirmwareBuild()).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"Config Change Count\", MetricDataType.Int32, configChangeCount)\n\t\t\t\t\t\t\t\t.createMetric());\n\n\t\t\t\t// Increment the bdSeq number for the next use\n\t\t\t\tincrementBdSeqNum();\n\n\t\t\t\ttry {\n\t\t\t\t\t// Add the Raspberry Pi's real network addresses\n\t\t\t\t\tEnumeration<NetworkInterface> e = NetworkInterface.getNetworkInterfaces();\n\t\t\t\t\twhile (e.hasMoreElements()) {\n\t\t\t\t\t\tNetworkInterface n = e.nextElement();\n\t\t\t\t\t\tEnumeration<InetAddress> ee = n.getInetAddresses();\n\t\t\t\t\t\twhile (ee.hasMoreElements()) {\n\t\t\t\t\t\t\tInetAddress i = ee.nextElement();\n\t\t\t\t\t\t\tif (i instanceof Inet4Address) {\n\t\t\t\t\t\t\t\tpayloadBuilder.addMetric(\n\t\t\t\t\t\t\t\t\t\tnew MetricBuilder(\"Properties/IP Addresses/\" + n.getName() + \"/\" + \"IPV4\",\n\t\t\t\t\t\t\t\t\t\t\t\tMetricDataType.String, i.getHostAddress()).createMetric());\n\t\t\t\t\t\t\t} else if (i instanceof Inet6Address) {\n\t\t\t\t\t\t\t\tpayloadBuilder.addMetric(new MetricBuilder(\n\t\t\t\t\t\t\t\t\t\t\"Properties/IP Addresses/\" + n.getName() + \"/\" + \"IPV6\", MetricDataType.String,\n\t\t\t\t\t\t\t\t\t\ti.getHostAddress().substring(0, i.getHostAddress().indexOf('%')))\n\t\t\t\t\t\t\t\t\t\t\t\t.createMetric());\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t} catch (Exception e) {\n\t\t\t\t\te.printStackTrace();\n\t\t\t\t}\n\n\t\t\t\t//\n\t\t\t\t// Now publish the EoN Node Birth Certificate.\n\t\t\t\t// Note that the required \"Sequence Number\" metric 'seq' needs\n\t\t\t\t// to\n\t\t\t\t// be RESET TO A VALUE OF ZERO for the message. 
The 'timestamp'\n\t\t\t\t// metric\n\t\t\t\t// is added into the payload by the Publisher() thread.\n\t\t\t\t//\n\t\t\t\texecutor.execute(new Publisher(NAMESPACE + \"/\" + groupId + \"/NBIRTH/\" + edgeNode,\n\t\t\t\t\t\tpayloadBuilder.createPayload()));\n\n\t\t\t\t//\n\t\t\t\t// Create the Device BIRTH Certificate now. The tags defined\n\t\t\t\t// here will appear in a\n\t\t\t\t// folder hierarchy under the associated Device.\n\t\t\t\t//\n\t\t\t\tSparkplugBPayload payload = new SparkplugBPayloadBuilder(getNextSeqNum()).setTimestamp(new Date())\n\t\t\t\t\t\t// Create an \"Inputs\" folder of process variables\n\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaInputPins.A.getPin().getDescription(),\n\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getInput(PibrellaInputPins.A).isHigh()).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaInputPins.B.getPin().getDescription(),\n\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getInput(PibrellaInputPins.B).isHigh()).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaInputPins.C.getPin().getDescription(),\n\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getInput(PibrellaInputPins.C).isHigh()).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaInputPins.D.getPin().getDescription(),\n\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getInput(PibrellaInputPins.D).isHigh()).createMetric())\n\t\t\t\t\t\t// Create an \"Outputs\" folder of process variables\n\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaOutputPins.E.getPin().getDescription(),\n\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getOutput(PibrellaOutputPins.E).isHigh())\n\t\t\t\t\t\t\t\t\t\t.createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaOutputPins.F.getPin().getDescription(),\n\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getOutput(PibrellaOutputPins.F).isHigh())\n\t\t\t\t\t\t\t\t\t\t.createMetric())\n\t\t\t\t\t\t.addMetric(new 
MetricBuilder(PibrellaOutputPins.G.getPin().getDescription(),\n\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getOutput(PibrellaOutputPins.G).isHigh())\n\t\t\t\t\t\t\t\t\t\t.createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaOutputPins.H.getPin().getDescription(),\n\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getOutput(PibrellaOutputPins.H).isHigh())\n\t\t\t\t\t\t\t\t\t\t.createMetric())\n\t\t\t\t\t\t// Create an additional folder under \"Outputs\" called \"LEDs\"\n\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaLEDs.GREEN.getPin().getDescription(),\n\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getLED(PibrellaLEDs.GREEN).isOn()).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaLEDs.RED.getPin().getDescription(), MetricDataType.Boolean,\n\t\t\t\t\t\t\t\tpibrella.getLED(PibrellaLEDs.RED).isOn()).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaLEDs.YELLOW.getPin().getDescription(),\n\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getLED(PibrellaLEDs.YELLOW).isOn()).createMetric())\n\t\t\t\t\t\t// Place the button process variables at the root level of the\n\t\t\t\t\t\t// tag hierarchy\n\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaPins.BUTTON.getDescription(), MetricDataType.Boolean,\n\t\t\t\t\t\t\t\tpibrella.getButton().isPressed()).createMetric())\n\t\t\t\t\t\t.addMetric(\n\t\t\t\t\t\t\t\tnew MetricBuilder(\"button count\", MetricDataType.Int32, buttonCounter).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(BUTTON_CNT_SETPOINT_METRICS_NAME, MetricDataType.Int32,\n\t\t\t\t\t\t\t\tbuttonCounterSetpoint).createMetric())\n\t\t\t\t\t\t.addMetric(\n\t\t\t\t\t\t\t\tnew MetricBuilder(PibrellaPins.BUZZER.getDescription(), MetricDataType.Boolean, false)\n\t\t\t\t\t\t\t\t\t\t.createMetric())\n\t\t\t\t\t\t.createPayload();\n\n\t\t\t\t// Publish the Device BIRTH Certificate now\n\t\t\t\texecutor.execute(\n\t\t\t\t\t\tnew Publisher(NAMESPACE + \"/\" + groupId + \"/DBIRTH/\" + edgeNode + \"/\" + deviceId, 
payload));\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\t// Increment the bdSeq number, wrapping back to 0 after 255 (bdSeq is 0-255 per the Sparkplug spec)\n\tprivate void incrementBdSeqNum() {\n\t\tif (bdSeq == 255) {\n\t\t\tbdSeq = 0;\n\t\t} else {\n\t\t\tbdSeq++;\n\t\t}\n\t}\n\n\t// Return the current message sequence number and increment it, wrapping back to 0 after 255\n\tprivate long getNextSeqNum() {\n\t\tlong retSeq = seq;\n\t\tif (seq == 255) {\n\t\t\tseq = 0;\n\t\t} else {\n\t\t\tseq++;\n\t\t}\n\t\treturn retSeq;\n\t}\n\n\t@Override\n\tpublic void connectComplete(boolean reconnect, String serverURI) {\n\t\tSystem.out.println(\"Connected! - publishing birth\");\n\t\tpublishBirth();\n\t}\n\n\tpublic void connectionLost(Throwable cause) {\n\t\tSystem.out.println(\"The MQTT Connection was lost!\");\n\t}\n\n\t/**\n\t * Based on our subscriptions to the MQTT Server, the messageArrived() callback is called on all arriving MQTT\n\t * messages. Based on the Sparkplug Topic Namespace, each message is parsed and an appropriate action is taken.\n\t * \n\t */\n\tpublic void messageArrived(String topic, MqttMessage message) throws Exception {\n\t\tSystem.out.println(\"Message Arrived on topic \" + topic);\n\n\t\t// Initialize the outbound payload if required.\n\t\tSparkplugBPayloadBuilder outboundPayloadBuilder =\n\t\t\t\tnew SparkplugBPayloadBuilder(getNextSeqNum()).setTimestamp(new Date());\n\n\t\tString[] splitTopic = topic.split(\"/\");\n\t\tif (splitTopic[0].equals(NAMESPACE) && splitTopic[1].equals(groupId) && splitTopic[2].equals(\"NCMD\")\n\t\t\t\t&& splitTopic[3].equals(edgeNode)) {\n\n\t\t\tSparkplugBPayload inboundPayload =\n\t\t\t\t\tnew SparkplugBPayloadDecoder().buildFromByteArray(message.getPayload(), null);\n\n\t\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\t\tSystem.out.println(\"Metric: \" + metric.getName() + \" :: \" + metric.getValue());\n\n\t\t\t\tif (metric.getName().equals(\"Node Control/Next Server\")) {\n\t\t\t\t\tSystem.out.println(\"Received a Next Server command.\");\n\t\t\t\t} else if 
(metric.getName().equals(\"Node Control/Rebirth\")) {\n\t\t\t\t\tpublishBirth();\n\t\t\t\t} else if (metric.getName().equals(\"Node Control/Reboot\")) {\n\t\t\t\t\tSystem.out.println(\"Received a Reboot command.\");\n\t\t\t\t} else if (metric.getName().equals(\"Node Control/Scan Rate ms\")) {\n\t\t\t\t\tscanRateMs = (Integer) metric.getValue();\n\t\t\t\t\tif (scanRateMs < 100) {\n\t\t\t\t\t\t// Limit Scan Rate to a minimum of 100ms\n\t\t\t\t\t\tscanRateMs = 100;\n\t\t\t\t\t}\n\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"Node Control/Scan Rate ms\", MetricDataType.Int32, scanRateMs)\n\t\t\t\t\t\t\t\t\t.createMetric());\n\n\t\t\t\t\t// Publish the message in a new thread\n\t\t\t\t\tsynchronized (lock) {\n\t\t\t\t\t\texecutor.execute(new Publisher(NAMESPACE + \"/\" + groupId + \"/NDATA/\" + edgeNode,\n\t\t\t\t\t\t\t\toutboundPayloadBuilder.createPayload()));\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t} else if (splitTopic[0].equals(NAMESPACE) && splitTopic[1].equals(groupId) && splitTopic[2].equals(\"DCMD\")\n\t\t\t\t&& splitTopic[3].equals(edgeNode)) {\n\t\t\tsynchronized (lock) {\n\t\t\t\tSystem.out.println(\"Command received for device: \" + splitTopic[4] + \" on topic: \" + topic);\n\n\t\t\t\t// Get the incoming metric key and value\n\t\t\t\tSparkplugBPayload inboundPayload =\n\t\t\t\t\t\tnew SparkplugBPayloadDecoder().buildFromByteArray(message.getPayload(), null);\n\n\t\t\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\t\t\tSystem.out.println(\"Metric: \" + metric.getName() + \" :: \" + metric.getValue());\n\n\t\t\t\t\tif (metric.getName().equals(PibrellaOutputPins.E.getPin().getDescription())) {\n\t\t\t\t\t\tpibrella.getOutput(PibrellaOutputPins.E).setState((Boolean) metric.getValue());\n\t\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaOutputPins.E.getPin().getDescription(),\n\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, 
pibrella.getOutput(PibrellaOutputPins.E).isHigh())\n\t\t\t\t\t\t\t\t\t\t\t\t.createMetric());\n\t\t\t\t\t} else if (metric.getName().equals(PibrellaOutputPins.F.getPin().getDescription())) {\n\t\t\t\t\t\tpibrella.getOutput(PibrellaOutputPins.F).setState((Boolean) metric.getValue());\n\t\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaOutputPins.F.getPin().getDescription(),\n\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getOutput(PibrellaOutputPins.F).isHigh())\n\t\t\t\t\t\t\t\t\t\t\t\t.createMetric());\n\t\t\t\t\t} else if (metric.getName().equals(PibrellaOutputPins.G.getPin().getDescription())) {\n\t\t\t\t\t\tpibrella.getOutput(PibrellaOutputPins.G).setState((Boolean) metric.getValue());\n\t\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaOutputPins.G.getPin().getDescription(),\n\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getOutput(PibrellaOutputPins.G).isHigh())\n\t\t\t\t\t\t\t\t\t\t\t\t.createMetric());\n\t\t\t\t\t} else if (metric.getName().equals(PibrellaOutputPins.H.getPin().getDescription())) {\n\t\t\t\t\t\tpibrella.getOutput(PibrellaOutputPins.H).setState((Boolean) metric.getValue());\n\t\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaOutputPins.H.getPin().getDescription(),\n\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getOutput(PibrellaOutputPins.H).isHigh())\n\t\t\t\t\t\t\t\t\t\t\t\t.createMetric());\n\t\t\t\t\t} else if (metric.getName().equals(PibrellaLEDs.GREEN.getPin().getDescription())) {\n\t\t\t\t\t\tif (((Boolean) metric.getValue())) {\n\t\t\t\t\t\t\tpibrella.getLED(PibrellaLEDs.GREEN).turnOn();\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tpibrella.getLED(PibrellaLEDs.GREEN).turnOff();\n\t\t\t\t\t\t}\n\t\t\t\t\t\toutboundPayloadBuilder.addMetric(new MetricBuilder(PibrellaLEDs.GREEN.getPin().getDescription(),\n\t\t\t\t\t\t\t\tMetricDataType.Boolean, 
pibrella.getLED(PibrellaLEDs.GREEN).isOn()).createMetric());\n\t\t\t\t\t} else if (metric.getName().equals(PibrellaLEDs.RED.getPin().getDescription())) {\n\t\t\t\t\t\tif (((Boolean) metric.getValue())) {\n\t\t\t\t\t\t\tpibrella.getLED(PibrellaLEDs.RED).turnOn();\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tpibrella.getLED(PibrellaLEDs.RED).turnOff();\n\t\t\t\t\t\t}\n\t\t\t\t\t\toutboundPayloadBuilder.addMetric(new MetricBuilder(PibrellaLEDs.RED.getPin().getDescription(),\n\t\t\t\t\t\t\t\tMetricDataType.Boolean, pibrella.getLED(PibrellaLEDs.RED).isOn()).createMetric());\n\t\t\t\t\t} else if (metric.getName().equals(PibrellaLEDs.YELLOW.getPin().getDescription())) {\n\t\t\t\t\t\tif (((Boolean) metric.getValue())) {\n\t\t\t\t\t\t\tpibrella.getLED(PibrellaLEDs.YELLOW).turnOn();\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tpibrella.getLED(PibrellaLEDs.YELLOW).turnOff();\n\t\t\t\t\t\t}\n\t\t\t\t\t\toutboundPayloadBuilder.addMetric(\n\t\t\t\t\t\t\t\tnew MetricBuilder(PibrellaLEDs.YELLOW.getPin().getDescription(), MetricDataType.Boolean,\n\t\t\t\t\t\t\t\t\t\tpibrella.getLED(PibrellaLEDs.YELLOW).isOn()).createMetric());\n\t\t\t\t\t} else if (metric.getName().equals(BUTTON_CNT_SETPOINT_METRICS_NAME)) {\n\t\t\t\t\t\tbuttonCounterSetpoint = (Integer) metric.getValue();\n\t\t\t\t\t\toutboundPayloadBuilder.addMetric(new MetricBuilder(BUTTON_CNT_SETPOINT_METRICS_NAME,\n\t\t\t\t\t\t\t\tMetricDataType.Int32, buttonCounterSetpoint).createMetric());\n\t\t\t\t\t} else if (metric.getName().equals(PibrellaPins.BUZZER.getDescription())) {\n\t\t\t\t\t\tpibrella.getBuzzer().buzz(100, 2000);\n\t\t\t\t\t} else {\n\t\t\t\t\t\tSystem.out.println(\"Received unknown command for metric: \" + metric.getName());\n\t\t\t\t\t}\n\t\t\t\t}\n\n\t\t\t\t// Publish the message in a new thread\n\t\t\t\texecutor.execute(new Publisher(NAMESPACE + \"/\" + groupId + \"/DDATA/\" + edgeNode + \"/\" + deviceId,\n\t\t\t\t\t\toutboundPayloadBuilder.createPayload()));\n\t\t\t}\n\t\t}\n\t}\n\n\tpublic void 
deliveryComplete(IMqttDeliveryToken token) {\n\t\t// System.out.println(\"Published message: \" + token);\n\t}\n\n\tprivate class Publisher implements Runnable {\n\n\t\tprivate String topic;\n\t\tprivate SparkplugBPayload payload;\n\n\t\tpublic Publisher(String topic, SparkplugBPayload payload) {\n\t\t\tthis.topic = topic;\n\t\t\tthis.payload = payload;\n\t\t}\n\n\t\tpublic void run() {\n\t\t\ttry {\n\t\t\t\tbyte[] bytes = new SparkplugBPayloadEncoder().getBytes(payload, false);\n\t\t\t\tclient.publish(topic, bytes, 0, false);\n\t\t\t} catch (MqttPersistenceException e) {\n\t\t\t\te.printStackTrace();\n\t\t\t} catch (MqttException e) {\n\t\t\t\te.printStackTrace();\n\t\t\t} catch (Exception e) {\n\t\t\t\te.printStackTrace();\n\t\t\t}\n\t\t}\n\t}\n\n\tprivate void createPibrellaListeners() throws DioException {\n\t\ttry {\n\t\t\tpibrella.getButton().getGpioPin().setInputListener(new PinListener() {\n\t\t\t\t@Override\n\t\t\t\tpublic void valueChanged(PinEvent pinEvent) {\n\t\t\t\t\ttry {\n\t\t\t\t\t\tsynchronized (lock) {\n\t\t\t\t\t\t\tSparkplugBPayloadBuilder outboundPayloadBuilder =\n\t\t\t\t\t\t\t\t\tnew SparkplugBPayloadBuilder(getNextSeqNum()).setTimestamp(new Date());\n\t\t\t\t\t\t\tif (pinEvent.getValue()) {\n\t\t\t\t\t\t\t\toutboundPayloadBuilder.addMetric(new MetricBuilder(PibrellaPins.BUTTON.getDescription(),\n\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, true).createMetric());\n\t\t\t\t\t\t\t\tbuttonCounter++;\n\t\t\t\t\t\t\t\tif (buttonCounter > buttonCounterSetpoint) {\n\t\t\t\t\t\t\t\t\tbuttonCounter = 0;\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\toutboundPayloadBuilder.addMetric(\n\t\t\t\t\t\t\t\t\t\tnew MetricBuilder(\"button count\", MetricDataType.Int32, buttonCounter)\n\t\t\t\t\t\t\t\t\t\t\t\t.createMetric());\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\toutboundPayloadBuilder.addMetric(new MetricBuilder(PibrellaPins.BUTTON.getDescription(),\n\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, 
false).createMetric());\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\texecutor.execute(\n\t\t\t\t\t\t\t\t\tnew Publisher(NAMESPACE + \"/\" + groupId + \"/DDATA/\" + edgeNode + \"/\" + deviceId,\n\t\t\t\t\t\t\t\t\t\t\toutboundPayloadBuilder.createPayload()));\n\t\t\t\t\t\t}\n\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\te.printStackTrace();\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t});\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"failed to set InputListener for \" + PibrellaPins.BUTTON.getName(), e);\n\t\t}\n\n\t\ttry {\n\t\t\tpibrella.getInput(PibrellaInputPins.A).getGpioPin().setInputListener(new PinListener() {\n\t\t\t\tpublic void valueChanged(PinEvent pinEvent) {\n\t\t\t\t\ttry {\n\t\t\t\t\t\tsynchronized (lock) {\n\t\t\t\t\t\t\tSparkplugBPayloadBuilder outboundPayloadBuilder =\n\t\t\t\t\t\t\t\t\tnew SparkplugBPayloadBuilder(getNextSeqNum()).setTimestamp(new Date());\n\t\t\t\t\t\t\tif (pinEvent.getValue()) {\n\t\t\t\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaInputPins.A.getPin().getDescription(),\n\t\t\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, true).createMetric());\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaInputPins.A.getPin().getDescription(),\n\t\t\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, false).createMetric());\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\texecutor.execute(\n\t\t\t\t\t\t\t\t\tnew Publisher(NAMESPACE + \"/\" + groupId + \"/DDATA/\" + edgeNode + \"/\" + deviceId,\n\t\t\t\t\t\t\t\t\t\t\toutboundPayloadBuilder.createPayload()));\n\t\t\t\t\t\t}\n\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\te.printStackTrace();\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t});\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"failed to set InputListener for \" + PibrellaInputPins.A.getName(), e);\n\t\t}\n\n\t\ttry {\n\t\t\tpibrella.getInput(PibrellaInputPins.B).getGpioPin().setInputListener(new PinListener() {\n\t\t\t\tpublic void valueChanged(PinEvent 
pinEvent) {\n\t\t\t\t\ttry {\n\t\t\t\t\t\tsynchronized (lock) {\n\t\t\t\t\t\t\tSparkplugBPayloadBuilder outboundPayloadBuilder =\n\t\t\t\t\t\t\t\t\tnew SparkplugBPayloadBuilder(getNextSeqNum()).setTimestamp(new Date());\n\t\t\t\t\t\t\tif (pinEvent.getValue()) {\n\t\t\t\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaInputPins.B.getPin().getDescription(),\n\t\t\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, true).createMetric());\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaInputPins.B.getPin().getDescription(),\n\t\t\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, false).createMetric());\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\texecutor.execute(\n\t\t\t\t\t\t\t\t\tnew Publisher(NAMESPACE + \"/\" + groupId + \"/DDATA/\" + edgeNode + \"/\" + deviceId,\n\t\t\t\t\t\t\t\t\t\t\toutboundPayloadBuilder.createPayload()));\n\t\t\t\t\t\t}\n\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\te.printStackTrace();\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t});\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"failed to set InputListener for \" + PibrellaInputPins.B.getName(), e);\n\t\t}\n\n\t\ttry {\n\t\t\tpibrella.getInput(PibrellaInputPins.C).getGpioPin().setInputListener(new PinListener() {\n\t\t\t\tpublic void valueChanged(PinEvent pinEvent) {\n\t\t\t\t\ttry {\n\t\t\t\t\t\tsynchronized (lock) {\n\t\t\t\t\t\t\tSparkplugBPayloadBuilder outboundPayloadBuilder =\n\t\t\t\t\t\t\t\t\tnew SparkplugBPayloadBuilder(getNextSeqNum()).setTimestamp(new Date());\n\t\t\t\t\t\t\tif (pinEvent.getValue()) {\n\t\t\t\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaInputPins.C.getPin().getDescription(),\n\t\t\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, true).createMetric());\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t\t\t\t.addMetric(new 
MetricBuilder(PibrellaInputPins.C.getPin().getDescription(),\n\t\t\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, false).createMetric());\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\texecutor.execute(\n\t\t\t\t\t\t\t\t\tnew Publisher(NAMESPACE + \"/\" + groupId + \"/DDATA/\" + edgeNode + \"/\" + deviceId,\n\t\t\t\t\t\t\t\t\t\t\toutboundPayloadBuilder.createPayload()));\n\t\t\t\t\t\t}\n\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\te.printStackTrace();\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t});\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"failed to set InputListener for \" + PibrellaInputPins.C.getName(), e);\n\t\t}\n\n\t\ttry {\n\t\t\tpibrella.getInput(PibrellaInputPins.D).getGpioPin().setInputListener(new PinListener() {\n\t\t\t\tpublic void valueChanged(PinEvent pinEvent) {\n\t\t\t\t\ttry {\n\t\t\t\t\t\tsynchronized (lock) {\n\t\t\t\t\t\t\tSparkplugBPayloadBuilder outboundPayloadBuilder =\n\t\t\t\t\t\t\t\t\tnew SparkplugBPayloadBuilder(getNextSeqNum()).setTimestamp(new Date());\n\t\t\t\t\t\t\tif (pinEvent.getValue()) {\n\t\t\t\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaInputPins.D.getPin().getDescription(),\n\t\t\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, true).createMetric());\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\toutboundPayloadBuilder\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(PibrellaInputPins.D.getPin().getDescription(),\n\t\t\t\t\t\t\t\t\t\t\t\tMetricDataType.Boolean, false).createMetric());\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\texecutor.execute(\n\t\t\t\t\t\t\t\t\tnew Publisher(NAMESPACE + \"/\" + groupId + \"/DDATA/\" + edgeNode + \"/\" + deviceId,\n\t\t\t\t\t\t\t\t\t\t\toutboundPayloadBuilder.createPayload()));\n\t\t\t\t\t\t}\n\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\te.printStackTrace();\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t});\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"failed to set InputListener for \" + PibrellaInputPins.D.getName(), e);\n\t\t}\n\t}\n\n\tprivate static void shutdownPibrella() 
{\n\t\tSystem.out.println(\"Shutting down Sparkplug RaspberryPi Example ...\");\n\n\t\tSystem.out.println(\"Closing Pibrella LEDs\");\n\t\tPibrellaLED.closeAll();\n\n\t\tSystem.out.println(\"Closing Pibrella outputs\");\n\t\tPibrellaOutputPin.closeAll();\n\n\t\tSystem.out.println(\"Closing Pibrella inputs\");\n\t\tPibrellaInputPin.closeAll();\n\n\t\tSystem.out.println(\"Closing Pibrella button\");\n\t\ttry {\n\t\t\tPibrella.getInstance().getButton().close();\n\t\t} catch (DioException e1) {\n\t\t\tSystem.out.println(\"failed to close Pibrella button\");\n\t\t}\n\n\t\tSystem.out.println(\"Closing Pibrella buzzer\");\n\t\ttry {\n\t\t\tPibrella.getInstance().getBuzzer().close();\n\t\t} catch (Exception e) {\n\t\t\tSystem.out.println(\"failed to close Pibrella buzzer\");\n\t\t}\n\t}\n\n\tprivate static void parseCommandLineArguments(String[] args) {\n\t\tswitch (args.length) {\n\t\t\tcase 1:\n\t\t\t\tmqttServerHostName = args[0];\n\t\t\t\tmqttServerPort = DFLT_MQTT_PORT;\n\t\t\t\tbreak;\n\t\t\tcase 2:\n\t\t\t\tmqttServerHostName = args[0];\n\t\t\t\tmqttServerPort = Integer.parseInt(args[1]);\n\t\t\t\tbreak;\n\t\t\tdefault:\n\t\t\t\tmqttServerHostName = DFLT_MQTT_SERVER_HOST_NAME;\n\t\t\t\tmqttServerPort = DFLT_MQTT_PORT;\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pi/dio/DigitalOutputPin.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pi.dio;\n\n/**\n * Defines digital output pin\n */\npublic class DigitalOutputPin extends DioPin {\n\n\tpublic DigitalOutputPin(String name) {\n\t\tsuper(name);\n\t}\n\n\t/**\n\t * Sets pin high\n\t * \n\t * @throws DioException\n\t */\n\tpublic void setHigh() throws DioException {\n\t\ttry {\n\t\t\tgetGpioPin().setValue(true);\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"failed to set \" + getPinName() + \" HIGH\", e);\n\t\t}\n\t}\n\n\t/**\n\t * Sets pin low\n\t * \n\t * @throws DioException\n\t */\n\tpublic void setLow() throws DioException {\n\t\ttry {\n\t\t\tgetGpioPin().setValue(false);\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"failed to set \" + getPinName() + \" LOW\", e);\n\t\t}\n\t}\n\n\t/**\n\t * Sets pin state\n\t * \n\t * @param state - pin state as {@link boolean}\n\t * @throws DioException\n\t */\n\tpublic void setState(boolean state) throws DioException {\n\t\ttry {\n\t\t\tgetGpioPin().setValue(state);\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"failed to set state of \" + getPinName() + \" to \" + state, e);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pi/dio/DioException.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pi.dio;\n\n/**\n * Defines DIO Exception\n */\npublic class DioException extends Exception {\n\n\tprivate static final long serialVersionUID = 1L;\n\n\tpublic DioException(String msg) {\n\t\tsuper(msg);\n\t}\n\n\tpublic DioException(String message, Throwable cause) {\n\t\tsuper(message, cause);\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pi/dio/DioLibrary.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pi.dio;\n\nimport java.io.File;\nimport java.io.FileInputStream;\nimport java.io.InputStream;\nimport java.lang.reflect.Field;\nimport java.nio.file.Files;\nimport java.nio.file.Path;\nimport java.nio.file.StandardCopyOption;\nimport java.security.DigestInputStream;\nimport java.security.MessageDigest;\nimport java.security.NoSuchAlgorithmException;\nimport java.util.ArrayList;\nimport java.util.Arrays;\nimport java.util.List;\n\nimport javax.xml.bind.annotation.adapters.HexBinaryAdapter;\n\n/**\n * Defines a class that handles libdio.so native library\n */\npublic class DioLibrary {\n\n\tprivate static final String DFLT_LIBDIO_PATH = \"/usr/local/lib/dio\";\n\tprivate static final String LIBDIO = \"/libdio.so\";\n\tprivate static final String JAVA_LIBRARY_PATH_PROP_NAME = \"java.library.path\";\n\tprivate static final String FAILED_CALCULATE_MD5_MSG = \"Failed to calculate MD5 sum\";\n\n\tprivate static DioLibrary instance;\n\tprivate String libdioPath;\n\n\tprivate DioLibrary() {\n\t\tlibdioPath = DFLT_LIBDIO_PATH;\n\t}\n\n\tprivate DioLibrary(String pathToLibdio) {\n\t\tlibdioPath = pathToLibdio;\n\t}\n\n\t/**\n\t * Gets an instance of the DioLibrary class\n\t * \n\t * @return an instance of the DioLibrary class as {@link DioLibrary}\n\t */\n\tpublic static DioLibrary getInstance() {\n\t\tif (instance == null) {\n\t\t\tinstance = new DioLibrary();\n\t\t}\n\t\treturn instance;\n\t}\n\n\t/**\n\t * 
Gets an instance of the DioLibrary class\n\t * \n\t * @param pathToLibdio - path to the libdio.so as {@link String}\n\t * @return an instance of the DioLibrary class as {@link DioLibrary}\n\t */\n\tpublic static DioLibrary getInstance(String pathToLibdio) {\n\t\tif (instance == null) {\n\t\t\tinstance = new DioLibrary(pathToLibdio);\n\t\t}\n\t\treturn instance;\n\t}\n\n\t/**\n\t * Reports java library path\n\t * \n\t * @return java library path as {@link String}\n\t */\n\tpublic String getJavaLibraryPath() {\n\t\treturn this.libdioPath;\n\t}\n\n\t/**\n\t * Sets java library path\n\t * \n\t * @param pathToLibdio - path to the libdio.so as {@link String}\n\t * @throws DioException\n\t */\n\tpublic void setJavaLibraryPath(String pathToLibdio) throws DioException {\n\t\tthis.libdioPath = pathToLibdio;\n\t\tsetJavaLibraryPath();\n\t}\n\n\t/**\n\t * Sets java library path\n\t * \n\t * @throws DioException\n\t */\n\tpublic void setJavaLibraryPath() throws DioException {\n\t\ttry {\n\t\t\tField field = ClassLoader.class.getDeclaredField(\"usr_paths\");\n\t\t\tfield.setAccessible(true);\n\t\t\tList<String> paths = new ArrayList<>(Arrays.asList((String[]) field.get(null)));\n\t\t\t// Add the path only if it is not already present\n\t\t\tif (!paths.contains(this.libdioPath)) {\n\t\t\t\tpaths.add(this.libdioPath);\n\t\t\t}\n\t\t\tfield.set(null, paths.stream().toArray(String[]::new));\n\t\t\taddToJavaLibraryPath(this.libdioPath);\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"Failed to set java path to the libdio.so library\", e);\n\t\t}\n\t}\n\n\t/**\n\t * Sets libdio.so native library by copying it from the libdio.so resource if necessary\n\t * \n\t * @throws DioException\n\t */\n\tpublic void setDioLibrary() throws DioException {\n\t\tFile targetLibdioPath = new File(libdioPath);\n\t\tFile targetLibdioFile = new File(formAbsolutePathToTargetLibdioFile());\n\t\tif (targetLibdioFile.exists()\n\t\t\t\t&& getMD5sumOfTargetLibdio(targetLibdioFile).equals(getMD5sumOfResourceLibdio())) {\n\t\t\t// the 
/usr/local/lib/dio/libdio.so file is already in place\n\t\t\treturn;\n\t\t}\n\t\tif (!targetLibdioPath.exists()) {\n\t\t\ttargetLibdioPath.mkdirs();\n\t\t}\n\t\tcopyDioLibrary(targetLibdioFile.toPath());\n\t}\n\n\tprivate void copyDioLibrary(Path path) throws DioException {\n\t\ttry (InputStream is = getClass().getResourceAsStream(LIBDIO)) {\n\t\t\tFiles.copy(is, path, StandardCopyOption.REPLACE_EXISTING);\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"Failed to set LIBDIO shared library\", e);\n\t\t}\n\t}\n\n\tprivate String getMD5sumOfResourceLibdio() throws DioException {\n\t\tString md5;\n\t\ttry (InputStream is = getClass().getResourceAsStream(LIBDIO)) {\n\t\t\tmd5 = getMD5(is);\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(FAILED_CALCULATE_MD5_MSG + \" of the libdio.so resource file\", e);\n\t\t}\n\t\treturn md5;\n\t}\n\n\tprivate String getMD5sumOfTargetLibdio(File file) throws DioException {\n\t\tString md5;\n\t\ttry (InputStream is = new FileInputStream(file)) {\n\t\t\tmd5 = getMD5(is);\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(FAILED_CALCULATE_MD5_MSG + \" of the \" + file.getAbsolutePath() + \" file\", e);\n\t\t}\n\t\treturn md5;\n\t}\n\n\tprivate String getMD5(InputStream is) throws DioException {\n\t\tString md5;\n\t\tMessageDigest md;\n\t\ttry {\n\t\t\tmd = MessageDigest.getInstance(\"MD5\");\n\t\t} catch (NoSuchAlgorithmException e) {\n\t\t\tthrow new DioException(\"Failed to get an instance of MD5\", e);\n\t\t}\n\t\ttry (DigestInputStream dis = new DigestInputStream(is, md)) {\n\t\t\t// Read to end of stream; available() may return 0 before EOF, so loop on read()\n\t\t\twhile (dis.read() != -1) {\n\t\t\t\t// reading advances the digest\n\t\t\t}\n\t\t\tmd5 = (new HexBinaryAdapter()).marshal(md.digest());\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(FAILED_CALCULATE_MD5_MSG, e);\n\t\t}\n\t\treturn md5;\n\t}\n\n\tprivate void addToJavaLibraryPath(String path) {\n\t\tStringBuilder sbPath = new 
StringBuilder().append(System.getProperty(JAVA_LIBRARY_PATH_PROP_NAME))\n\t\t\t\t.append(File.pathSeparator).append(path);\n\t\tSystem.setProperty(JAVA_LIBRARY_PATH_PROP_NAME, sbPath.toString());\n\t}\n\n\tprivate String formAbsolutePathToTargetLibdioFile() {\n\t\treturn new StringBuilder().append(this.libdioPath).append(LIBDIO).toString();\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pi/dio/DioPin.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pi.dio;\n\nimport jdk.dio.DeviceManager;\nimport jdk.dio.gpio.GPIOPin;\nimport jdk.dio.gpio.GPIOPinConfig;\n\n/**\n * Defines digital I/O pin\n */\npublic class DioPin {\n\n\tprivate String pinName;\n\tprivate GPIOPin gpioPin;\n\n\tpublic DioPin(String name) {\n\t\tthis.pinName = name;\n\t}\n\n\t/**\n\t * Reports pin name\n\t * \n\t * @return pin name as {@link String}\n\t */\n\tpublic String getPinName() {\n\t\treturn this.pinName;\n\t}\n\n\t/**\n\t * Gets GPIO pin\n\t * \n\t * @return GPIO pin as {@link GPIOPin}\n\t */\n\tpublic GPIOPin getGpioPin() {\n\t\treturn this.gpioPin;\n\t}\n\n\t/**\n\t * Sets GPIO pin\n\t * \n\t * @param gpioPin - GPIO pin as {@link GPIOPin}\n\t */\n\tpublic void setGpioPin(GPIOPin gpioPin) {\n\t\tthis.gpioPin = gpioPin;\n\t}\n\n\tpublic static GPIOPin open(String pinName, GPIOPinConfig gpioPinConfig) throws DioException {\n\t\tGPIOPin pin;\n\t\ttry {\n\t\t\tpin = DeviceManager.open(GPIOPin.class, gpioPinConfig);\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"failed to open GPIO pin: \" + pinName, e);\n\t\t}\n\t\treturn pin;\n\t}\n\n\t/**\n\t * Closes GPIO pin\n\t * \n\t * @throws DioException\n\t */\n\tpublic void close() throws DioException {\n\t\ttry {\n\t\t\tif (this.gpioPin.isOpen()) {\n\t\t\t\tthis.gpioPin.close();\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"failed to close \" + this.pinName, e);\n\t\t}\n\t}\n\n\t/**\n\t * 
Reports GPIO pin state\n\t * \n\t * @return GPIO pin state as {@link boolean}\n\t * @throws DioException\n\t */\n\tpublic boolean isHigh() throws DioException {\n\t\tboolean ret = false;\n\t\ttry {\n\t\t\tret = this.gpioPin.getValue();\n\t\t} catch (Exception e) {\n\t\t\tthrow new DioException(\"Failed to obtain the state of \" + this.pinName, e);\n\t\t}\n\t\treturn ret;\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pi/dio/PinDirection.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pi.dio;\n\n/**\n * Enumerates DIO pin directions\n */\npublic enum PinDirection {\n\tINPUT,\n\tOUTPUT;\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pi/dio/Pins.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pi.dio;\n\n/**\n * Enumerates Raspberry Pi pins\n */\npublic enum Pins {\n\tP3(2, \"GPIO2 (SDA1, I2C)\"),\n\tP5(3, \"GPIO3 (SCL1, I2C)\"),\n\tP7(4, \"GPIO4 (GPIO_GCLK)\"),\n\tP8(14, \"GPIO14 (TXD0)\"),\n\tP10(15, \"GPIO15 (RXD0)\"),\n\tP11(17, \"GPIO17 (GPIO_GEN0)\"),\n\tP12(18, \"GPIO18 (GPIO_GEN1)\"),\n\tP13(27, \"GPIO27 (GPIO_GEN2)\"),\n\tP15(22, \"GPIO22 (GPIO_GEN3)\"),\n\tP16(23, \"GPIO23 (GPIO_GEN4)\"),\n\tP18(24, \"GPIO24 (GPIO_GEN5)\"),\n\tP19(10, \"GPIO10 (SPI_MOSI)\"),\n\tP21(9, \"GPIO9 (SPI_MISO)\"),\n\tP22(25, \"GPIO25 (GPIO_GEN6)\"),\n\tP23(11, \"GPIO11 (SPI_CLK)\"),\n\tP24(8, \"GPIO8 (SPI_CE0_N)\"),\n\tP26(7, \"GPIO7 (SPI_CE1_N)\"),\n\tP27(0, \"ID_SD (I2C ID EEPROM)\"),\n\tP28(1, \"ID_SC (I2C ID EEPROM)\"),\n\tP29(5, \"GPIO5\"),\n\tP31(6, \"GPIO6\"),\n\tP32(12, \"GPIO12\"),\n\tP33(13, \"GPIO13\"),\n\tP35(19, \"GPIO19\"),\n\tP36(16, \"GPIO16\"),\n\tP37(26, \"GPIO26\"),\n\tP38(20, \"GPIO20\"),\n\tP40(21, \"GPIO21\");\n\n\tprivate int gpio;\n\tprivate String name;\n\n\tprivate Pins(int gpio, String name) {\n\t\tthis.gpio = gpio;\n\t\tthis.name = name;\n\t}\n\n\tpublic int getGPIO() {\n\t\treturn this.gpio;\n\t}\n\n\tpublic String getName() {\n\t\treturn this.name;\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pi/system/BoardModels.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pi.system;\n\n/**\n * Enumerates Raspberry Pi board models\n */\npublic enum BoardModels {\n\t// Obtained from https://www.raspberrypi.org/documentation/hardware/raspberrypi/revision-codes/README.md\n\tCODE_900021(\"900021\", Constants.A_PLUS, 1.1f, 512, Constants.SONY_UK),\n\tCODE_900032(\"900032\", Constants.B_PLUS, 1.2f, 512, Constants.SONY_UK),\n\tCODE_900092(\"900092\", Constants.ZERO, 1.2f, 512, Constants.SONY_UK),\n\tCODE_920092(\"920092\", Constants.ZERO, 1.2f, 512, Constants.EMBEST),\n\tCODE_900093(\"900093\", Constants.ZERO, 1.3f, 512, Constants.SONY_UK),\n\tCODE_9000C1(\"9000c1\", Constants.ZERO_W, 1.1f, 512, Constants.SONY_UK),\n\tCODE_920093(\"920093\", Constants.ZERO, 1.3f, 512, Constants.SONY_UK),\n\tCODE_A01040(\"a01040\", Constants.B2, 1.0f, 1024, Constants.SONY_UK),\n\tCODE_A01041(\"a01041\", Constants.B2, 1.1f, 1024, Constants.SONY_UK),\n\tCODE_A02082(\"a02082\", Constants.B3, 1.2f, 1024, Constants.SONY_UK),\n\tCODE_A020A0(\"a020a0\", Constants.CM3, 1.0f, 1024, Constants.SONY_UK),\n\tCODE_A21041(\"a21041\", Constants.B2, 1.1f, 1024, Constants.EMBEST),\n\tCODE_A22042(\"a22042\", Constants.B2_BCM2837, 1.2f, 1024, Constants.EMBEST),\n\tCODE_A22082(\"a22082\", Constants.B3, 1.2f, 1024, Constants.EMBEST),\n\tCODE_A32082(\"a32082\", Constants.B3, 1.2f, 1024, Constants.SONY_JAPAN),\n\tCODE_A52082(\"a52082\", Constants.B3, 1.2f, 1024, Constants.STADIUM),\n\tCODE_A020D3(\"a020d3\", 
Constants.B3_PLUS, 1.3f, 1024, Constants.SONY_UK),\n\tCODE_9020E0(\"9020e0\", Constants.A3_PLUS, 1.0f, 512, Constants.SONY_UK);\n\n\tprivate String code;\n\tprivate String model;\n\tprivate float revision;\n\tprivate int ramSize;\n\tprivate String manufacturer;\n\n\tprivate BoardModels(String code, String model, float revision, int ramSize, String manufacturer) {\n\t\tthis.code = code;\n\t\tthis.model = model;\n\t\tthis.revision = revision;\n\t\tthis.ramSize = ramSize;\n\t\tthis.manufacturer = manufacturer;\n\t}\n\n\tpublic String getCode() {\n\t\treturn this.code;\n\t}\n\n\tpublic String getModel() {\n\t\treturn this.model;\n\t}\n\n\tpublic float getRevision() {\n\t\treturn this.revision;\n\t}\n\n\tpublic int getRamSize() {\n\t\treturn this.ramSize;\n\t}\n\n\tpublic String getManufacturer() {\n\t\treturn this.manufacturer;\n\t}\n\n\tprivate static class Constants {\n\t\tprivate static final String A_PLUS = \"A+\";\n\t\tprivate static final String B_PLUS = \"B+\";\n\t\tprivate static final String ZERO = \"Zero\";\n\t\tprivate static final String ZERO_W = \"Zero W\";\n\t\tprivate static final String B2 = \"2B\";\n\t\tprivate static final String B3 = \"3B\";\n\t\tprivate static final String CM3 = \"CM3\";\n\t\tprivate static final String B2_BCM2837 = \"2B (with BCM2837)\";\n\t\tprivate static final String A3_PLUS = \"3A+\";\n\t\tprivate static final String B3_PLUS = \"3B+\";\n\n\t\tprivate static final String SONY_UK = \"Sony UK\";\n\t\tprivate static final String EMBEST = \"Embest\";\n\t\tprivate static final String SONY_JAPAN = \"Sony Japan\";\n\t\tprivate static final String STADIUM = \"Stadium\";\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pi/system/SystemInfo.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pi.system;\n\nimport java.io.BufferedReader;\nimport java.io.FileReader;\nimport java.io.InputStreamReader;\nimport java.util.HashMap;\nimport java.util.Map;\n\n/**\n * Defines Raspberry Pi system info\n */\npublic class SystemInfo {\n\n\tprivate static final String OS_FW_BUILD_KEY = \"getOsFirmwareBuild\";\n\tprivate static final String HARDWARE_KEY = \"Hardware\";\n\tprivate static final String REVISION_KEY = \"Revision\";\n\tprivate static final String SERIAL_NUM_KEY = \"Serial\";\n\tprivate static final String BOARD_MODEL_KEY = \"BoardModel\";\n\tprivate static final String BOARD_MANUFACTURER_KEY = \"BoardManufacturer\";\n\n\tprivate static SystemInfo instance;\n\n\tprivate Map<String, String> sysInfo;\n\tprivate Map<String, BoardModels> boardModelInfo;\n\n\tprivate SystemInfo() throws SystemInfoException {\n\t\tthis.sysInfo = new HashMap<>();\n\t\tthis.boardModelInfo = new HashMap<>();\n\t\tpopulateBoardModels();\n\t\tgetSystemInfo();\n\t}\n\n\t/**\n\t * Gets an instance of the SystemInfo class\n\t * \n\t * @return instance of the SystemInfo class as {@link SystemInfo}\n\t * @throws SystemInfoException\n\t */\n\tpublic static SystemInfo getInstance() throws SystemInfoException {\n\t\tif (instance == null) {\n\t\t\tinstance = new SystemInfo();\n\t\t}\n\t\treturn instance;\n\t}\n\n\t/**\n\t * Reports OS Firmware build\n\t * \n\t * @return OS Firmware build as {@link String}\n\t */\n\tpublic String 
getOsFirmwareBuild() {\n\t\treturn this.sysInfo.getOrDefault(OS_FW_BUILD_KEY, \"\");\n\t}\n\n\t/**\n\t * Reports Raspberry Pi model\n\t * \n\t * @return model as {@link String}\n\t */\n\tpublic String getModel() {\n\t\treturn this.sysInfo.getOrDefault(BOARD_MODEL_KEY, \"\");\n\t}\n\n\t/**\n\t * Reports Raspberry Pi manufacturer\n\t * \n\t * @return manufacturer as {@link String}\n\t */\n\tpublic String getManufacturer() {\n\t\treturn this.sysInfo.getOrDefault(BOARD_MANUFACTURER_KEY, \"\");\n\t}\n\n\t/**\n\t * Reports Raspberry Pi hardware information\n\t * \n\t * @return hardware information as {@link String}\n\t */\n\tpublic String getHardware() {\n\t\treturn this.sysInfo.getOrDefault(HARDWARE_KEY, \"\");\n\t}\n\n\t/**\n\t * Reports revision\n\t * \n\t * @return revision as {@link String}\n\t */\n\tpublic String getRevision() {\n\t\treturn this.sysInfo.getOrDefault(REVISION_KEY, \"\");\n\t}\n\n\tprivate void populateBoardModels() {\n\t\tboardModelInfo.put(BoardModels.CODE_900021.getCode(), BoardModels.CODE_900021);\n\t\tboardModelInfo.put(BoardModels.CODE_900032.getCode(), BoardModels.CODE_900032);\n\t\tboardModelInfo.put(BoardModels.CODE_900092.getCode(), BoardModels.CODE_900092);\n\t\tboardModelInfo.put(BoardModels.CODE_920092.getCode(), BoardModels.CODE_920092);\n\t\tboardModelInfo.put(BoardModels.CODE_900093.getCode(), BoardModels.CODE_900093);\n\t\tboardModelInfo.put(BoardModels.CODE_9000C1.getCode(), 
BoardModels.CODE_9000C1);\n\t\tboardModelInfo.put(BoardModels.CODE_920093.getCode(), BoardModels.CODE_920093);\n\t\tboardModelInfo.put(BoardModels.CODE_A01040.getCode(), BoardModels.CODE_A01040);\n\t\tboardModelInfo.put(BoardModels.CODE_A01041.getCode(), BoardModels.CODE_A01041);\n\t\tboardModelInfo.put(BoardModels.CODE_A02082.getCode(), BoardModels.CODE_A02082);\n\t\tboardModelInfo.put(BoardModels.CODE_A020A0.getCode(), BoardModels.CODE_A020A0);\n\t\tboardModelInfo.put(BoardModels.CODE_A21041.getCode(), BoardModels.CODE_A21041);\n\t\tboardModelInfo.put(BoardModels.CODE_A22042.getCode(), BoardModels.CODE_A22042);\n\t\tboardModelInfo.put(BoardModels.CODE_A22082.getCode(), BoardModels.CODE_A22082);\n\t\tboardModelInfo.put(BoardModels.CODE_A32082.getCode(), BoardModels.CODE_A32082);\n\t\tboardModelInfo.put(BoardModels.CODE_A52082.getCode(), BoardModels.CODE_A52082);\n\t\tboardModelInfo.put(BoardModels.CODE_A020D3.getCode(), BoardModels.CODE_A020D3);\n\t\tboardModelInfo.put(BoardModels.CODE_9020E0.getCode(), BoardModels.CODE_9020E0);\n\t}\n\n\tprivate void getSystemInfo() throws SystemInfoException {\n\t\ttry (FileReader fr = new FileReader(\"/proc/cpuinfo\"); BufferedReader br = new BufferedReader(fr)) {\n\t\t\tString line;\n\t\t\twhile ((line = br.readLine()) != null) {\n\t\t\t\tif (line.startsWith(HARDWARE_KEY)) {\n\t\t\t\t\tthis.sysInfo.put(HARDWARE_KEY, line.substring(line.indexOf(':') + 1).trim());\n\t\t\t\t} else if (line.startsWith(REVISION_KEY)) {\n\t\t\t\t\tString revision = line.substring(line.indexOf(':') + 1).trim();\n\t\t\t\t\tthis.sysInfo.put(REVISION_KEY, revision);\n\t\t\t\t\tBoardModels boardModel = this.boardModelInfo.get(revision);\n\t\t\t\t\t// Guard against revision codes missing from boardModelInfo (e.g. newer boards) to avoid an NPE\n\t\t\t\t\tif (boardModel != null) {\n\t\t\t\t\t\tthis.sysInfo.put(BOARD_MODEL_KEY, boardModel.getModel());\n\t\t\t\t\t\tthis.sysInfo.put(BOARD_MANUFACTURER_KEY, boardModel.getManufacturer());\n\t\t\t\t\t}\n\t\t\t\t} else if (line.startsWith(SERIAL_NUM_KEY)) {\n\t\t\t\t\tthis.sysInfo.put(SERIAL_NUM_KEY, line.substring(line.indexOf(':') + 
1).trim());\n\t\t\t\t}\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tthrow new SystemInfoException(\"failed to obtain system info\", e);\n\t\t}\n\t\tgetOsFirmwareVersion();\n\t}\n\n\tprivate void getOsFirmwareVersion() throws SystemInfoException {\n\t\tProcess p;\n\t\ttry {\n\t\t\tp = Runtime.getRuntime().exec(\"/opt/vc/bin/vcgencmd version\");\n\t\t\tp.waitFor();\n\t\t} catch (Exception e) {\n\t\t\tthrow new SystemInfoException(\"failed to obtain OS FW Version info\", e);\n\t\t}\n\t\ttry (InputStreamReader isr = new InputStreamReader(p.getInputStream());\n\t\t\t\tBufferedReader br = new BufferedReader(isr)) {\n\t\t\tString line;\n\t\t\twhile ((line = br.readLine()) != null) {\n\t\t\t\tif (line.startsWith(\"version\")) {\n\t\t\t\t\tthis.sysInfo.put(OS_FW_BUILD_KEY, line.substring(\"version\".length()).trim());\n\t\t\t\t\tbreak;\n\t\t\t\t}\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tthrow new SystemInfoException(\"failed to parse OS FW Version info\", e);\n\t\t}\n\t}\n}\n"
  },
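The getSystemInfo() method in SystemInfo.java above extracts Hardware, Revision, and Serial from /proc/cpuinfo by taking everything after the first ':' on each line and trimming it. A self-contained sketch of that parsing step (the class name and sample input are illustrative, not part of Tahu):

```java
import java.util.HashMap;
import java.util.Map;

public class CpuInfoParser {
	// Parse "Key : value" lines the way SystemInfo does: take everything
	// after the first ':' and trim surrounding whitespace.
	public static Map<String, String> parse(String cpuInfo) {
		Map<String, String> result = new HashMap<>();
		for (String line : cpuInfo.split("\n")) {
			int idx = line.indexOf(':');
			if (idx < 0) {
				continue; // skip lines without a key/value separator
			}
			result.put(line.substring(0, idx).trim(), line.substring(idx + 1).trim());
		}
		return result;
	}

	public static void main(String[] args) {
		String sample = "Hardware\t: BCM2835\nRevision\t: a02082\nSerial\t\t: 00000000abcdef12\n";
		Map<String, String> info = parse(sample);
		System.out.println(info.get("Revision")); // prints "a02082"
	}
}
```

Keying on the trimmed text before the colon, rather than startsWith as in the original, also avoids accidentally matching unrelated lines that share a prefix.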
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pi/system/SystemInfoException.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pi.system;\n\n/**\n * Defines SystemInfoException\n */\npublic class SystemInfoException extends Exception {\n\tprivate static final long serialVersionUID = 1L;\n\n\tpublic SystemInfoException(String msg) {\n\t\tsuper(msg);\n\t}\n\n\tpublic SystemInfoException(String message, Throwable cause) {\n\t\tsuper(message, cause);\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pibrella/Pibrella.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pibrella;\n\nimport java.io.IOException;\nimport java.util.EnumMap;\nimport java.util.Map;\n\nimport org.eclipse.tahu.pi.dio.DioException;\n\nimport jdk.dio.gpio.GPIOPin;\n\n/**\n * Defines Pibrella class\n */\npublic class Pibrella {\n\n\tprivate static Pibrella instance;\n\n\t// Pibrella I/O pins\n\tprivate Map<PibrellaPins, GPIOPin> pins;\n\n\tprivate Pibrella() {\n\t\tthis.pins = new EnumMap<>(PibrellaPins.class);\n\t}\n\n\t/**\n\t * Gets an instance of Pibrella class\n\t * \n\t * @return an instance of Pibrella class as {@link Pibrella}\n\t */\n\tpublic static Pibrella getInstance() {\n\t\tif (instance == null) {\n\t\t\tinstance = new Pibrella();\n\t\t}\n\t\treturn instance;\n\t}\n\n\t/**\n\t * Gets registered Pibrella I/O pins\n\t * \n\t * @return a map of registered Pibrella I/O pins as {@link Map<PibrellaPins, GPIOPin>}\n\t */\n\tpublic Map<PibrellaPins, GPIOPin> getRegisteredPins() {\n\t\treturn this.pins;\n\t}\n\n\t/**\n\t * Registers specified Pibrella I/O pin\n\t * \n\t * @param pin - Pibrella pin as {@link PibrellaPins}\n\t * @param gpioPin - GPIO pin as {@link GPIOPin}\n\t */\n\tpublic void registerPin(PibrellaPins pin, GPIOPin gpioPin) {\n\t\tthis.pins.put(pin, gpioPin);\n\t}\n\n\t/**\n\t * Gets specified Pibrella input\n\t * \n\t * @param input - Pibrella input pin as {@link PibrellaInputPins}\n\t * @return Pibrella Input pin as {@link PibrellaInputPin}\n\t * @throws DioException\n\t 
*/\n\tpublic PibrellaInputPin getInput(PibrellaInputPins input) throws DioException {\n\t\treturn PibrellaInputPin.getInstance(input);\n\t}\n\n\t/**\n\t * Gets specified Pibrella output\n\t * \n\t * @param output Pibrella output pin as {@link PibrellaOutputPins}\n\t * @return Pibrella output pin as {@link PibrellaOutputPin}\n\t * @throws DioException\n\t */\n\tpublic PibrellaOutputPin getOutput(PibrellaOutputPins output) throws DioException {\n\t\treturn PibrellaOutputPin.getInstance(output);\n\t}\n\n\t/**\n\t * Gets Pibrella button\n\t * \n\t * @return Pibrella button as {@link PibrellaButton}\n\t * @throws DioException\n\t */\n\tpublic PibrellaButton getButton() throws DioException {\n\t\treturn PibrellaButton.getInstance();\n\t}\n\n\t/**\n\t * Gets Pibrella buzzer\n\t * \n\t * @return Pibrella buzzer as {@link PibrellaBuzzer}\n\t * @throws DioException\n\t */\n\tpublic PibrellaBuzzer getBuzzer() throws DioException {\n\t\treturn PibrellaBuzzer.getInstance();\n\t}\n\n\t/**\n\t * Gets specified Pibrella LED\n\t * \n\t * @param led - Pibrella LED as {@link PibrellaLEDs}\n\t * @return Pibrella LED as {@link PibrellaLED}\n\t * @throws DioException\n\t */\n\tpublic PibrellaLED getLED(PibrellaLEDs led) throws DioException {\n\t\treturn PibrellaLED.getInstance(led);\n\t}\n\n\t/**\n\t * Closes all Pibrella I/O pins\n\t */\n\tpublic void closeAllIOpins() {\n\t\tthis.pins.keySet().forEach(pin -> {\n\t\t\tGPIOPin gpioPin = this.pins.get(pin);\n\t\t\ttry {\n\t\t\t\tgpioPin.close();\n\t\t\t} catch (IOException e) {\n\t\t\t\tSystem.out.println(\"Failed to close Pibrella pin: \" + pin.getDescription());\n\t\t\t}\n\t\t});\n\t}\n}\n"
  },
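Pibrella.getInstance() above lazily creates its singleton with an unsynchronized null check, which is fine for this single-threaded example but can construct two instances under concurrent access. The initialization-on-demand holder idiom gives lazy, thread-safe initialization without explicit locking; a minimal sketch (class name is illustrative):

```java
public class HolderSingleton {
	private HolderSingleton() {
	}

	// The JVM guarantees a class is initialized exactly once, on first use,
	// so INSTANCE is created lazily and safely without synchronization.
	private static final class Holder {
		private static final HolderSingleton INSTANCE = new HolderSingleton();
	}

	public static HolderSingleton getInstance() {
		return Holder.INSTANCE;
	}

	public static void main(String[] args) {
		System.out.println(HolderSingleton.getInstance() == HolderSingleton.getInstance()); // prints true
	}
}
```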
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pibrella/PibrellaButton.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pibrella;\n\nimport org.eclipse.tahu.pi.dio.DioException;\nimport org.eclipse.tahu.pi.dio.DioPin;\n\nimport jdk.dio.DeviceConfig;\nimport jdk.dio.gpio.GPIOPin;\nimport jdk.dio.gpio.GPIOPinConfig;\n\n/**\n * Defines Pibrella button\n */\npublic class PibrellaButton extends DioPin {\n\n\tprivate static PibrellaButton instance;\n\n\tprivate PibrellaButton() {\n\t\tsuper(PibrellaPins.BUTTON.getName());\n\t}\n\n\t/**\n\t * Gets an instance of Pibrella button\n\t * \n\t * @return instance of Pibrella button as {@link PibrellaButton}\n\t * @throws DioException\n\t */\n\tpublic static PibrellaButton getInstance() throws DioException {\n\t\tif (instance == null) {\n\t\t\tinstance = new PibrellaButton();\n\t\t}\n\t\tGPIOPin gpioPin = Pibrella.getInstance().getRegisteredPins().get(PibrellaPins.BUTTON);\n\t\tif (gpioPin == null || !gpioPin.isOpen()) {\n\t\t\tgpioPin = open(PibrellaPins.BUTTON.getName(),\n\t\t\t\t\tnew GPIOPinConfig(DeviceConfig.DEFAULT, PibrellaPins.BUTTON.getGPIO(), GPIOPinConfig.DIR_INPUT_ONLY,\n\t\t\t\t\t\t\tGPIOPinConfig.MODE_INPUT_PULL_DOWN, GPIOPinConfig.TRIGGER_BOTH_EDGES, false));\n\t\t\tinstance.setGpioPin(gpioPin);\n\t\t\tPibrella.getInstance().registerPin(PibrellaPins.BUTTON, gpioPin);\n\t\t}\n\t\treturn instance;\n\t}\n\n\t/**\n\t * Reports if button is pressed\n\t * \n\t * @return button state as {@link boolean}\n\t * @throws DioException\n\t */\n\tpublic boolean isPressed() throws 
DioException {\n\t\treturn isHigh();\n\t}\n}\n"
  },
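PibrellaButton above opens its pin with MODE_INPUT_PULL_DOWN and TRIGGER_BOTH_EDGES, so an edge listener sees every press and release, including mechanical contact bounce. A minimal time-window debounce that could sit in front of such a listener (the Debouncer class and the 50 ms window are illustrative, not part of the library):

```java
public class Debouncer {
	private final long debounceMs;
	private long lastAccepted;

	public Debouncer(long debounceMs) {
		this.debounceMs = debounceMs;
		// Initialize so the very first event is always accepted
		this.lastAccepted = -debounceMs;
	}

	// Accept an edge event only if at least debounceMs has elapsed
	// since the last accepted event.
	public boolean accept(long timestampMs) {
		if (timestampMs - lastAccepted >= debounceMs) {
			lastAccepted = timestampMs;
			return true;
		}
		return false;
	}

	public static void main(String[] args) {
		Debouncer d = new Debouncer(50);
		System.out.println(d.accept(0));  // prints true
		System.out.println(d.accept(20)); // prints false (bounce within 50 ms)
	}
}
```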
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pibrella/PibrellaBuzzer.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pibrella;\n\nimport org.eclipse.tahu.pi.dio.DigitalOutputPin;\nimport org.eclipse.tahu.pi.dio.DioException;\n\nimport jdk.dio.DeviceConfig;\nimport jdk.dio.gpio.GPIOPin;\nimport jdk.dio.gpio.GPIOPinConfig;\n\n/**\n * Defines Pibrella buzzer\n */\npublic class PibrellaBuzzer extends DigitalOutputPin {\n\n\tprivate static PibrellaBuzzer instance;\n\tprivate Object lock = new Object();\n\n\tprivate PibrellaBuzzer() {\n\t\tsuper(PibrellaPins.BUZZER.getName());\n\t}\n\n\t/**\n\t * Gets an instance of Pibrella buzzer\n\t * \n\t * @return instance of Pibrella buzzer as {@link PibrellaBuzzer}\n\t * @throws DioException\n\t */\n\tpublic static PibrellaBuzzer getInstance() throws DioException {\n\t\tif (instance == null) {\n\t\t\tinstance = new PibrellaBuzzer();\n\t\t}\n\t\tGPIOPin gpioPin = Pibrella.getInstance().getRegisteredPins().get(PibrellaPins.BUZZER);\n\t\tif (gpioPin == null || !gpioPin.isOpen()) {\n\t\t\tgpioPin = open(PibrellaPins.BUZZER.getName(),\n\t\t\t\t\tnew GPIOPinConfig(DeviceConfig.DEFAULT, PibrellaPins.BUZZER.getGPIO(),\n\t\t\t\t\t\t\tGPIOPinConfig.DIR_OUTPUT_ONLY, GPIOPinConfig.MODE_OUTPUT_PUSH_PULL,\n\t\t\t\t\t\t\tGPIOPinConfig.TRIGGER_NONE, false));\n\t\t\tinstance.setGpioPin(gpioPin);\n\t\t\tPibrella.getInstance().registerPin(PibrellaPins.BUZZER, gpioPin);\n\t\t}\n\t\treturn instance;\n\t}\n\n\t/**\n\t * Starts the buzzer at specified frequency for a specified duration (in 
milliseconds)\n\t *\n\t * @param frequency buzz frequency in Hz as {@link int}\n\t * @param duration number of milliseconds as {@link int}\n\t */\n\tpublic void buzz(int frequency, int duration) {\n\t\tnew Buzzer(frequency, duration).start();\n\t}\n\n\t/*\n\t * Defines buzzer\n\t */\n\tprivate class Buzzer extends Thread {\n\n\t\tprivate int frequency;\n\t\tprivate int duration;\n\n\t\tprivate Buzzer(int frequency, int duration) {\n\t\t\tthis.frequency = frequency;\n\t\t\tthis.duration = duration;\n\t\t}\n\n\t\t@Override\n\t\tpublic void run() {\n\t\t\tsynchronized (lock) {\n\t\t\t\tlong startTime = System.currentTimeMillis();\n\t\t\t\t// Clamp to at least 1 ms: integer division truncates to 0 above 500 Hz,\n\t\t\t\t// and lock.wait(0) would wait indefinitely\n\t\t\t\tint halfPeriod = Math.max(1, 1000 / (2 * this.frequency));\n\t\t\t\twhile ((System.currentTimeMillis() - startTime) < this.duration) {\n\t\t\t\t\ttry {\n\t\t\t\t\t\tsetHigh();\n\t\t\t\t\t\tlock.wait(halfPeriod);\n\t\t\t\t\t\tsetLow();\n\t\t\t\t\t\tlock.wait(halfPeriod);\n\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\tSystem.out.println(\"failed to buzz\");\n\t\t\t\t\t\tbreak;\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n}\n"
  },
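PibrellaBuzzer drives a square wave by toggling the pin every half period, computed as 1000 / (2 * frequency) milliseconds. Because this is integer division, the result truncates to 0 ms for frequencies above 500 Hz, so a driver using millisecond timed waits should clamp it to at least 1 ms. A sketch of the arithmetic (class name is illustrative):

```java
public class BuzzerTiming {
	// Half period of a square wave in whole milliseconds, clamped to at
	// least 1 ms so a timed wait never receives 0 (which would block forever
	// for Object.wait, or return immediately for Thread.sleep).
	public static int halfPeriodMillis(int frequencyHz) {
		return Math.max(1, 1000 / (2 * frequencyHz));
	}

	public static void main(String[] args) {
		System.out.println(halfPeriodMillis(100)); // prints 5
		System.out.println(halfPeriodMillis(440)); // prints 1
	}
}
```

Note that millisecond granularity caps a faithful square wave at 500 Hz; higher pitches need microsecond timers or hardware PWM.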
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pibrella/PibrellaInputPin.java",
"content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pibrella;\n\nimport java.util.EnumMap;\nimport java.util.Map;\n\nimport org.eclipse.tahu.pi.dio.DioException;\nimport org.eclipse.tahu.pi.dio.DioPin;\n\nimport jdk.dio.DeviceConfig;\nimport jdk.dio.gpio.GPIOPin;\nimport jdk.dio.gpio.GPIOPinConfig;\n\n/**\n * Defines Pibrella input pin\n */\npublic class PibrellaInputPin extends DioPin {\n\n\tprivate static Map<PibrellaInputPins, PibrellaInputPin> inputs = new EnumMap<>(PibrellaInputPins.class);\n\n\tprivate PibrellaInputPin(PibrellaInputPins input) {\n\t\tsuper(input.getName());\n\t}\n\n\t/**\n\t * Gets an instance of PibrellaInputPin class\n\t * \n\t * @param input - Pibrella input pin\n\t * @return instance of PibrellaInputPin class as {@link PibrellaInputPin}\n\t * @throws DioException\n\t */\n\tpublic static PibrellaInputPin getInstance(PibrellaInputPins input) throws DioException {\n\t\tPibrellaInputPin pibrellaInput = inputs.get(input);\n\t\tif (pibrellaInput == null) {\n\t\t\tpibrellaInput = new PibrellaInputPin(input);\n\t\t\tinputs.put(input, pibrellaInput);\n\t\t}\n\t\tGPIOPin gpioPin = Pibrella.getInstance().getRegisteredPins().get(input.getPin());\n\t\tif (gpioPin == null || !gpioPin.isOpen()) {\n\t\t\tgpioPin = open(input.getName(),\n\t\t\t\t\tnew GPIOPinConfig(DeviceConfig.DEFAULT, input.getPin().getGPIO(), GPIOPinConfig.DIR_INPUT_ONLY,\n\t\t\t\t\t\t\tGPIOPinConfig.MODE_INPUT_PULL_DOWN, GPIOPinConfig.TRIGGER_BOTH_EDGES, 
false));\n\t\t\tpibrellaInput.setGpioPin(gpioPin);\n\t\t\tPibrella.getInstance().registerPin(input.getPin(), gpioPin);\n\t\t}\n\t\treturn pibrellaInput;\n\t}\n\n\t/**\n\t * Closes all Pibrella input pins\n\t */\n\tpublic static void closeAll() {\n\t\tinputs.values().forEach(input -> {\n\t\t\ttry {\n\t\t\t\tinput.close();\n\t\t\t} catch (Exception e) {\n\t\t\t\tSystem.out.println(\"failed to close \" + input.getPinName());\n\t\t\t}\n\t\t});\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pibrella/PibrellaInputPins.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pibrella;\n\n/**\n * Enumerates Pibrella input pins\n */\npublic enum PibrellaInputPins {\n\tA(PibrellaPins.INA),\n\tB(PibrellaPins.INB),\n\tC(PibrellaPins.INC),\n\tD(PibrellaPins.IND);\n\n\tprivate PibrellaPins pin;\n\n\tprivate PibrellaInputPins(PibrellaPins pin) {\n\t\tthis.pin = pin;\n\t}\n\n\tpublic PibrellaPins getPin() {\n\t\treturn this.pin;\n\t}\n\n\tpublic int getGPIO() {\n\t\treturn this.pin.getGPIO();\n\t}\n\n\tpublic String getName() {\n\t\treturn this.getPin().getName();\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pibrella/PibrellaLED.java",
"content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pibrella;\n\nimport java.util.EnumMap;\nimport java.util.Map;\n\nimport org.eclipse.tahu.pi.dio.DigitalOutputPin;\nimport org.eclipse.tahu.pi.dio.DioException;\n\nimport jdk.dio.DeviceConfig;\nimport jdk.dio.gpio.GPIOPin;\nimport jdk.dio.gpio.GPIOPinConfig;\n\n/**\n * Defines Pibrella LED\n */\npublic class PibrellaLED extends DigitalOutputPin {\n\n\tprivate static Map<PibrellaLEDs, PibrellaLED> leds = new EnumMap<>(PibrellaLEDs.class);\n\n\tprivate PibrellaLED(PibrellaLEDs led) {\n\t\tsuper(led.getName());\n\t}\n\n\t/**\n\t * Gets an instance of PibrellaLED class\n\t * \n\t * @param led - Pibrella LED\n\t * @return instance of PibrellaLED class as {@link PibrellaLED}\n\t * @throws DioException\n\t */\n\tpublic static PibrellaLED getInstance(PibrellaLEDs led) throws DioException {\n\t\tPibrellaLED pibrellaLED = leds.get(led);\n\t\tif (pibrellaLED == null) {\n\t\t\tpibrellaLED = new PibrellaLED(led);\n\t\t\tleds.put(led, pibrellaLED);\n\t\t}\n\t\tGPIOPin gpioPin = Pibrella.getInstance().getRegisteredPins().get(led.getPin());\n\t\tif (gpioPin == null || !gpioPin.isOpen()) {\n\t\t\tgpioPin = open(led.getName(),\n\t\t\t\t\tnew GPIOPinConfig(DeviceConfig.DEFAULT, led.getPin().getGPIO(), GPIOPinConfig.DIR_OUTPUT_ONLY,\n\t\t\t\t\t\t\tGPIOPinConfig.MODE_OUTPUT_PUSH_PULL, GPIOPinConfig.TRIGGER_NONE, false));\n\t\t\tpibrellaLED.setGpioPin(gpioPin);\n\t\t\tPibrella.getInstance().registerPin(led.getPin(), 
gpioPin);\n\t\t}\n\t\treturn pibrellaLED;\n\t}\n\n\t/**\n\t * Closes all LEDs\n\t */\n\tpublic static void closeAll() {\n\t\tleds.values().forEach(led -> {\n\t\t\ttry {\n\t\t\t\tled.close();\n\t\t\t} catch (Exception e) {\n\t\t\t\tSystem.out.println(\"failed to close \" + led.getPinName());\n\t\t\t}\n\t\t});\n\t}\n\n\t/**\n\t * Turns LED on\n\t * \n\t * @throws DioException\n\t */\n\tpublic void turnOn() throws DioException {\n\t\tsetHigh();\n\t}\n\n\t/**\n\t * Turns LED off\n\t * \n\t * @throws DioException\n\t */\n\tpublic void turnOff() throws DioException {\n\t\tsetLow();\n\t}\n\n\t/**\n\t * Reports if LED is on\n\t * \n\t * @return LED state as {@link boolean}\n\t * @throws DioException\n\t */\n\tpublic boolean isOn() throws DioException {\n\t\treturn isHigh();\n\t}\n}\n"
  },
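PibrellaInputPin, PibrellaLED, and PibrellaOutputPin each keep one wrapper per enum constant in a static EnumMap, populated on first request. Map.computeIfAbsent expresses that lazy per-constant cache in a single call; a sketch with an illustrative enum and strings standing in for the pin wrapper objects:

```java
import java.util.EnumMap;
import java.util.Map;

public class EnumCacheDemo {
	public enum Led { GREEN, YELLOW, RED }

	// One cached value per enum constant, created on first request.
	private static final Map<Led, String> CACHE = new EnumMap<>(Led.class);

	public static String getInstance(Led led) {
		// computeIfAbsent combines the lookup and the lazy creation in one step
		return CACHE.computeIfAbsent(led, l -> "led:" + l.name().toLowerCase());
	}

	public static void main(String[] args) {
		System.out.println(getInstance(Led.GREEN)); // prints "led:green"
	}
}
```

EnumMap stores entries in a plain array indexed by ordinal, so lookups are faster and more compact than a general HashMap keyed by enum constants.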
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pibrella/PibrellaLEDs.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pibrella;\n\n/**\n * Enumerates Pibrella LEDs\n */\npublic enum PibrellaLEDs {\n\tGREEN(PibrellaPins.LEDG),\n\tYELLOW(PibrellaPins.LEDY),\n\tRED(PibrellaPins.LEDR);\n\n\tprivate PibrellaPins pin;\n\n\tprivate PibrellaLEDs(PibrellaPins pin) {\n\t\tthis.pin = pin;\n\t}\n\n\tpublic PibrellaPins getPin() {\n\t\treturn this.pin;\n\t}\n\n\tpublic int getGPIO() {\n\t\treturn this.pin.getGPIO();\n\t}\n\n\tpublic String getName() {\n\t\treturn this.getPin().getName();\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pibrella/PibrellaOutputPin.java",
"content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pibrella;\n\nimport java.util.EnumMap;\nimport java.util.Map;\n\nimport org.eclipse.tahu.pi.dio.DigitalOutputPin;\nimport org.eclipse.tahu.pi.dio.DioException;\n\nimport jdk.dio.DeviceConfig;\nimport jdk.dio.gpio.GPIOPin;\nimport jdk.dio.gpio.GPIOPinConfig;\n\n/**\n * Defines Pibrella output pin\n */\npublic class PibrellaOutputPin extends DigitalOutputPin {\n\n\tprivate static Map<PibrellaOutputPins, PibrellaOutputPin> outputs = new EnumMap<>(PibrellaOutputPins.class);\n\n\tprivate PibrellaOutputPin(PibrellaOutputPins output) {\n\t\tsuper(output.getName());\n\t}\n\n\t/**\n\t * Gets an instance of PibrellaOutputPin class\n\t * \n\t * @param output - Pibrella output pin\n\t * @return instance of PibrellaOutputPin class as {@link PibrellaOutputPin}\n\t * @throws DioException\n\t */\n\tpublic static PibrellaOutputPin getInstance(PibrellaOutputPins output) throws DioException {\n\t\tPibrellaOutputPin pibrellaOutput = outputs.get(output);\n\t\tif (pibrellaOutput == null) {\n\t\t\tpibrellaOutput = new PibrellaOutputPin(output);\n\t\t\toutputs.put(output, pibrellaOutput);\n\t\t}\n\t\tGPIOPin gpioPin = Pibrella.getInstance().getRegisteredPins().get(output.getPin());\n\t\tif (gpioPin == null || !gpioPin.isOpen()) {\n\t\t\gpioPin = open(output.getName(),\n\t\t\t\t\tnew GPIOPinConfig(DeviceConfig.DEFAULT, output.getPin().getGPIO(), 
GPIOPinConfig.DIR_OUTPUT_ONLY,\n\t\t\t\t\t\t\tGPIOPinConfig.MODE_OUTPUT_PUSH_PULL, GPIOPinConfig.TRIGGER_NONE, false));\n\t\t\tpibrellaOutput.setGpioPin(gpioPin);\n\t\t\tPibrella.getInstance().registerPin(output.getPin(), gpioPin);\n\t\t}\n\t\treturn pibrellaOutput;\n\t}\n\n\t/**\n\t * Closes all Pibrella output pins\n\t */\n\tpublic static void closeAll() {\n\t\toutputs.values().forEach(output -> {\n\t\t\ttry {\n\t\t\t\toutput.close();\n\t\t\t} catch (Exception e) {\n\t\t\t\tSystem.out.println(\"failed to close \" + output.getPinName());\n\t\t\t}\n\t\t});\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pibrella/PibrellaOutputPins.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pibrella;\n\n/**\n * Enumerates Pibrella output pins\n */\npublic enum PibrellaOutputPins {\n\tE(PibrellaPins.OUTE),\n\tF(PibrellaPins.OUTF),\n\tG(PibrellaPins.OUTG),\n\tH(PibrellaPins.OUTH);\n\n\tprivate PibrellaPins pin;\n\n\tprivate PibrellaOutputPins(PibrellaPins pin) {\n\t\tthis.pin = pin;\n\t}\n\n\tpublic PibrellaPins getPin() {\n\t\treturn this.pin;\n\t}\n\n\tpublic int getGPIO() {\n\t\treturn this.pin.getGPIO();\n\t}\n\n\tpublic String getName() {\n\t\treturn this.getPin().getName();\n\t}\n}\n"
  },
  {
    "path": "java/examples/raspberry_pi/src/main/java/org/eclipse/tahu/pibrella/PibrellaPins.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.pibrella;\n\nimport org.eclipse.tahu.pi.dio.PinDirection;\nimport org.eclipse.tahu.pi.dio.Pins;\n\n/**\n * Enumerates Pibrella I/O pins\n */\npublic enum PibrellaPins {\n\tINA(Pins.P21, PinDirection.INPUT, \"Input A\", \"Inputs/a\"),\n\tINB(Pins.P26, PinDirection.INPUT, \"Input B\", \"Inputs/b\"),\n\tINC(Pins.P24, PinDirection.INPUT, \"Input C\", \"Inputs/c\"),\n\tIND(Pins.P19, PinDirection.INPUT, \"Input D\", \"Inputs/d\"),\n\tBUTTON(Pins.P23, PinDirection.INPUT, \"Button\", \"button\"),\n\tOUTE(Pins.P15, PinDirection.OUTPUT, \"Output E\", \"Outputs/e\"),\n\tOUTF(Pins.P16, PinDirection.OUTPUT, \"Output F\", \"Outputs/f\"),\n\tOUTG(Pins.P18, PinDirection.OUTPUT, \"Output G\", \"Outputs/g\"),\n\tOUTH(Pins.P22, PinDirection.OUTPUT, \"Output H\", \"Outputs/h\"),\n\tLEDG(Pins.P7, PinDirection.OUTPUT, \"Green LED\", \"Outputs/LEDs/green\"),\n\tLEDY(Pins.P11, PinDirection.OUTPUT, \"Yellow LED\", \"Outputs/LEDs/yellow\"),\n\tLEDR(Pins.P13, PinDirection.OUTPUT, \"Red LED\", \"Outputs/LEDs/red\"),\n\tBUZZER(Pins.P12, PinDirection.OUTPUT, \"Buzzer\", \"buzzer\");\n\n\tprivate Pins pin;\n\tprivate PinDirection direction;\n\tprivate String name;\n\tprivate String description;\n\n\tprivate PibrellaPins(Pins pin, PinDirection direction, String name, String description) {\n\t\tthis.pin = pin;\n\t\tthis.direction = direction;\n\t\tthis.name = name;\n\t\tthis.description = description;\n\t}\n\n\tpublic Pins getPin() 
{\n\t\treturn this.pin;\n\t}\n\n\tpublic int getGPIO() {\n\t\treturn this.pin.getGPIO();\n\t}\n\n\tpublic PinDirection getDirection() {\n\t\treturn this.direction;\n\t}\n\n\tpublic String getName() {\n\t\treturn this.name;\n\t}\n\n\tpublic String getDescription() {\n\t\treturn this.description;\n\t}\n}\n"
  },
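PibrellaPins binds each constant to a GPIO pin, direction, display name, and description. When a lookup by one of those fields is needed (SystemInfo builds its revision-to-model map with one hand-written put per constant), the map can instead be derived once from values() in a static initializer. A sketch with a hypothetical cut-down enum:

```java
import java.util.HashMap;
import java.util.Map;

public class ReverseLookupDemo {
	public enum Pin {
		LEDG("Green LED"), LEDY("Yellow LED"), LEDR("Red LED");

		private final String name;

		Pin(String name) {
			this.name = name;
		}

		public String getName() {
			return name;
		}
	}

	// Built once from values(); no per-constant put calls to keep in sync
	// as constants are added or removed.
	private static final Map<String, Pin> BY_NAME = new HashMap<>();
	static {
		for (Pin p : Pin.values()) {
			BY_NAME.put(p.getName(), p);
		}
	}

	public static Pin fromName(String name) {
		return BY_NAME.get(name);
	}

	public static void main(String[] args) {
		System.out.println(fromName("Green LED")); // prints LEDG
	}
}
```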
  {
    "path": "java/examples/records/THIRD-PARTY.txt",
    "content": "\nLists of 5 third-party dependencies.\n     (New BSD license) Protocol Buffer Java API (com.google.protobuf:protobuf-java:2.6.1 - https://developers.google.com/protocol-buffers/)\n     (The Apache Software License, Version 2.0) Apache Log4j (log4j:log4j:1.2.17 - http://logging.apache.org/log4j/1.2/)\n     (Eclipse Public License - Version 1.0) org.eclipse.paho.client.mqttv3 (org.eclipse.paho:org.eclipse.paho.client.mqttv3:1.0.2 - http://www.eclipse.org/paho/org.eclipse.paho.client.mqttv3)\n     (MIT License) SLF4J API Module (org.slf4j:slf4j-api:1.7.5 - http://www.slf4j.org)\n     (MIT License) SLF4J LOG4J-12 Binding (org.slf4j:slf4j-log4j12:1.7.5 - http://www.slf4j.org)\n"
  },
  {
    "path": "java/examples/records/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu-examples</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>example_records</artifactId>\n  <packaging>jar</packaging>\n  <name>Sparkplug B Records Example</name>\n\n  <dependencies>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-core</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n  </dependencies>\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          <skipNexusStagingDeployMojo>true</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n\n      <!-- New Build mechanism - replaces maven-assembly-plugin -->\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-shade-plugin</artifactId>\n        <version>2.4.1</version>\n        <executions>\n          <execution>\n            <phase>package</phase>\n            <goals>\n              <goal>shade</goal>\n            
</goals>\n            <configuration>\n              <filters>\n                <filter>\n                  <artifact>*:*</artifact>\n                  <excludes>\n                    <exclude>META-INF/*.SF</exclude>\n                    <exclude>META-INF/*.DSA</exclude>\n                    <exclude>META-INF/*.RSA</exclude>\n                  </excludes>\n                </filter>\n              </filters>\n              <transformers>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ManifestResourceTransformer\">\n                  <mainClass>org.eclipse.tahu.SparkplugRecordsExample</mainClass>\n                </transformer>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ServicesResourceTransformer\" />\n              </transformers>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            <id>add-third-party</id>\n            <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n            <configuration>\n              <useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups> (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n                  Version 2.0</licenseMerge>\n              </licenseMerges>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n    
</plugins>\n  </build>\n</project>\n"
  },
  {
    "path": "java/examples/records/src/main/java/org/eclipse/tahu/SparkplugRecordsExample.java",
    "content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu;\n\nimport static org.eclipse.tahu.message.model.MetricDataType.Boolean;\nimport static org.eclipse.tahu.message.model.MetricDataType.Int64;\nimport static org.eclipse.tahu.message.model.MetricDataType.String;\n\nimport java.util.Date;\nimport java.util.Random;\nimport java.util.TreeMap;\nimport java.util.concurrent.ExecutorService;\nimport java.util.concurrent.Executors;\n\nimport javax.net.SocketFactory;\nimport javax.net.ssl.SSLSocketFactory;\n\nimport org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;\nimport org.eclipse.paho.client.mqttv3.MqttCallbackExtended;\nimport org.eclipse.paho.client.mqttv3.MqttClient;\nimport org.eclipse.paho.client.mqttv3.MqttConnectOptions;\nimport org.eclipse.paho.client.mqttv3.MqttException;\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.paho.client.mqttv3.MqttPersistenceException;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.PropertyDataType;\nimport org.eclipse.tahu.message.model.PropertySet.PropertySetBuilder;\nimport org.eclipse.tahu.message.model.PropertyValue;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\n\n/**\n * An example 
Sparkplug B application.\n */\npublic class SparkplugRecordsExample implements MqttCallbackExtended {\n\n\tprivate static final String NAMESPACE = \"spBv1.0\";\n\n\t// Configuration\n\tprivate static final boolean USING_REAL_TLS = false;\n\tprivate String serverUrl = \"tcp://localhost:1883\";\n\tprivate String groupId = \"Sparkplug B Devices\";\n\tprivate String edgeNode = \"Java Sparkplug B Example\";\n\tprivate String deviceId = \"SparkplugBExample\";\n\tprivate String clientId = \"SparkplugBRecordExample\";\n\tprivate String username = \"admin\";\n\tprivate String password = \"changeme\";\n\tprivate long eventPeriod = 1000; // Publish period in milliseconds\n\tprivate int numOfEventsPerPublish = 10;\n\tprivate ExecutorService executor;\n\tprivate MqttClient client;\n\n\tprivate int bdSeq = 0;\n\tprivate int seq = 0;\n\n\tprivate Object seqLock = new Object();\n\n\tpublic static void main(String[] args) {\n\t\tSparkplugRecordsExample example = new SparkplugRecordsExample();\n\t\texample.run();\n\t}\n\n\tpublic void run() {\n\t\ttry {\n\t\t\t// Random generator and thread pool for outgoing published messages\n\t\t\texecutor = Executors.newFixedThreadPool(1);\n\n\t\t\t// Build up DEATH payload - note DEATH payloads don't have a regular sequence number\n\t\t\tSparkplugBPayloadBuilder deathPayload = new SparkplugBPayloadBuilder().setTimestamp(new Date());\n\t\t\tdeathPayload = addBdSeqNum(deathPayload);\n\t\t\tbyte[] deathBytes = new SparkplugBPayloadEncoder().getBytes(deathPayload.createPayload(), false);\n\n\t\t\tMqttConnectOptions options = new MqttConnectOptions();\n\n\t\t\tif (USING_REAL_TLS) {\n\t\t\t\tSocketFactory sf = SSLSocketFactory.getDefault();\n\t\t\t\toptions.setSocketFactory(sf);\n\t\t\t}\n\n\t\t\t// Connect to the MQTT 
Server\n\t\t\toptions.setAutomaticReconnect(true);\n\t\t\toptions.setCleanSession(true);\n\t\t\toptions.setConnectionTimeout(30);\n\t\t\toptions.setKeepAliveInterval(30);\n\t\t\toptions.setUserName(username);\n\t\t\toptions.setPassword(password.toCharArray());\n\t\t\toptions.setWill(NAMESPACE + \"/\" + groupId + \"/NDEATH/\" + edgeNode, deathBytes, 0, false);\n\t\t\tclient = new MqttClient(serverUrl, clientId);\n\t\t\tclient.setTimeToWait(2000); // short timeout on failure to connect\n\t\t\tclient.setCallback(this);\n\t\t\tclient.connect(options);\n\n\t\t\t// Subscribe to control/command messages for both the edge of network node and the attached devices\n\t\t\tclient.subscribe(NAMESPACE + \"/\" + groupId + \"/NCMD/\" + edgeNode + \"/#\", 0);\n\t\t\tclient.subscribe(NAMESPACE + \"/\" + groupId + \"/DCMD/\" + edgeNode + \"/#\", 0);\n\n\t\t\t// Delay before starting to publish records\n\t\t\tThread.sleep(3000);\n\n\t\t\t// Loop forever publishing records\n\t\t\twhile (true) {\n\t\t\t\tString recordType = \"deviceEvents\";\n\t\t\t\t// Create the payload\n\t\t\t\tSparkplugBPayload payload = new SparkplugBPayloadBuilder(getSeqNum()).setTimestamp(new Date())\n\t\t\t\t\t\t.setUuid(newUUID()).createPayload();\n\n\t\t\t\t// Add records to payload\n\t\t\t\tfor (int i = 0; i < numOfEventsPerPublish; i++) {\n\t\t\t\t\tThread.sleep(eventPeriod);\n\t\t\t\t\tpayload.addMetric(newRecord(recordType));\n\t\t\t\t}\n\n\t\t\t\t// Publish the payload, if connected\n\t\t\t\tif (client.isConnected()) {\n\t\t\t\t\tsynchronized (seqLock) {\n\t\t\t\t\t\tSystem.out.println(\"Connected - publishing new records\");\n\t\t\t\t\t\tclient.publish(NAMESPACE + \"/\" + groupId + \"/DRECORD/\" + edgeNode + \"/\" + deviceId,\n\t\t\t\t\t\t\t\tnew SparkplugBPayloadEncoder().getBytes(payload, false), 0, false);\n\t\t\t\t\t}\n\t\t\t\t} else {\n\t\t\t\t\tSystem.out.println(\"Not connected - not publishing records\");\n\t\t\t\t}\n\t\t\t}\n\t\t} catch (Exception e) 
{\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\t/*\n\t * Returns a new metric representing a record.\n\t */\n\tprivate Metric newRecord(String type) throws SparkplugInvalidTypeException {\n\t\tRandom random = new Random();\n\t\tDate timestamp = new Date();\n\n\t\tSystem.out.println(\"Creating new \" + type + \" record, \" + timestamp);\n\t\t// Metric name = Record type\n\t\t// Metric datatype = (not used)\n\t\t// Metric value = (not used)\n\t\t// Metric properties = Record fields\n\t\treturn new MetricBuilder(type, String, null).timestamp(timestamp)\n\t\t\t\t.properties(new PropertySetBuilder(new TreeMap<>()) // TreeMap for natural ordering of the fields\n\t\t\t\t\t\t.addProperty(\"intField\", new PropertyValue(PropertyDataType.PropertySet,\n\t\t\t\t\t\t\t\tnew PropertySetBuilder(new TreeMap<>())\n\t\t\t\t\t\t\t\t\t\t.addProperty(\"fieldValue\",\n\t\t\t\t\t\t\t\t\t\t\t\tnew PropertyValue(PropertyDataType.Int32, random.nextInt()))\n\t\t\t\t\t\t\t\t\t\t.createPropertySet()))\n\t\t\t\t\t\t.addProperty(\"fltField\",\n\t\t\t\t\t\t\t\tnew PropertyValue(PropertyDataType.PropertySet,\n\t\t\t\t\t\t\t\t\t\tnew PropertySetBuilder(new TreeMap<>())\n\t\t\t\t\t\t\t\t\t\t\t\t.addProperty(\"fieldValue\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\tnew PropertyValue(PropertyDataType.Float, random.nextFloat()))\n\t\t\t\t\t\t\t\t\t\t\t\t.createPropertySet()))\n\t\t\t\t\t\t.addProperty(\"strField\",\n\t\t\t\t\t\t\t\tnew PropertyValue(PropertyDataType.PropertySet,\n\t\t\t\t\t\t\t\t\t\tnew PropertySetBuilder(new TreeMap<>())\n\t\t\t\t\t\t\t\t\t\t\t\t.addProperty(\"fieldValue\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\tnew PropertyValue(PropertyDataType.String, newUUID()))\n\t\t\t\t\t\t\t\t\t\t\t\t.createPropertySet()))\n\t\t\t\t\t\t.createPropertySet())\n\t\t\t\t.createMetric();\n\t}\n\n\tprivate void publishBirth() {\n\t\tpublishNodeBirth();\n\t\tpublishDeviceBirth();\n\t}\n\n\tprivate void publishNodeBirth() {\n\t\ttry {\n\t\t\tsynchronized (seqLock) {\n\t\t\t\t// Reset the sequence number\n\t\t\t\tseq = 
0;\n\n\t\t\t\t// Create the BIRTH payload\n\t\t\t\tSparkplugBPayload payload = new SparkplugBPayloadBuilder(getSeqNum()).setTimestamp(new Date())\n\t\t\t\t\t\t.setUuid(newUUID()).addMetric(new MetricBuilder(\"bdSeq\", Int64, (long) bdSeq).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"Node Control/Rebirth\", Boolean, false).createMetric())\n\t\t\t\t\t\t.createPayload();\n\n\t\t\t\tSystem.out.println(\"Publishing Edge Node Birth\");\n\t\t\t\texecutor.execute(new Publisher(NAMESPACE + \"/\" + groupId + \"/NBIRTH/\" + edgeNode, payload));\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\tprivate void publishDeviceBirth() {\n\t\ttry {\n\t\t\tsynchronized (seqLock) {\n\t\t\t\t// Create the payload\n\t\t\t\tSparkplugBPayload payload =\n\t\t\t\t\t\tnew SparkplugBPayloadBuilder(getSeqNum()).setTimestamp(new Date()).setUuid(newUUID())\n\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"Device Control/Rebirth\", Boolean, false).createMetric())\n\t\t\t\t\t\t\t\t.createPayload();\n\n\t\t\t\tSystem.out.println(\"Publishing Device Birth\");\n\t\t\t\texecutor.execute(\n\t\t\t\t\t\tnew Publisher(NAMESPACE + \"/\" + groupId + \"/DBIRTH/\" + edgeNode + \"/\" + deviceId, payload));\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\t/*\n\t *  Add the birth/death sequence number to a payload\n\t */\n\tprivate SparkplugBPayloadBuilder addBdSeqNum(SparkplugBPayloadBuilder payload) throws Exception {\n\t\tif (payload == null) {\n\t\t\tpayload = new SparkplugBPayloadBuilder();\n\t\t}\n\t\tif (bdSeq == 256) {\n\t\t\tbdSeq = 0;\n\t\t}\n\t\tpayload.addMetric(new MetricBuilder(\"bdSeq\", Int64, (long) bdSeq).createMetric());\n\t\tbdSeq++;\n\t\treturn payload;\n\t}\n\n\t/*\n\t *  Increments and returns the next sequence number\n\t */\n\tprivate long getSeqNum() throws Exception {\n\t\tSystem.out.println(\"seq: \" + seq);\n\t\tif (seq == 256) {\n\t\t\tseq = 0;\n\t\t}\n\t\treturn seq++;\n\t}\n\n\t@Override\n\tpublic void 
connectComplete(boolean reconnect, String serverURI) {\n\t\tSystem.out.println(\"Connected! - publishing birth\");\n\t\tpublishBirth();\n\t}\n\n\tpublic void connectionLost(Throwable cause) {\n\t\tcause.printStackTrace();\n\t\tSystem.out.println(\"The MQTT Connection was lost! - will auto-reconnect\");\n\t}\n\n\tpublic void messageArrived(String topic, MqttMessage message) throws Exception {\n\t\tSystem.out.println(\"Message Arrived on topic \" + topic);\n\n\t\tSparkplugBPayloadDecoder decoder = new SparkplugBPayloadDecoder();\n\t\tSparkplugBPayload inboundPayload = decoder.buildFromByteArray(message.getPayload(), null);\n\n\t\t// Debug\n\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\tSystem.out.println(\"Metric \" + metric.getName() + \"=\" + metric.getValue());\n\t\t}\n\n\t\tString[] splitTopic = topic.split(\"/\");\n\t\tif (splitTopic[0].equals(NAMESPACE) && splitTopic[1].equals(groupId) && splitTopic[2].equals(\"NCMD\")\n\t\t\t\t&& splitTopic[3].equals(edgeNode)) {\n\t\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\t\tif (\"Node Control/Rebirth\".equals(metric.getName()) && ((Boolean) metric.getValue())) {\n\t\t\t\t\tpublishBirth();\n\t\t\t\t} else {\n\t\t\t\t\tSystem.out.println(\"Unknown Node Command NCMD: \" + metric.getName());\n\t\t\t\t}\n\t\t\t}\n\t\t} else if (splitTopic[0].equals(NAMESPACE) && splitTopic[1].equals(groupId) && splitTopic[2].equals(\"DCMD\")\n\t\t\t\t&& splitTopic[3].equals(edgeNode)) {\n\t\t\tSystem.out.println(\"Command received for device \" + splitTopic[4]);\n\t\t}\n\t}\n\n\tpublic void deliveryComplete(IMqttDeliveryToken token) {\n\t\tSystem.out.println(\"Published message: \" + token);\n\t}\n\n\tprivate String newUUID() {\n\t\treturn java.util.UUID.randomUUID().toString();\n\t}\n\n\tprivate class Publisher implements Runnable {\n\n\t\tprivate String topic;\n\t\tprivate SparkplugBPayload outboundPayload;\n\n\t\tpublic Publisher(String topic, SparkplugBPayload outboundPayload) {\n\t\t\tthis.topic = 
topic;\n\t\t\tthis.outboundPayload = outboundPayload;\n\t\t}\n\n\t\tpublic void run() {\n\t\t\ttry {\n\t\t\t\toutboundPayload.setTimestamp(new Date());\n\t\t\t\tSparkplugBPayloadEncoder encoder = new SparkplugBPayloadEncoder();\n\t\t\t\tclient.publish(topic, encoder.getBytes(outboundPayload, false), 0, false);\n\t\t\t} catch (MqttPersistenceException e) {\n\t\t\t\te.printStackTrace();\n\t\t\t} catch (MqttException e) {\n\t\t\t\te.printStackTrace();\n\t\t\t} catch (Exception e) {\n\t\t\t\te.printStackTrace();\n\t\t\t}\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/examples/simple/THIRD-PARTY.txt",
    "content": "\nLists of 5 third-party dependencies.\n     (New BSD license) Protocol Buffer Java API (com.google.protobuf:protobuf-java:2.6.1 - https://developers.google.com/protocol-buffers/)\n     (The Apache Software License, Version 2.0) Apache Log4j (log4j:log4j:1.2.17 - http://logging.apache.org/log4j/1.2/)\n     (Eclipse Public License - Version 1.0) org.eclipse.paho.client.mqttv3 (org.eclipse.paho:org.eclipse.paho.client.mqttv3:1.0.2 - http://www.eclipse.org/paho/org.eclipse.paho.client.mqttv3)\n     (MIT License) SLF4J API Module (org.slf4j:slf4j-api:1.7.5 - http://www.slf4j.org)\n     (MIT License) SLF4J LOG4J-12 Binding (org.slf4j:slf4j-log4j12:1.7.5 - http://www.slf4j.org)\n"
  },
  {
    "path": "java/examples/simple/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu-examples</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>example_simple</artifactId>\n  <packaging>jar</packaging>\n  <name>Sparkplug B Simple Example</name>\n\n  <dependencies>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-core</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n  </dependencies>\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          <skipNexusStagingDeployMojo>true</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n\n      <!-- New Build mechanism - replaces maven-assembly-plugin -->\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-shade-plugin</artifactId>\n        <version>2.4.1</version>\n        <executions>\n          <execution>\n            <phase>package</phase>\n            <goals>\n              <goal>shade</goal>\n            
</goals>\n            <configuration>\n              <filters>\n                <filter>\n                  <artifact>*:*</artifact>\n                  <excludes>\n                    <exclude>META-INF/*.SF</exclude>\n                    <exclude>META-INF/*.DSA</exclude>\n                    <exclude>META-INF/*.RSA</exclude>\n                  </excludes>\n                </filter>\n              </filters>\n              <transformers>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ManifestResourceTransformer\">\n                  <mainClass>org.eclipse.tahu.SparkplugExample</mainClass>\n                </transformer>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ServicesResourceTransformer\" />\n              </transformers>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            <id>add-third-party</id>\n            <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n            <configuration>\n              <useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups> (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n                  Version 2.0</licenseMerge>\n              </licenseMerges>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n    
</plugins>\n  </build>\n</project>\n"
  },
  {
    "path": "java/examples/simple/src/main/java/org/eclipse/tahu/SparkplugExample.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu;\n\nimport static org.eclipse.tahu.message.model.MetricDataType.Boolean;\nimport static org.eclipse.tahu.message.model.MetricDataType.DataSet;\nimport static org.eclipse.tahu.message.model.MetricDataType.DateTime;\nimport static org.eclipse.tahu.message.model.MetricDataType.Double;\nimport static org.eclipse.tahu.message.model.MetricDataType.Float;\nimport static org.eclipse.tahu.message.model.MetricDataType.Int16;\nimport static org.eclipse.tahu.message.model.MetricDataType.Int32;\nimport static org.eclipse.tahu.message.model.MetricDataType.Int64;\nimport static org.eclipse.tahu.message.model.MetricDataType.Int8;\nimport static org.eclipse.tahu.message.model.MetricDataType.String;\nimport static org.eclipse.tahu.message.model.MetricDataType.Template;\nimport static org.eclipse.tahu.message.model.MetricDataType.Text;\nimport static org.eclipse.tahu.message.model.MetricDataType.UInt16;\nimport static org.eclipse.tahu.message.model.MetricDataType.UInt32;\nimport static org.eclipse.tahu.message.model.MetricDataType.UInt64;\nimport static org.eclipse.tahu.message.model.MetricDataType.UInt8;\nimport static org.eclipse.tahu.message.model.MetricDataType.UUID;\n\nimport java.math.BigInteger;\nimport java.util.ArrayList;\nimport java.util.Date;\nimport java.util.HashMap;\nimport java.util.List;\nimport java.util.Map;\nimport java.util.Random;\nimport java.util.concurrent.ExecutorService;\nimport 
java.util.concurrent.Executors;\nimport java.util.concurrent.ThreadLocalRandom;\n\nimport javax.net.SocketFactory;\nimport javax.net.ssl.SSLSocketFactory;\n\nimport org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;\nimport org.eclipse.paho.client.mqttv3.MqttCallbackExtended;\nimport org.eclipse.paho.client.mqttv3.MqttClient;\nimport org.eclipse.paho.client.mqttv3.MqttConnectOptions;\nimport org.eclipse.paho.client.mqttv3.MqttException;\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.paho.client.mqttv3.MqttPersistenceException;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.DataSet;\nimport org.eclipse.tahu.message.model.DataSet.DataSetBuilder;\nimport org.eclipse.tahu.message.model.DataSetDataType;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.Parameter;\nimport org.eclipse.tahu.message.model.ParameterDataType;\nimport org.eclipse.tahu.message.model.PropertyDataType;\nimport org.eclipse.tahu.message.model.PropertySet;\nimport org.eclipse.tahu.message.model.PropertySet.PropertySetBuilder;\nimport org.eclipse.tahu.message.model.PropertyValue;\nimport org.eclipse.tahu.message.model.Row.RowBuilder;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.message.model.Template;\nimport org.eclipse.tahu.message.model.Template.TemplateBuilder;\nimport org.eclipse.tahu.message.model.Value;\nimport org.eclipse.tahu.util.CompressionAlgorithm;\nimport org.eclipse.tahu.util.PayloadUtil;\n\n/**\n * An example Sparkplug B application.\n */\npublic class SparkplugExample implements MqttCallbackExtended {\n\n\t// HW/SW versions\n\tprivate static final String HW_VERSION = \"Emulated Hardware\";\n\tprivate static final String SW_VERSION = 
\"v1.0.0\";\n\n\tprivate static final String NAMESPACE = \"spBv1.0\";\n\n\t// Configuration\n\tprivate static final boolean USING_REAL_TLS = false;\n\tprivate static final boolean USING_COMPRESSION = false;\n\tprivate static final CompressionAlgorithm compressionAlgorithm = CompressionAlgorithm.GZIP;\n\tprivate String serverUrl = \"tcp://localhost:1883\";\n\tprivate String groupId = \"Sparkplug B Devices\";\n\tprivate String edgeNode = \"Java Sparkplug B Example\";\n\tprivate String deviceId = \"SparkplugBExample\";\n\tprivate String clientId = \"SparkplugBExampleEdgeNode\";\n\tprivate String username = \"admin\";\n\tprivate String password = \"changeme\";\n\tprivate long PUBLISH_PERIOD = 60000; // Publish period in milliseconds\n\tprivate ExecutorService executor;\n\tprivate MqttClient client;\n\n\tprivate int bdSeq = 0;\n\tprivate int seq = 0;\n\n\tprivate Object seqLock = new Object();\n\n\tpublic static void main(String[] args) {\n\t\tSparkplugExample example = new SparkplugExample();\n\t\texample.run();\n\t}\n\n\tpublic void run() {\n\t\ttry {\n\t\t\t// Random generator and thread pool for outgoing published messages\n\t\t\texecutor = Executors.newFixedThreadPool(1);\n\n\t\t\t// Build up DEATH payload - note DEATH payloads don't have a regular sequence number\n\t\t\tSparkplugBPayloadBuilder deathPayload = new SparkplugBPayloadBuilder().setTimestamp(new Date());\n\t\t\tdeathPayload = addBdSeqNum(deathPayload);\n\t\t\tbyte[] deathBytes;\n\t\t\tif (USING_COMPRESSION) {\n\t\t\t\t// Compress payload (optional)\n\t\t\t\tdeathBytes = new SparkplugBPayloadEncoder().getBytes(\n\t\t\t\t\t\tPayloadUtil.compress(deathPayload.createPayload(), compressionAlgorithm, false), false);\n\t\t\t} else {\n\t\t\t\tdeathBytes = new SparkplugBPayloadEncoder().getBytes(deathPayload.createPayload(), false);\n\t\t\t}\n\n\t\t\tMqttConnectOptions options = new MqttConnectOptions();\n\n\t\t\tif (USING_REAL_TLS) {\n\t\t\t\tSocketFactory sf = 
SSLSocketFactory.getDefault();\n\t\t\t\toptions.setSocketFactory(sf);\n\t\t\t}\n\n\t\t\t// Connect to the MQTT Server\n\t\t\toptions.setAutomaticReconnect(true);\n\t\t\toptions.setCleanSession(true);\n\t\t\toptions.setConnectionTimeout(30);\n\t\t\toptions.setKeepAliveInterval(30);\n\t\t\toptions.setUserName(username);\n\t\t\toptions.setPassword(password.toCharArray());\n\t\t\toptions.setWill(NAMESPACE + \"/\" + groupId + \"/NDEATH/\" + edgeNode, deathBytes, 0, false);\n\t\t\tclient = new MqttClient(serverUrl, clientId);\n\t\t\tclient.setTimeToWait(2000); // short timeout on failure to connect\n\t\t\tclient.setCallback(this);\n\t\t\tclient.connect(options);\n\n\t\t\t// Subscribe to control/command messages for both the edge of network node and the attached devices\n\t\t\tclient.subscribe(NAMESPACE + \"/\" + groupId + \"/NCMD/\" + edgeNode + \"/#\", 0);\n\t\t\tclient.subscribe(NAMESPACE + \"/\" + groupId + \"/DCMD/\" + edgeNode + \"/#\", 0);\n\t\t\tclient.subscribe(NAMESPACE + \"/#\", 0);\n\n\t\t\t// Loop forever publishing data every PUBLISH_PERIOD\n\t\t\twhile (true) {\n\t\t\t\tThread.sleep(PUBLISH_PERIOD);\n\n\t\t\t\tif (client.isConnected()) {\n\t\t\t\t\tsynchronized (seqLock) {\n\t\t\t\t\t\tSystem.out.println(\"Connected - publishing new data\");\n\t\t\t\t\t\t// Create the payload and add some metrics\n\t\t\t\t\t\tSparkplugBPayload payload =\n\t\t\t\t\t\t\t\tnew SparkplugBPayload(new Date(), newMetrics(false), getSeqNum(), newUUID(), null);\n\n\t\t\t\t\t\t// Compress payload (optional)\n\t\t\t\t\t\tif (USING_COMPRESSION) {\n\t\t\t\t\t\t\tclient.publish(NAMESPACE + \"/\" + groupId + \"/DDATA/\" + edgeNode + \"/\" + deviceId,\n\t\t\t\t\t\t\t\t\tnew SparkplugBPayloadEncoder().getBytes(\n\t\t\t\t\t\t\t\t\t\t\tPayloadUtil.compress(payload, compressionAlgorithm, false), false),\n\t\t\t\t\t\t\t\t\t0, false);\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tclient.publish(NAMESPACE + \"/\" + groupId + \"/DDATA/\" + edgeNode + \"/\" + deviceId,\n\t\t\t\t\t\t\t\t\tnew 
SparkplugBPayloadEncoder().getBytes(payload, false), 0, false);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t} else {\n\t\t\t\t\tSystem.out.println(\"Not connected - not publishing data\");\n\t\t\t\t}\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\tprivate byte[] randomBytes(int numOfBytes) {\n\t\tbyte[] bytes = new byte[numOfBytes];\n\t\tnew Random().nextBytes(bytes);\n\t\treturn bytes;\n\t}\n\n\tprivate void publishBirth() {\n\t\tpublishNodeBirth();\n\t\tpublishDeviceBirth();\n\t}\n\n\tprivate void publishNodeBirth() {\n\t\ttry {\n\t\t\tsynchronized (seqLock) {\n\t\t\t\t// Reset the sequence number\n\t\t\t\tseq = 0;\n\n\t\t\t\t// Create the BIRTH payload and set the position and other metrics\n\t\t\t\tSparkplugBPayload payload =\n\t\t\t\t\t\tnew SparkplugBPayload(new Date(), new ArrayList<Metric>(), getSeqNum(), newUUID(), null);\n\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"bdSeq\", Int64, (long) bdSeq).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Node Control/Rebirth\", Boolean, false).createMetric());\n\n\t\t\t\tPropertySet nestedPropertySet = new PropertySetBuilder()\n\t\t\t\t\t\t.addProperty(\"custom\", new PropertyValue(PropertyDataType.String, \"Custom Value\"))\n\t\t\t\t\t\t.createPropertySet();\n\n\t\t\t\tPropertySet propertySet = new PropertySetBuilder()\n\t\t\t\t\t\t.addProperty(\"engUnit\", new PropertyValue(PropertyDataType.String, \"My Units\"))\n\t\t\t\t\t\t.addProperty(\"engLow\", new PropertyValue(PropertyDataType.Double, 1.0))\n\t\t\t\t\t\t.addProperty(\"engHigh\", new PropertyValue(PropertyDataType.Double, 10.0))\n\t\t\t\t\t\t.addProperty(\"Custom nested node prop\",\n\t\t\t\t\t\t\t\tnew PropertyValue(PropertyDataType.PropertySet, nestedPropertySet))\n\t\t\t\t\t\t/*\n\t\t\t\t\t\t * .addProperty(\"CustA\", new PropertyValue(PropertyDataType.String, \"Custom A\"))\n\t\t\t\t\t\t * .addProperty(\"CustB\", new PropertyValue(PropertyDataType.Double, 10.0)) .addProperty(\"CustC\",\n\t\t\t\t\t\t * new 
PropertyValue(PropertyDataType.Int32, 100))\n\t\t\t\t\t\t */\n\t\t\t\t\t\t.createPropertySet();\n\t\t\t\tpayload.addMetric(\n\t\t\t\t\t\tnew MetricBuilder(\"MyMetric\", String, \"My Value\").properties(propertySet).createMetric());\n\n\t\t\t\tSystem.out.println(\"Publishing Edge Node Birth\");\n\t\t\t\texecutor.execute(new Publisher(NAMESPACE + \"/\" + groupId + \"/NBIRTH/\" + edgeNode, payload));\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\tprivate void publishDeviceBirth() {\n\t\ttry {\n\t\t\tsynchronized (seqLock) {\n\t\t\t\t// Create the payload and add some metrics\n\t\t\t\tSparkplugBPayload payload =\n\t\t\t\t\t\tnew SparkplugBPayload(new Date(), newMetrics(true), getSeqNum(), newUUID(), null);\n\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Device Control/Rebirth\", Boolean, false).createMetric());\n\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Int8_Min\", Int8, (byte) -128).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Int8_Max\", Int8, (byte) 127).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Int16_Min\", Int16, (short) -32768).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Int16_Max\", Int16, (short) 32767).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Int32_Min\", Int32, -2147483648).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Int32_Max\", Int32, 2147483647).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Int64_Min\", Int64, -9223372036854775808L).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Int64_Max\", Int64, 9223372036854775807L).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"UInt8_Min\", UInt8, (short) 0).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"UInt8_Max\", UInt8, (short) 255).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"UInt16_Min\", UInt16, 0).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"UInt16_Max\", 
UInt16, 65535).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"UInt32_Min\", UInt32, 0L).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"UInt32_Max\", UInt32, 4294967295L).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"UInt64_Min\", UInt64, BigInteger.valueOf(0L)).createMetric());\n\t\t\t\tpayload.addMetric(\n\t\t\t\t\t\tnew MetricBuilder(\"UInt64_Max\", UInt64, new BigInteger(\"18446744073709551615\")).createMetric());\n\n\t\t\t\t// Add some properties\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Properties/hw_version\", String, HW_VERSION).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Properties/sw_version\", String, SW_VERSION).createMetric());\n\n\t\t\t\tPropertySet nestedPropertySet = new PropertySetBuilder()\n\t\t\t\t\t\t.addProperty(\"custom\", new PropertyValue(PropertyDataType.String, \"Custom Value\"))\n\t\t\t\t\t\t.createPropertySet();\n\n\t\t\t\tPropertySet propertySet = new PropertySetBuilder()\n\t\t\t\t\t\t.addProperty(\"engUnit\", new PropertyValue(PropertyDataType.String, \"My Units\"))\n\t\t\t\t\t\t.addProperty(\"engLow\", new PropertyValue(PropertyDataType.Double, 1.0))\n\t\t\t\t\t\t.addProperty(\"engHigh\", new PropertyValue(PropertyDataType.Double, 10.0))\n\t\t\t\t\t\t.addProperty(\"Custom nested device prop\",\n\t\t\t\t\t\t\t\tnew PropertyValue(PropertyDataType.PropertySet, nestedPropertySet))\n\t\t\t\t\t\t/*\n\t\t\t\t\t\t * .addProperty(\"CustA\", new PropertyValue(PropertyDataType.String, \"Custom A\"))\n\t\t\t\t\t\t * .addProperty(\"CustB\", new PropertyValue(PropertyDataType.Double, 10.0)) .addProperty(\"CustC\",\n\t\t\t\t\t\t * new PropertyValue(PropertyDataType.Int32, 100))\n\t\t\t\t\t\t */\n\t\t\t\t\t\t.createPropertySet();\n\t\t\t\tpayload.addMetric(\n\t\t\t\t\t\tnew MetricBuilder(\"MyMetric\", String, \"My Value\").properties(propertySet).createMetric());\n\n\t\t\t\tSystem.out.println(\"Publishing Device Birth\");\n\t\t\t\texecutor.execute(\n\t\t\t\t\t\tnew 
Publisher(NAMESPACE + \"/\" + groupId + \"/DBIRTH/\" + edgeNode + \"/\" + deviceId, payload));\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\t// Used to add the birth/death sequence number\n\tprivate SparkplugBPayloadBuilder addBdSeqNum(SparkplugBPayloadBuilder payload) throws Exception {\n\t\tif (payload == null) {\n\t\t\tpayload = new SparkplugBPayloadBuilder();\n\t\t}\n\t\tif (bdSeq == 256) {\n\t\t\tbdSeq = 0;\n\t\t}\n\t\tpayload.addMetric(new MetricBuilder(\"bdSeq\", Int64, (long) bdSeq).createMetric());\n\t\tbdSeq++;\n\t\treturn payload;\n\t}\n\n\t// Used to add the sequence number\n\tprivate long getSeqNum() throws Exception {\n\t\tSystem.out.println(\"seq: \" + seq);\n\t\tif (seq == 256) {\n\t\t\tseq = 0;\n\t\t}\n\t\treturn seq++;\n\t}\n\n\t@Override\n\tpublic void connectComplete(boolean reconnect, String serverURI) {\n\t\tSystem.out.println(\"Connected! - publishing birth\");\n\t\tpublishBirth();\n\t}\n\n\tpublic void connectionLost(Throwable cause) {\n\t\tcause.printStackTrace();\n\t\tSystem.out.println(\"The MQTT Connection was lost! 
- will auto-reconnect\");\n\t}\n\n\tpublic void messageArrived(String topic, MqttMessage message) throws Exception {\n\t\tSystem.out.println(\"Message Arrived on topic \" + topic);\n\n\t\tSparkplugBPayloadDecoder decoder = new SparkplugBPayloadDecoder();\n\t\tSparkplugBPayload inboundPayload = decoder.buildFromByteArray(message.getPayload(), null);\n\n\t\t// Debug\n\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\tSystem.out.println(\"Metric \" + metric.getName() + \"=\" + metric.getValue());\n\t\t}\n\n\t\tString[] splitTopic = topic.split(\"/\");\n\t\tif (splitTopic[0].equals(NAMESPACE) && splitTopic[1].equals(groupId) && splitTopic[2].equals(\"NCMD\")\n\t\t\t\t&& splitTopic[3].equals(edgeNode)) {\n\t\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\t\tif (\"Node Control/Rebirth\".equals(metric.getName()) && ((Boolean) metric.getValue())) {\n\t\t\t\t\tpublishBirth();\n\t\t\t\t} else {\n\t\t\t\t\tSystem.out.println(\"Unknown Node Command NCMD: \" + metric.getName());\n\t\t\t\t}\n\t\t\t}\n\t\t} else if (splitTopic[0].equals(NAMESPACE) && splitTopic[1].equals(groupId) && splitTopic[2].equals(\"DCMD\")\n\t\t\t\t&& splitTopic[3].equals(edgeNode)) {\n\t\t\tSystem.out.println(\"Command received for device \" + splitTopic[4]);\n\n\t\t\t// Process the incoming payload and publish any updated/modified metrics\n\t\t\t// Simulate the following:\n\t\t\t// Outputs/0 is tied to Inputs/0\n\t\t\t// Outputs/1 is tied to Inputs/1\n\t\t\t// Outputs/2 is tied to Inputs/2\n\t\t\tSparkplugBPayload outboundPayload =\n\t\t\t\t\tnew SparkplugBPayload(new Date(), new ArrayList<Metric>(), getSeqNum(), newUUID(), null);\n\t\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\t\tString name = metric.getName();\n\t\t\t\tObject value = metric.getValue();\n\t\t\t\tif (\"Device Control/Rebirth\".equals(metric.getName()) && ((Boolean) metric.getValue())) {\n\t\t\t\t\tpublishDeviceBirth();\n\t\t\t\t} else if (\"Outputs/0\".equals(name)) 
{\n\t\t\t\t\tSystem.out.println(\"Outputs/0: \" + value);\n\t\t\t\t\toutboundPayload.addMetric(new MetricBuilder(\"Inputs/0\", Boolean, value).createMetric());\n\t\t\t\t\toutboundPayload.addMetric(new MetricBuilder(\"Outputs/0\", Boolean, value).createMetric());\n\t\t\t\t\tSystem.out.println(\"Publishing updated value for Inputs/0 \" + value);\n\t\t\t\t} else if (\"Outputs/1\".equals(name)) {\n\t\t\t\t\tSystem.out.println(\"Outputs/1: \" + value);\n\t\t\t\t\toutboundPayload.addMetric(new MetricBuilder(\"Inputs/1\", Int32, value).createMetric());\n\t\t\t\t\toutboundPayload.addMetric(new MetricBuilder(\"Outputs/1\", Int32, value).createMetric());\n\t\t\t\t\tSystem.out.println(\"Publishing updated value for Inputs/1 \" + value);\n\t\t\t\t} else if (\"Outputs/2\".equals(name)) {\n\t\t\t\t\tSystem.out.println(\"Outputs/2: \" + value);\n\t\t\t\t\toutboundPayload.addMetric(new MetricBuilder(\"Inputs/2\", Double, value).createMetric());\n\t\t\t\t\toutboundPayload.addMetric(new MetricBuilder(\"Outputs/2\", Double, value).createMetric());\n\t\t\t\t\tSystem.out.println(\"Publishing updated value for Inputs/2 \" + value);\n\t\t\t\t}\n\t\t\t}\n\n\t\t\t// Publish the message in a new thread\n\t\t\texecutor.execute(\n\t\t\t\t\tnew Publisher(NAMESPACE + \"/\" + groupId + \"/DDATA/\" + edgeNode + \"/\" + deviceId, outboundPayload));\n\t\t}\n\t}\n\n\tpublic void deliveryComplete(IMqttDeliveryToken token) {\n\t\tSystem.out.println(\"Published message: \" + token);\n\t}\n\n\tprivate String newUUID() {\n\t\treturn java.util.UUID.randomUUID().toString();\n\t}\n\n\tprivate List<Metric> newMetrics(boolean isBirth) throws SparkplugException {\n\t\tRandom random = new Random();\n\t\tList<Metric> metrics = new ArrayList<Metric>();\n\t\tmetrics.add(new MetricBuilder(\"Int8\", Int8, (byte) random.nextInt()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"Int16\", Int16, (short) random.nextInt()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"Int32\", Int32, 
random.nextInt()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"Int64\", Int64, random.nextLong()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"UInt8\", UInt8, getRandomUInt8()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"UInt16\", UInt16, getRandomUInt16()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"UInt32\", UInt32, getRandomUInt32()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"UInt64\", UInt64, getRandomUInt64()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"Float\", Float, random.nextFloat()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"Double\", Double, random.nextDouble()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"Boolean\", Boolean, random.nextBoolean()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"String\", String, newUUID()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"DateTime\", DateTime, new Date()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"Text\", Text, newUUID()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"UUID\", UUID, newUUID()).createMetric());\n\t\t// metrics.add(new MetricBuilder(\"Bytes\", Bytes, randomBytes(20)).createMetric());\n\t\t// metrics.add(new MetricBuilder(\"File\", File, null).createMetric());\n\n\t\t// DataSet\n\t\tmetrics.add(new MetricBuilder(\"DataSet\", DataSet, newDataSet()).createMetric());\n\t\tif (isBirth) {\n\t\t\tmetrics.add(new MetricBuilder(\"TemplateDef\", Template, newTemplate(true, null)).createMetric());\n\t\t}\n\n\t\t// Template\n\t\tmetrics.add(new MetricBuilder(\"TemplateInst\", Template, newTemplate(false, \"TemplateDef\")).createMetric());\n\n\t\t// Complex Template\n\t\tmetrics.addAll(newComplexTemplate(isBirth));\n\n\t\t// Metrics with properties\n\t\tmetrics.add(new MetricBuilder(\"IntWithProps\", Int32, random.nextInt()).properties(\n\t\t\t\tnew PropertySetBuilder().addProperty(\"engUnit\", new PropertyValue(PropertyDataType.String, \"My Units\"))\n\t\t\t\t\t\t.addProperty(\"engHigh\", new 
PropertyValue(PropertyDataType.Int32, Integer.MAX_VALUE))\n\t\t\t\t\t\t.addProperty(\"engLow\", new PropertyValue(PropertyDataType.Int32, Integer.MIN_VALUE))\n\t\t\t\t\t\t.createPropertySet())\n\t\t\t\t.createMetric());\n\n\t\t// Aliased metric\n\t\t// The name and alias will be specified in a NBIRTH/DBIRTH message.\n\t\t// Only the alias will be specified in a NDATA/DDATA message.\n\t\tLong alias = 1111L;\n\t\tif (isBirth) {\n\t\t\tmetrics.add(new MetricBuilder(\"AliasedString\", String, newUUID()).alias(alias).createMetric());\n\t\t} else {\n\t\t\tmetrics.add(new MetricBuilder(alias, String, newUUID()).createMetric());\n\t\t}\n\n\t\treturn metrics;\n\t}\n\n\tprivate PropertySet newPropertySet() throws SparkplugException {\n\t\treturn new PropertySetBuilder().addProperties(newProps(true)).createPropertySet();\n\t}\n\n\tprivate Map<String, PropertyValue> newProps(boolean withPropTypes) throws SparkplugException {\n\t\tRandom random = new Random();\n\t\tMap<String, PropertyValue> propMap = new HashMap<String, PropertyValue>();\n\t\tpropMap.put(\"PropInt8\", new PropertyValue(PropertyDataType.Int8, (byte) random.nextInt()));\n\t\tpropMap.put(\"PropInt16\", new PropertyValue(PropertyDataType.Int16, (short) random.nextInt()));\n\t\tpropMap.put(\"PropInt32\", new PropertyValue(PropertyDataType.Int32, random.nextInt()));\n\t\tpropMap.put(\"PropInt64\", new PropertyValue(PropertyDataType.Int64, random.nextLong()));\n\t\tpropMap.put(\"PropUInt8\", new PropertyValue(PropertyDataType.UInt8, getRandomUInt8()));\n\t\tpropMap.put(\"PropUInt16\", new PropertyValue(PropertyDataType.UInt16, getRandomUInt16()));\n\t\tpropMap.put(\"PropUInt32\", new PropertyValue(PropertyDataType.UInt32, getRandomUInt32()));\n\t\tpropMap.put(\"PropUInt64\", new PropertyValue(PropertyDataType.UInt64, getRandomUInt64()));\n\t\tpropMap.put(\"PropFloat\", new PropertyValue(PropertyDataType.Float, random.nextFloat()));\n\t\tpropMap.put(\"PropDouble\", new PropertyValue(PropertyDataType.Double, 
random.nextDouble()));\n\t\tpropMap.put(\"PropBoolean\", new PropertyValue(PropertyDataType.Boolean, random.nextBoolean()));\n\t\tpropMap.put(\"PropString\", new PropertyValue(PropertyDataType.String, newUUID()));\n\t\tpropMap.put(\"PropDateTime\", new PropertyValue(PropertyDataType.DateTime, new Date()));\n\t\tpropMap.put(\"PropText\", new PropertyValue(PropertyDataType.Text, newUUID()));\n\t\tif (withPropTypes) {\n\t\t\tpropMap.put(\"PropPropertySet\", new PropertyValue(PropertyDataType.PropertySet,\n\t\t\t\t\tnew PropertySetBuilder().addProperties(newProps(false)).createPropertySet()));\n\t\t\tList<PropertySet> propsList = new ArrayList<PropertySet>();\n\t\t\tpropsList.add(new PropertySetBuilder().addProperties(newProps(false)).createPropertySet());\n\t\t\tpropsList.add(new PropertySetBuilder().addProperties(newProps(false)).createPropertySet());\n\t\t\tpropsList.add(new PropertySetBuilder().addProperties(newProps(false)).createPropertySet());\n\t\t\tpropMap.put(\"PropPropertySetList\", new PropertyValue(PropertyDataType.PropertySetList, propsList));\n\t\t}\n\t\treturn propMap;\n\t}\n\n\tprivate List<Parameter> newParams() throws SparkplugException {\n\t\tRandom random = new Random();\n\t\tList<Parameter> params = new ArrayList<Parameter>();\n\t\tparams.add(new Parameter(\"ParamInt32\", ParameterDataType.Int32, random.nextInt()));\n\t\tparams.add(new Parameter(\"ParamFloat\", ParameterDataType.Float, random.nextFloat()));\n\t\tparams.add(new Parameter(\"ParamDouble\", ParameterDataType.Double, random.nextDouble()));\n\t\tparams.add(new Parameter(\"ParamBoolean\", ParameterDataType.Boolean, random.nextBoolean()));\n\t\tparams.add(new Parameter(\"ParamString\", ParameterDataType.String, newUUID()));\n\t\treturn params;\n\t}\n\n\tprivate List<Metric> newComplexTemplate(boolean withTemplateDefs) throws SparkplugInvalidTypeException {\n\t\tArrayList<Metric> metrics = new ArrayList<Metric>();\n\t\tif (withTemplateDefs) {\n\n\t\t\t// Add a new template \"subType\" 
definition with two primitive members\n\t\t\tmetrics.add(new MetricBuilder(\"subType\", Template,\n\t\t\t\t\tnew TemplateBuilder().definition(true)\n\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"StringMember\", String, \"value\").createMetric())\n\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"IntegerMember\", Int32, 0).createMetric()).createTemplate())\n\t\t\t\t\t\t\t\t\t.createMetric());\n\t\t\t// Add new template \"newType\" definition that contains an instance of \"subType\" as a member\n\t\t\tmetrics.add(new MetricBuilder(\"newType\", Template,\n\t\t\t\t\tnew TemplateBuilder().definition(true).addMetric(new MetricBuilder(\"mySubType\", Template,\n\t\t\t\t\t\t\tnew TemplateBuilder().definition(false).templateRef(\"subType\")\n\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"StringMember\", String, \"value\").createMetric())\n\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"IntegerMember\", Int32, 0).createMetric())\n\t\t\t\t\t\t\t\t\t.createTemplate()).createMetric())\n\t\t\t\t\t\t\t.createTemplate()).createMetric());\n\t\t}\n\n\t\t// Add an instance of \"newType\"\n\t\tmetrics.add(new MetricBuilder(\"myNewType\", Template,\n\t\t\t\tnew TemplateBuilder().definition(false).templateRef(\"newType\")\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"mySubType\", Template,\n\t\t\t\t\t\t\t\tnew TemplateBuilder().definition(false).templateRef(\"subType\")\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"StringMember\", String, \"myValue\").createMetric())\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"IntegerMember\", Int32, 1).createMetric())\n\t\t\t\t\t\t\t\t\t\t.createTemplate()).createMetric())\n\t\t\t\t\t\t.createTemplate()).createMetric());\n\n\t\treturn metrics;\n\n\t}\n\n\tprivate Template newTemplate(boolean isDef, String templatRef) throws SparkplugException {\n\t\tRandom random = new Random();\n\t\tList<Metric> metrics = new ArrayList<Metric>();\n\t\tmetrics.add(new MetricBuilder(\"MyInt8\", Int8, (byte) random.nextInt()).createMetric());\n\t\tmetrics.add(new 
MetricBuilder(\"MyInt16\", Int16, (short) random.nextInt()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyInt32\", Int32, random.nextInt()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyInt64\", Int64, random.nextLong()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUInt8\", UInt8, getRandomUInt8()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUInt16\", UInt16, getRandomUInt16()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUInt32\", UInt32, getRandomUInt32()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUInt64\", UInt64, getRandomUInt64()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyFloat\", Float, random.nextFloat()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyDouble\", Double, random.nextDouble()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyBoolean\", Boolean, random.nextBoolean()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyString\", String, newUUID()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyDateTime\", DateTime, new Date()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyText\", Text, newUUID()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUUID\", UUID, newUUID()).createMetric());\n\n\t\treturn new TemplateBuilder().version(\"v1.0\").templateRef(templatRef).definition(isDef)\n\t\t\t\t.addParameters(newParams()).addMetrics(metrics).createTemplate();\n\t}\n\n\tprivate DataSet newDataSet() throws SparkplugException {\n\t\tRandom random = new Random();\n\t\treturn new 
DataSetBuilder(14).addColumnName(\"Int8s\").addColumnName(\"Int16s\").addColumnName(\"Int32s\")\n\t\t\t\t.addColumnName(\"Int64s\").addColumnName(\"UInt8s\").addColumnName(\"UInt16s\").addColumnName(\"UInt32s\")\n\t\t\t\t.addColumnName(\"UInt64s\").addColumnName(\"Floats\").addColumnName(\"Doubles\").addColumnName(\"Booleans\")\n\t\t\t\t.addColumnName(\"Strings\").addColumnName(\"Dates\").addColumnName(\"Texts\").addType(DataSetDataType.Int8)\n\t\t\t\t.addType(DataSetDataType.Int16).addType(DataSetDataType.Int32).addType(DataSetDataType.Int64)\n\t\t\t\t.addType(DataSetDataType.UInt8).addType(DataSetDataType.UInt16).addType(DataSetDataType.UInt32)\n\t\t\t\t.addType(DataSetDataType.UInt64).addType(DataSetDataType.Float).addType(DataSetDataType.Double)\n\t\t\t\t.addType(DataSetDataType.Boolean).addType(DataSetDataType.String).addType(DataSetDataType.DateTime)\n\t\t\t\t.addType(DataSetDataType.Text)\n\t\t\t\t.addRow(new RowBuilder().addValue(new Value<Byte>(DataSetDataType.Int8, (byte) random.nextInt()))\n\t\t\t\t\t\t.addValue(new Value<Short>(DataSetDataType.Int16, (short) random.nextInt()))\n\t\t\t\t\t\t.addValue(new Value<Integer>(DataSetDataType.Int32, random.nextInt()))\n\t\t\t\t\t\t.addValue(new Value<Long>(DataSetDataType.Int64, random.nextLong()))\n\t\t\t\t\t\t.addValue(new Value<Short>(DataSetDataType.UInt8, getRandomUInt8()))\n\t\t\t\t\t\t.addValue(new Value<Integer>(DataSetDataType.UInt16, getRandomUInt16()))\n\t\t\t\t\t\t.addValue(new Value<Long>(DataSetDataType.UInt32, getRandomUInt32()))\n\t\t\t\t\t\t.addValue(new Value<BigInteger>(DataSetDataType.UInt64, getRandomUInt64()))\n\t\t\t\t\t\t.addValue(new Value<Float>(DataSetDataType.Float, random.nextFloat()))\n\t\t\t\t\t\t.addValue(new Value<Double>(DataSetDataType.Double, random.nextDouble()))\n\t\t\t\t\t\t.addValue(new Value<Boolean>(DataSetDataType.Boolean, random.nextBoolean()))\n\t\t\t\t\t\t.addValue(new Value<String>(DataSetDataType.String, newUUID()))\n\t\t\t\t\t\t.addValue(new 
Value<Date>(DataSetDataType.DateTime, new Date()))\n\t\t\t\t\t\t.addValue(new Value<String>(DataSetDataType.Text, newUUID())).createRow())\n\t\t\t\t.addRow(new RowBuilder().addValue(new Value<Byte>(DataSetDataType.Int8, (byte) random.nextInt()))\n\t\t\t\t\t\t.addValue(new Value<Short>(DataSetDataType.Int16, (short) random.nextInt()))\n\t\t\t\t\t\t.addValue(new Value<Integer>(DataSetDataType.Int32, random.nextInt()))\n\t\t\t\t\t\t.addValue(new Value<Long>(DataSetDataType.Int64, random.nextLong()))\n\t\t\t\t\t\t.addValue(new Value<Short>(DataSetDataType.UInt8, getRandomUInt8()))\n\t\t\t\t\t\t.addValue(new Value<Integer>(DataSetDataType.UInt16, getRandomUInt16()))\n\t\t\t\t\t\t.addValue(new Value<Long>(DataSetDataType.UInt32, getRandomUInt32()))\n\t\t\t\t\t\t.addValue(new Value<BigInteger>(DataSetDataType.UInt64, getRandomUInt64()))\n\t\t\t\t\t\t.addValue(new Value<Float>(DataSetDataType.Float, random.nextFloat()))\n\t\t\t\t\t\t.addValue(new Value<Double>(DataSetDataType.Double, random.nextDouble()))\n\t\t\t\t\t\t.addValue(new Value<Boolean>(DataSetDataType.Boolean, random.nextBoolean()))\n\t\t\t\t\t\t.addValue(new Value<String>(DataSetDataType.String, newUUID()))\n\t\t\t\t\t\t.addValue(new Value<Date>(DataSetDataType.DateTime, new Date()))\n\t\t\t\t\t\t.addValue(new Value<String>(DataSetDataType.Text, newUUID())).createRow())\n\t\t\t\t.createDataSet();\n\t}\n\n\tprivate short getRandomUInt8() {\n\t\tRandom random = new Random();\n\t\treturn (short) random.nextInt(256);\n\t}\n\n\tprivate int getRandomUInt16() {\n\t\tRandom random = new Random();\n\t\treturn random.nextInt(65536);\n\n\t}\n\n\tprivate long getRandomUInt32() {\n\t\treturn ThreadLocalRandom.current().nextLong(4294967296L);\n\t}\n\n\tprivate BigInteger getRandomUInt64() {\n\t\tRandom random = new Random();\n\t\tBigInteger minSize = new BigInteger(\"0\");\n\t\tBigInteger maxSize = new BigInteger(\"18446744073709551616\");\n\t\tBigInteger randomResult = new BigInteger(64, random);\n\t\twhile 
(randomResult.compareTo(minSize) < 0 || randomResult.compareTo(maxSize) >= 0) {\n\t\t\trandomResult = new BigInteger(64, random);\n\t\t\tSystem.out.println(\"New randomResult: \" + randomResult);\n\t\t}\n\t\treturn randomResult;\n\t}\n\n\tprivate class Publisher implements Runnable {\n\n\t\tprivate String topic;\n\t\tprivate SparkplugBPayload outboundPayload;\n\n\t\tpublic Publisher(String topic, SparkplugBPayload outboundPayload) {\n\t\t\tthis.topic = topic;\n\t\t\tthis.outboundPayload = outboundPayload;\n\t\t}\n\n\t\tpublic void run() {\n\t\t\ttry {\n\t\t\t\toutboundPayload.setTimestamp(new Date());\n\t\t\t\tSparkplugBPayloadEncoder encoder = new SparkplugBPayloadEncoder();\n\n\t\t\t\t// Compress payload (optional)\n\t\t\t\tif (USING_COMPRESSION) {\n\t\t\t\t\tclient.publish(topic,\n\t\t\t\t\t\t\tencoder.getBytes(PayloadUtil.compress(outboundPayload, compressionAlgorithm, false), false),\n\t\t\t\t\t\t\t0, false);\n\t\t\t\t} else {\n\t\t\t\t\tclient.publish(topic, encoder.getBytes(outboundPayload, false), 0, false);\n\t\t\t\t}\n\t\t\t} catch (MqttPersistenceException e) {\n\t\t\t\te.printStackTrace();\n\t\t\t} catch (MqttException e) {\n\t\t\t\te.printStackTrace();\n\t\t\t} catch (Exception e) {\n\t\t\t\te.printStackTrace();\n\t\t\t}\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/examples/udt/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu-examples</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>sparkplug_b_udt_example</artifactId>\n  <packaging>jar</packaging>\n  <name>Sparkplug B UDT Example</name>\n\n  <dependencies>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-core</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n  </dependencies>\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          <skipNexusStagingDeployMojo>true</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n\n      <!-- New Build mechanism - replaces maven-assembly-plugin -->\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-shade-plugin</artifactId>\n        <version>2.4.1</version>\n        <executions>\n          <execution>\n            <phase>package</phase>\n            <goals>\n              <goal>shade</goal>\n            
</goals>\n            <configuration>\n              <filters>\n                <filter>\n                  <artifact>*:*</artifact>\n                  <excludes>\n                    <exclude>META-INF/*.SF</exclude>\n                    <exclude>META-INF/*.DSA</exclude>\n                    <exclude>META-INF/*.RSA</exclude>\n                  </excludes>\n                </filter>\n              </filters>\n              <transformers>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ManifestResourceTransformer\">\n                  <mainClass>org.eclipse.tahu.SparkplugExample</mainClass>\n                </transformer>\n                <transformer\n                  implementation=\"org.apache.maven.plugins.shade.resource.ServicesResourceTransformer\" />\n              </transformers>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            <id>add-third-party</id>\n            <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n            <configuration>\n              <useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups> (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n                  Version 2.0</licenseMerge>\n              </licenseMerges>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n    
</plugins>\n  </build>\n</project>\n"
  },
  {
    "path": "java/examples/udt/src/main/java/org/eclipse/tahu/SparkplugExample.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu;\n\nimport static org.eclipse.tahu.message.model.MetricDataType.Boolean;\nimport static org.eclipse.tahu.message.model.MetricDataType.DateTime;\nimport static org.eclipse.tahu.message.model.MetricDataType.Double;\nimport static org.eclipse.tahu.message.model.MetricDataType.Float;\nimport static org.eclipse.tahu.message.model.MetricDataType.Int16;\nimport static org.eclipse.tahu.message.model.MetricDataType.Int32;\nimport static org.eclipse.tahu.message.model.MetricDataType.Int64;\nimport static org.eclipse.tahu.message.model.MetricDataType.Int8;\nimport static org.eclipse.tahu.message.model.MetricDataType.String;\nimport static org.eclipse.tahu.message.model.MetricDataType.Template;\nimport static org.eclipse.tahu.message.model.MetricDataType.Text;\nimport static org.eclipse.tahu.message.model.MetricDataType.UInt16;\nimport static org.eclipse.tahu.message.model.MetricDataType.UInt32;\nimport static org.eclipse.tahu.message.model.MetricDataType.UInt64;\nimport static org.eclipse.tahu.message.model.MetricDataType.UInt8;\nimport static org.eclipse.tahu.message.model.MetricDataType.UUID;\n\nimport java.math.BigInteger;\nimport java.util.ArrayList;\nimport java.util.Date;\nimport java.util.List;\nimport java.util.Random;\nimport java.util.concurrent.ExecutorService;\nimport java.util.concurrent.Executors;\n\nimport javax.net.SocketFactory;\nimport javax.net.ssl.SSLSocketFactory;\n\nimport 
org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;\nimport org.eclipse.paho.client.mqttv3.MqttCallbackExtended;\nimport org.eclipse.paho.client.mqttv3.MqttClient;\nimport org.eclipse.paho.client.mqttv3.MqttConnectOptions;\nimport org.eclipse.paho.client.mqttv3.MqttException;\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.paho.client.mqttv3.MqttPersistenceException;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.Parameter;\nimport org.eclipse.tahu.message.model.ParameterDataType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.message.model.Template;\nimport org.eclipse.tahu.message.model.Template.TemplateBuilder;\n\n/**\n * An example Sparkplug B application.\n */\npublic class SparkplugExample implements MqttCallbackExtended {\n\n\tprivate static final String NAMESPACE = \"spBv1.0\";\n\n\t// Configuration\n\tprivate static final boolean USING_REAL_TLS = false;\n\tprivate String serverUrl = \"tcp://localhost:1883\";\n\tprivate String groupId = \"Sparkplug B Devices\";\n\tprivate String edgeNode = \"Java Sparkplug B UDT Example\";\n\tprivate String deviceId = \"SparkplugBExample\";\n\tprivate String clientId = \"SparkplugBExampleEdgeNode\";\n\tprivate String username = \"admin\";\n\tprivate String password = \"changeme\";\n\tprivate long PUBLISH_PERIOD = 60000; // Publish period in milliseconds\n\tprivate ExecutorService executor;\n\tprivate MqttClient client;\n\n\tprivate int bdSeq = 0;\n\tprivate int seq = 0;\n\n\tprivate Object seqLock = new Object();\n\n\tpublic static void main(String[] args) {\n\t\tSparkplugExample example = new SparkplugExample();\n\t\texample.run();\n\t}\n\n\tpublic void run() {\n\t\ttry 
{\n\t\t\t// Random generator and thread pool for outgoing published messages\n\t\t\texecutor = Executors.newFixedThreadPool(1);\n\n\t\t\t// Build up DEATH payload - note DEATH payloads don't have a regular sequence number\n\t\t\tSparkplugBPayloadBuilder deathPayload = new SparkplugBPayloadBuilder().setTimestamp(new Date());\n\t\t\tdeathPayload = addBdSeqNum(deathPayload);\n\t\t\tbyte[] deathBytes = new SparkplugBPayloadEncoder().getBytes(deathPayload.createPayload(), false);\n\n\t\t\tMqttConnectOptions options = new MqttConnectOptions();\n\n\t\t\tif (USING_REAL_TLS) {\n\t\t\t\tSocketFactory sf = SSLSocketFactory.getDefault();\n\t\t\t\toptions.setSocketFactory(sf);\n\t\t\t}\n\n\t\t\t// Connect to the MQTT Server\n\t\t\toptions.setAutomaticReconnect(true);\n\t\t\toptions.setCleanSession(true);\n\t\t\toptions.setConnectionTimeout(30);\n\t\t\toptions.setKeepAliveInterval(30);\n\t\t\toptions.setUserName(username);\n\t\t\toptions.setPassword(password.toCharArray());\n\t\t\toptions.setWill(NAMESPACE + \"/\" + groupId + \"/NDEATH/\" + edgeNode, deathBytes, 0, false);\n\t\t\tclient = new MqttClient(serverUrl, clientId);\n\t\t\tclient.setTimeToWait(2000);\n\t\t\tclient.setCallback(this); // short timeout on failure to connect\n\t\t\tclient.connect(options);\n\n\t\t\t// Subscribe to control/command messages for both the edge of network node and the attached devices\n\t\t\tclient.subscribe(NAMESPACE + \"/\" + groupId + \"/NCMD/\" + edgeNode + \"/#\", 0);\n\t\t\tclient.subscribe(NAMESPACE + \"/\" + groupId + \"/DCMD/\" + edgeNode + \"/#\", 0);\n\n\t\t\t// Loop forever publishing data every PUBLISH_PERIOD\n\t\t\twhile (true) {\n\t\t\t\tThread.sleep(PUBLISH_PERIOD);\n\n\t\t\t\tif (client.isConnected()) {\n\t\t\t\t\tsynchronized (seqLock) {\n\t\t\t\t\t\tSystem.out.println(\"Connected - publishing new data\");\n\t\t\t\t\t\t// Create the payload and add some metrics\n\t\t\t\t\t\tSparkplugBPayload payload = new SparkplugBPayload(new Date(), 
newComplexTemplateInstance(),\n\t\t\t\t\t\t\t\tgetSeqNum(), newUUID(), null);\n\n\t\t\t\t\t\tclient.publish(NAMESPACE + \"/\" + groupId + \"/DDATA/\" + edgeNode + \"/\" + deviceId,\n\t\t\t\t\t\t\t\tnew SparkplugBPayloadEncoder().getBytes(payload, false), 0, false);\n\t\t\t\t\t}\n\t\t\t\t} else {\n\t\t\t\t\tSystem.out.println(\"Not connected - not publishing data\");\n\t\t\t\t}\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\tprivate void publishBirth() {\n\t\ttry {\n\t\t\tsynchronized (seqLock) {\n\t\t\t\t// Reset the sequence number\n\t\t\t\tseq = 0;\n\n\t\t\t\t// Create the BIRTH payload and set the position and other metrics\n\t\t\t\tSparkplugBPayload payload =\n\t\t\t\t\t\tnew SparkplugBPayload(new Date(), new ArrayList<Metric>(), getSeqNum(), newUUID(), null);\n\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"bdSeq\", Int64, (long) bdSeq).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"Node Control/Rebirth\", Boolean, false).createMetric());\n\n\t\t\t\t// Add a node level template definition and instance\n\t\t\t\tpayload.addMetric(\n\t\t\t\t\t\tnew MetricBuilder(\"simpleType\", Template, newSimpleTemplate(true, null)).createMetric());\n\t\t\t\tpayload.addMetric(new MetricBuilder(\"mySimpleType\", Template, newSimpleTemplate(false, \"simpleType\"))\n\t\t\t\t\t\t.createMetric());\n\n\t\t\t\t// Add the complex template definition - All UDT definitions must be published in the NBIRTH\n\t\t\t\tpayload.addMetrics(newComplexTemplateDefs());\n\n\t\t\t\tSystem.out.println(\"Publishing Edge Node Birth\");\n\t\t\t\texecutor.execute(new Publisher(NAMESPACE + \"/\" + groupId + \"/NBIRTH/\" + edgeNode, payload));\n\n\t\t\t\t// Create the payload and add a complex Template instance\n\t\t\t\tpayload = new SparkplugBPayload(new Date(), newComplexTemplateInstance(), getSeqNum(), newUUID(), null);\n\n\t\t\t\tSystem.out.println(\"Publishing Device Birth\");\n\t\t\t\texecutor.execute(\n\t\t\t\t\t\tnew Publisher(NAMESPACE + \"/\" + 
groupId + \"/DBIRTH/\" + edgeNode + \"/\" + deviceId, payload));\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t}\n\t}\n\n\t// Used to add the birth/death sequence number\n\tprivate SparkplugBPayloadBuilder addBdSeqNum(SparkplugBPayloadBuilder payload) throws Exception {\n\t\tif (payload == null) {\n\t\t\tpayload = new SparkplugBPayloadBuilder();\n\t\t}\n\t\tif (bdSeq == 256) {\n\t\t\tbdSeq = 0;\n\t\t}\n\t\tpayload.addMetric(new MetricBuilder(\"bdSeq\", Int64, (long) bdSeq).createMetric());\n\t\tbdSeq++;\n\t\treturn payload;\n\t}\n\n\t// Used to add the sequence number\n\tprivate long getSeqNum() throws Exception {\n\t\tSystem.out.println(\"seq: \" + seq);\n\t\tif (seq == 256) {\n\t\t\tseq = 0;\n\t\t}\n\t\treturn seq++;\n\t}\n\n\t@Override\n\tpublic void connectComplete(boolean reconnect, String serverURI) {\n\t\tSystem.out.println(\"Connected! - publishing birth\");\n\t\tpublishBirth();\n\t}\n\n\tpublic void connectionLost(Throwable cause) {\n\t\tcause.printStackTrace();\n\t\tSystem.out.println(\"The MQTT Connection was lost! 
- will auto-reconnect\");\n\t}\n\n\tpublic void messageArrived(String topic, MqttMessage message) throws Exception {\n\t\tSystem.out.println(\"Message Arrived on topic \" + topic);\n\n\t\tSparkplugBPayloadDecoder decoder = new SparkplugBPayloadDecoder();\n\t\tSparkplugBPayload inboundPayload = decoder.buildFromByteArray(message.getPayload(), null);\n\n\t\t// Debug\n\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\tSystem.out.println(\"Metric \" + metric.getName() + \"=\" + metric.getValue());\n\t\t}\n\n\t\tString[] splitTopic = topic.split(\"/\");\n\t\tif (splitTopic[0].equals(NAMESPACE) && splitTopic[1].equals(groupId) && splitTopic[2].equals(\"NCMD\")\n\t\t\t\t&& splitTopic[3].equals(edgeNode)) {\n\t\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\t\tif (\"Node Control/Rebirth\".equals(metric.getName()) && ((Boolean) metric.getValue())) {\n\t\t\t\t\tpublishBirth();\n\t\t\t\t} else {\n\t\t\t\t\t// TODO\n\t\t\t\t\tSystem.out.println(\"TODO - handle writes to tag: \" + metric.getName());\n\t\t\t\t}\n\t\t\t}\n\t\t} else if (splitTopic[0].equals(NAMESPACE) && splitTopic[1].equals(groupId) && splitTopic[2].equals(\"DCMD\")\n\t\t\t\t&& splitTopic[3].equals(edgeNode)) {\n\t\t\tSystem.out.println(\"Command received for device \" + splitTopic[4]);\n\n\t\t\tSparkplugBPayload outboundPayload =\n\t\t\t\t\tnew SparkplugBPayload(new Date(), new ArrayList<Metric>(), getSeqNum(), newUUID(), null);\n\t\t\tfor (Metric metric : inboundPayload.getMetrics()) {\n\t\t\t\t// TODO\n\t\t\t\tSystem.out.println(\"TODO - handle writes to tag: \" + metric.getName());\n\t\t\t}\n\n\t\t\t// Publish the message in a new thread\n\t\t\texecutor.execute(\n\t\t\t\t\tnew Publisher(NAMESPACE + \"/\" + groupId + \"/DDATA/\" + edgeNode + \"/\" + deviceId, outboundPayload));\n\t\t}\n\t}\n\n\tpublic void deliveryComplete(IMqttDeliveryToken token) {\n\t\tSystem.out.println(\"Published message: \" + token);\n\t}\n\n\tprivate String newUUID() {\n\t\treturn 
java.util.UUID.randomUUID().toString();\n\t}\n\n\tprivate List<Parameter> newParams() throws SparkplugException {\n\t\tRandom random = new Random();\n\t\tList<Parameter> params = new ArrayList<Parameter>();\n\t\tparams.add(new Parameter(\"ParamInt32\", ParameterDataType.Int32, random.nextInt()));\n\t\tparams.add(new Parameter(\"ParamFloat\", ParameterDataType.Float, random.nextFloat()));\n\t\tparams.add(new Parameter(\"ParamDouble\", ParameterDataType.Double, random.nextDouble()));\n\t\tparams.add(new Parameter(\"ParamBoolean\", ParameterDataType.Boolean, random.nextBoolean()));\n\t\tparams.add(new Parameter(\"ParamString\", ParameterDataType.String, newUUID()));\n\t\treturn params;\n\t}\n\n\tprivate List<Metric> newComplexTemplateDefs() throws SparkplugInvalidTypeException {\n\t\tArrayList<Metric> metrics = new ArrayList<Metric>();\n\n\t\t// Add a new template \"subType\" definition with two primitive members\n\t\tmetrics.add(new MetricBuilder(\"subType\", Template,\n\t\t\t\tnew TemplateBuilder().definition(true)\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"StringMember\", String, \"value\").createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"IntegerMember\", Int32, 0).createMetric()).createTemplate())\n\t\t\t\t\t\t\t\t.createMetric());\n\t\t// Add new template \"complexType\" definition that contains an instance of \"subType\" as a member\n\t\tmetrics.add(\n\t\t\t\tnew MetricBuilder(\"complexType\", Template,\n\t\t\t\t\t\tnew TemplateBuilder().definition(true).addMetric(new MetricBuilder(\"mySubType\", Template,\n\t\t\t\t\t\t\t\tnew TemplateBuilder().definition(false).templateRef(\"subType\")\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"StringMember\", String, \"value\").createMetric())\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"IntegerMember\", Int32, 0).createMetric())\n\t\t\t\t\t\t\t\t\t\t.createTemplate()).createMetric())\n\t\t\t\t\t\t\t\t.createTemplate()).createMetric());\n\n\t\treturn metrics;\n\n\t}\n\n\tprivate List<Metric> 
newComplexTemplateInstance() throws SparkplugInvalidTypeException {\n\t\tArrayList<Metric> metrics = new ArrayList<Metric>();\n\n\t\t// Add an instance of \"complexType\"\n\t\tmetrics.add(new MetricBuilder(\"myNewType\", Template,\n\t\t\t\tnew TemplateBuilder().definition(false).templateRef(\"complexType\")\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"mySubType\", Template,\n\t\t\t\t\t\t\t\tnew TemplateBuilder().definition(false).templateRef(\"subType\")\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"StringMember\", String, \"myValue\").createMetric())\n\t\t\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"IntegerMember\", Int32, 1).createMetric())\n\t\t\t\t\t\t\t\t\t\t.createTemplate()).createMetric())\n\t\t\t\t\t\t.createTemplate()).createMetric());\n\n\t\treturn metrics;\n\t}\n\n\tprivate Template newSimpleTemplate(boolean isDef, String templatRef) throws SparkplugException {\n\t\tRandom random = new Random();\n\t\tList<Metric> metrics = new ArrayList<Metric>();\n\t\tmetrics.add(new MetricBuilder(\"MyInt8\", Int8, (byte) random.nextInt()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyInt16\", Int16, (short) random.nextInt()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyInt32\", Int32, random.nextInt()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyInt64\", Int64, random.nextLong()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUInt8\", UInt8, (short) random.nextInt()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUInt16\", UInt16, random.nextInt()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUInt32\", UInt32, random.nextLong()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUInt64\", UInt64, BigInteger.valueOf(random.nextLong())).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyFloat\", Float, random.nextFloat()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyDouble\", Double, random.nextDouble()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyBoolean\", Boolean, 
random.nextBoolean()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyString\", String, newUUID()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyDateTime\", DateTime, new Date()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyText\", Text, newUUID()).createMetric());\n\t\tmetrics.add(new MetricBuilder(\"MyUUID\", UUID, newUUID()).createMetric());\n\n\t\treturn new TemplateBuilder().version(\"v1.0\").templateRef(templatRef).definition(isDef)\n\t\t\t\t.addParameters(newParams()).addMetrics(metrics).createTemplate();\n\t}\n\n\tprivate class Publisher implements Runnable {\n\n\t\tprivate String topic;\n\t\tprivate SparkplugBPayload outboundPayload;\n\n\t\tpublic Publisher(String topic, SparkplugBPayload outboundPayload) {\n\t\t\tthis.topic = topic;\n\t\t\tthis.outboundPayload = outboundPayload;\n\t\t}\n\n\t\tpublic void run() {\n\t\t\ttry {\n\t\t\t\toutboundPayload.setTimestamp(new Date());\n\t\t\t\tSparkplugBPayloadEncoder encoder = new SparkplugBPayloadEncoder();\n\t\t\t\tclient.publish(topic, encoder.getBytes(outboundPayload, false), 0, false);\n\t\t\t} catch (MqttPersistenceException e) {\n\t\t\t\te.printStackTrace();\n\t\t\t} catch (MqttException e) {\n\t\t\t\te.printStackTrace();\n\t\t\t} catch (Exception e) {\n\t\t\t\te.printStackTrace();\n\t\t\t}\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>tahu-core</artifactId>\n  <packaging>bundle</packaging>\n  <name>Tahu Core</name>\n\n  <dependencies>\n    <dependency>\n      <groupId>com.google.protobuf</groupId>\n      <artifactId>protobuf-java</artifactId>\n      <version>${protobuf.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>org.apache.commons</groupId>\n      <artifactId>commons-compress</artifactId>\n      <version>1.21</version>\n    </dependency>\n  </dependencies>\n\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          <skipNexusStagingDeployMojo>false</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            <id>add-third-party</id>\n    
        <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n            <configuration>\n              <useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups> (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n                  Version 2.0</licenseMerge>\n              </licenseMerges>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.apache.felix</groupId>\n        <artifactId>maven-bundle-plugin</artifactId>\n        <version>${maven.bundle.version}</version>\n        <extensions>true</extensions>\n        <configuration>\n          <instructions>\n            <Export-Package>org.eclipse.tahu.*</Export-Package>\n            <Import-Package>*;resolution:=optional</Import-Package>\n          </instructions>\n        </configuration>\n        <executions>\n          <execution>\n            <id>bundle-manifest</id>\n            <phase>process-classes</phase>\n            <goals>\n              <goal>manifest</goal>\n            </goals>\n          </execution>\n        </executions>\n      </plugin>\n    </plugins>\n  </build>\n</project>\n"
  },
  {
    "path": "java/lib/core/readme.txt",
    "content": "# To generate the base protobuf sparkplug_b Java library\nprotoc --proto_path=../../ --java_out=src/main/java ../../sparkplug_b/sparkplug_b.proto\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/SparkplugException.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu;\n\n/**\n * A Sparkplug Exception\n */\npublic class SparkplugException extends Exception {\n\n\tprivate static final long serialVersionUID = 1L;\n\n\t/**\n\t * Default constructor.\n\t */\n\tpublic SparkplugException() {\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param message an error message\n\t */\n\tpublic SparkplugException(String message) {\n\t\tsuper(message);\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param message an error message\n\t * @param exception an underlying exception\n\t */\n\tpublic SparkplugException(String message, Throwable exception) {\n\t\tsuper(message, exception);\n\t}\n\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/SparkplugInvalidTypeException.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu;\n\n/**\n * An Exception caused by an invalid type.\n */\npublic class SparkplugInvalidTypeException extends SparkplugException {\n\n\tprivate static final long serialVersionUID = 1L;\n\n\t/**\n\t * An Exception for handling invalid types\n\t *\n\t * @param type the invalid class type\n\t */\n\tpublic SparkplugInvalidTypeException(Class<?> type) {\n\t\tsuper(\"Invalid type \" + type);\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/SparkplugParsingException.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu;\n\n/**\n * An Exception thrown if an error is encountered while parsing a payload or topic.\n */\npublic class SparkplugParsingException extends SparkplugException {\n\n\tprivate static final long serialVersionUID = 1L;\n\n\t/**\n\t * Constructor\n\t *\n\t * @param message an error message\n\t */\n\tpublic SparkplugParsingException(String message) {\n\t\tsuper(message);\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param message an error message\n\t * @param exception an underlying exception\n\t */\n\tpublic SparkplugParsingException(String message, Throwable exception) {\n\t\tsuper(message, exception);\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/exception/TahuErrorCode.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.exception;\n\npublic enum TahuErrorCode {\n\n\tALREADY_EXISTS,\n\tFORBIDDEN,\n\tINITIALIZATION_ERROR,\n\tINTERNAL_ERROR,\n\tINVALID_ARGUMENT,\n\tMISSING_FIELDS,\n\tNOT_AUTHORIZED,\n\tNOT_FOUND,\n\tNOT_SUPPORTED,\n\tNOT_SUPPORTED_TYPE,\n\tNULL_FIELD,\n\tPARSE_ERROR\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/exception/TahuException.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.exception;\n\n/**\n * A Sparkplug Exception\n */\npublic class TahuException extends Exception {\n\n\tprivate static final long serialVersionUID = 1L;\n\n\tprivate TahuErrorCode code;\n\n\t/**\n\t * Default constructor.\n\t */\n\tpublic TahuException() {\n\t\tsuper();\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param code the {@link TahuErrorCode} to associate with the {@link TahuException}\n\t */\n\tpublic TahuException(TahuErrorCode code) {\n\t\tsuper();\n\t\tthis.code = code;\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param code the {@link TahuErrorCode} to associate with the {@link TahuException}\n\t * @param message a message to associate with the {@link TahuException}\n\t * @param e an {@link Exception} that caused this {@link TahuException}\n\t */\n\tpublic TahuException(TahuErrorCode code, String message, Throwable e) {\n\t\tsuper(\"ErrorCode: \" + code.toString() + \" - Message: \" + message, e);\n\t\tthis.code = code;\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param code the {@link TahuErrorCode} to associate with the {@link TahuException}\n\t * @param e an {@link Exception} that caused this {@link TahuException}\n\t */\n\tpublic TahuException(TahuErrorCode code, Throwable e) {\n\t\tsuper(code.toString(), e);\n\t\tthis.code = code;\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param code the {@link TahuErrorCode} to associate with the {@link TahuException}\n\t * @param message a message to 
associate with the {@link TahuException}\n\t */\n\tpublic TahuException(TahuErrorCode code, String message) {\n\t\tsuper(message);\n\t\tthis.code = code;\n\t}\n\n\t/**\n\t * Gets the string-based message associated with this {@link TahuException}\n\t *\n\t * @return the message associated with this {@link TahuException}\n\t */\n\tpublic String getDetails() {\n\t\treturn getMessage();\n\t}\n\n\t/**\n\t * Gets the {@link TahuErrorCode} associated with this {@link TahuException}\n\t *\n\t * @return the {@link TahuErrorCode} associated with this {@link TahuException}\n\t */\n\tpublic TahuErrorCode getTahuErrorCode() {\n\t\treturn code;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/json/DataSetDeserializer.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.json;\n\nimport java.io.IOException;\nimport java.math.BigInteger;\nimport java.util.ArrayList;\nimport java.util.Date;\nimport java.util.List;\n\nimport org.eclipse.tahu.SparkplugException;\nimport org.eclipse.tahu.message.model.DataSet;\nimport org.eclipse.tahu.message.model.DataSet.DataSetBuilder;\nimport org.eclipse.tahu.message.model.DataSetDataType;\nimport org.eclipse.tahu.message.model.Row;\nimport org.eclipse.tahu.message.model.Value;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\nimport com.fasterxml.jackson.core.JsonParser;\nimport com.fasterxml.jackson.core.JsonProcessingException;\nimport com.fasterxml.jackson.databind.DeserializationContext;\nimport com.fasterxml.jackson.databind.JsonNode;\nimport com.fasterxml.jackson.databind.deser.std.StdDeserializer;\n\n/**\n * A JSON deserializer for {@link DataSet} instances.\n */\npublic class DataSetDeserializer extends StdDeserializer<DataSet> {\n\n\tprivate static final long serialVersionUID = 1L;\n\n\tprivate static final Logger logger = LoggerFactory.getLogger(DataSetDeserializer.class.getName());\n\n\tprivate static final String FIELD_SIZE = \"numberOfColumns\";\n\tprivate static final String FIELD_TYPES = \"types\";\n\tprivate static final String FIELD_NAMES = \"columnNames\";\n\tprivate static final String FIELD_ROWS = \"rows\";\n\n\t/**\n\t * Constructor.\n\t * \n\t * @param clazz\n\t */\n\tprotected 
DataSetDeserializer(Class<DataSet> clazz) {\n\t\tsuper(clazz);\n\t}\n\n\t@Override\n\tpublic DataSet deserialize(JsonParser parser, DeserializationContext context)\n\t\t\tthrows IOException, JsonProcessingException {\n\t\tJsonNode node = parser.getCodec().readTree(parser);\n\t\t// Use asLong() rather than casting numberValue(): the node may hold an Integer,\n\t\t// and an unchecked (Long) cast would throw ClassCastException\n\t\tlong size = node.get(FIELD_SIZE).asLong();\n\t\tDataSetBuilder builder = new DataSetBuilder(size);\n\t\tJsonNode namesNode = node.get(FIELD_NAMES);\n\t\tif (namesNode.isArray()) {\n\t\t\tfor (JsonNode nameNode : namesNode) {\n\t\t\t\tbuilder.addColumnName(nameNode.textValue());\n\t\t\t}\n\t\t}\n\t\tJsonNode typesNode = node.get(FIELD_TYPES);\n\t\tList<DataSetDataType> typesList = new ArrayList<DataSetDataType>();\n\t\tif (typesNode.isArray()) {\n\t\t\tfor (JsonNode typeNode : typesNode) {\n\t\t\t\ttypesList.add(DataSetDataType.valueOf(typeNode.textValue()));\n\t\t\t}\n\t\t\tbuilder.addTypes(typesList);\n\t\t}\n\t\tJsonNode rowsNode = node.get(FIELD_ROWS);\n\t\tif (rowsNode.isArray()) {\n\t\t\tfor (JsonNode rowNode : rowsNode) {\n\t\t\t\tList<Value<?>> values = new ArrayList<Value<?>>();\n\t\t\t\tfor (int i = 0; i < size; i++) {\n\t\t\t\t\tJsonNode value = rowNode.get(i);\n\t\t\t\t\tDataSetDataType type = typesList.get(i);\n\t\t\t\t\tvalues.add(getValueFromNode(value, type));\n\t\t\t\t}\n\t\t\t\tbuilder.addRow(new Row(values));\n\t\t\t}\n\t\t}\n\t\ttry {\n\t\t\treturn builder.createDataSet();\n\t\t} catch (SparkplugException e) {\n\t\t\tlogger.error(\"Error deserializing DataSet \", e);\n\t\t}\n\t\treturn null;\n\t}\n\n\t/*\n\t * Creates and returns a Value instance\n\t */\n\tprivate Value<?> getValueFromNode(JsonNode nodeValue, DataSetDataType type) {\n\t\tswitch (type) {\n\t\t\tcase Boolean:\n\t\t\t\treturn new Value<Boolean>(type, nodeValue.asBoolean());\n\t\t\tcase DateTime:\n\t\t\t\treturn new Value<Date>(type, new Date(nodeValue.asLong()));\n\t\t\tcase Double:\n\t\t\t\treturn new Value<Double>(type, nodeValue.asDouble());\n\t\t\tcase Float:\n\t\t\t\treturn new 
Value<Float>(type, (float) nodeValue.asDouble());\n\t\t\tcase Int8:\n\t\t\t\treturn new Value<Byte>(type, (byte) nodeValue.asInt());\n\t\t\tcase Int16:\n\t\t\tcase UInt8:\n\t\t\t\t// 16-bit (and unsigned 8-bit) values need a Short; a byte cast would truncate\n\t\t\t\treturn new Value<Short>(type, (short) nodeValue.asInt());\n\t\t\tcase UInt16:\n\t\t\tcase Int32:\n\t\t\t\treturn new Value<Integer>(type, nodeValue.asInt());\n\t\t\tcase UInt32:\n\t\t\tcase Int64:\n\t\t\t\treturn new Value<Long>(type, nodeValue.asLong());\n\t\t\tcase Text:\n\t\t\tcase String:\n\t\t\t\treturn new Value<String>(type, nodeValue.asText());\n\t\t\tcase UInt64:\n\t\t\t\treturn new Value<BigInteger>(type, BigInteger.valueOf(nodeValue.asLong()));\n\t\t\tcase Unknown:\n\t\t\tdefault:\n\t\t\t\treturn null;\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/json/DeserializerModifier.java",
    "content": "/********************************************************************************\n * Copyright (c) 2017-2023 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.json;\n\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.PropertySet;\nimport org.eclipse.tahu.message.model.Template;\n\nimport com.fasterxml.jackson.databind.BeanDescription;\nimport com.fasterxml.jackson.databind.DeserializationConfig;\nimport com.fasterxml.jackson.databind.JsonDeserializer;\nimport com.fasterxml.jackson.databind.deser.BeanDeserializerModifier;\n\n/**\n * A {@link BeanDeserializerModifier} for Sparkplug\n */\npublic class DeserializerModifier extends BeanDeserializerModifier {\n\n\t@Override\n\tpublic JsonDeserializer<?> modifyDeserializer(DeserializationConfig config, BeanDescription beanDesc,\n\t\t\tJsonDeserializer<?> deserializer) {\n\t\tif (Metric.class.equals(beanDesc.getBeanClass())) {\n\t\t\treturn new MetricDeserializer(deserializer);\n\t\t} else if (Template.class.equals(beanDesc.getBeanClass())) {\n\t\t\treturn new TemplateDeserializer(deserializer);\n\t\t} else if (PropertySet.class.equals(beanDesc.getBeanClass())) {\n\t\t\treturn new PropertySetDeserializer(deserializer);\n\t\t}\n\t\treturn super.modifyDeserializer(config, beanDesc, deserializer);\n\t}\n}"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/json/DeserializerModule.java",
    "content": "/********************************************************************************\n * Copyright (c) 2017-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.json;\n\nimport com.fasterxml.jackson.core.Version;\nimport com.fasterxml.jackson.databind.deser.BeanDeserializerModifier;\nimport com.fasterxml.jackson.databind.module.SimpleModule;\n\n/**\n * Used to register the {@link DeserializerModifier} instance.\n */\npublic class DeserializerModule extends SimpleModule {\n\n\tprivate static final long serialVersionUID = 1L;\n\n\tprivate BeanDeserializerModifier deserializerModifier;\n\n\tpublic DeserializerModule(BeanDeserializerModifier deserializerModifier) {\n\t\tsuper(\"DeserializerModule\", Version.unknownVersion());\n\t\tthis.deserializerModifier = deserializerModifier;\n\t}\n\n\t@Override\n\tpublic void setupModule(SetupContext context) {\n\t\tsuper.setupModule(context);\n\t\tcontext.addBeanDeserializerModifier(deserializerModifier);\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/json/FileSerializer.java",
    "content": "/********************************************************************************\n * Copyright (c) 2017-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.json;\n\nimport java.io.IOException;\n\nimport org.eclipse.paho.client.mqttv3.internal.websocket.Base64;\nimport org.eclipse.tahu.message.model.File;\n\nimport com.fasterxml.jackson.core.JsonGenerator;\nimport com.fasterxml.jackson.databind.SerializerProvider;\nimport com.fasterxml.jackson.databind.ser.std.StdSerializer;\n\n/**\n * Serializes a {@link File} instance.\n */\npublic class FileSerializer extends StdSerializer<File> {\n\n\tprivate static final long serialVersionUID = 1L;\n\n\t/**\n\t * Constructor.\n\t */\n\tprotected FileSerializer() {\n\t\tsuper(File.class);\n\t}\n\n\t/**\n\t * Constructor.\n\t * \n\t * @param clazz class.\n\t */\n\tprotected FileSerializer(Class<File> clazz) {\n\t\tsuper(clazz);\n\t}\n\n\t@Override\n\tpublic void serialize(File value, JsonGenerator generator, SerializerProvider provider) throws IOException {\n\t\tgenerator.writeString(Base64.encodeBytes(value.getBytes()));\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/json/JsonValidator.java",
    "content": "/********************************************************************************\n * Copyright (c) 2017-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.json;\n\n/**\n * Validates JSON.\n */\npublic class JsonValidator {\n\n\tprotected static final String JSON_SCHEMA_FILENAME = \"payload.json\";\n\n\tprivate static JsonValidator instance = null;\n\n\t/**\n\t * Constructor.\n\t */\n\tprotected JsonValidator() {\n\t}\n\n\t/**\n\t * Returns the {@link JsonValidator} instance.\n\t * \n\t * @return the {@link JsonValidator} instance.\n\t */\n\tpublic static JsonValidator getInstance() {\n\t\tif (instance == null) {\n\t\t\tinstance = new JsonValidator();\n\t\t}\n\t\treturn instance;\n\t}\n\n\t/**\n\t * Loads and returns the {@link JsonSchema} instance associated with this validator.\n\t * \n\t * @return the {@link JsonSchema} instance associated with this validator.\n\t * @throws IOException\n\t * @throws ProcessingException\n\t */\n//\tprotected JsonSchema getSchema() throws IOException, ProcessingException {\n//\t\t// Get file from resources folder\n//\t\tClassLoader classLoader = getClass().getClassLoader();\n//\t\tFile schemaFile = new File(classLoader.getResource(JSON_SCHEMA_FILENAME).getFile());\n//\t\treturn JsonSchemaFactory.byDefault().getJsonSchema(JsonLoader.fromFile(schemaFile));\n//\t}\n\n\t/**\n\t * Returns true if the supplied JSON text is valid, false otherwise.\n\t * \n\t * @param jsonText a {@link String} representing JSON text.\n\t * @return true if the supplied JSON text is valid, false otherwise.\n\t * @throws 
ProcessingException\n\t * @throws IOException\n\t */\n//\tpublic boolean isJsonValid(String jsonText) throws ProcessingException, IOException {\n//\t\treturn getSchema().validate(JsonLoader.fromString(jsonText)).isSuccess();\n//\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/json/MetricDeserializer.java",
    "content": "/********************************************************************************\n * Copyright (c) 2013-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.json;\n\nimport java.io.IOException;\nimport java.util.Base64;\n\nimport org.eclipse.tahu.message.model.File;\nimport org.eclipse.tahu.message.model.MetaData;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.MetricDataType;\n\nimport com.fasterxml.jackson.core.JsonParser;\nimport com.fasterxml.jackson.core.JsonProcessingException;\nimport com.fasterxml.jackson.databind.DeserializationContext;\nimport com.fasterxml.jackson.databind.JsonDeserializer;\nimport com.fasterxml.jackson.databind.JsonMappingException;\nimport com.fasterxml.jackson.databind.deser.ResolvableDeserializer;\nimport com.fasterxml.jackson.databind.deser.std.StdDeserializer;\n\n/**\n * A custom JSON deserializer for {@link Metric} instances.\n */\npublic class MetricDeserializer extends StdDeserializer<Metric> implements ResolvableDeserializer {\n\n\tprivate static final long serialVersionUID = 1L;\n\n\tprivate final JsonDeserializer<?> defaultDeserializer;\n\n\t/**\n\t * Constructor.\n\t */\n\tprotected MetricDeserializer(JsonDeserializer<?> defaultDeserializer) {\n\t\tsuper(Metric.class);\n\t\tthis.defaultDeserializer = defaultDeserializer;\n\t}\n\n\t@Override\n\tpublic Metric deserialize(JsonParser parser, DeserializationContext ctxt)\n\t\t\tthrows IOException, JsonProcessingException {\n\n\t\tMetric metric = (Metric) defaultDeserializer.deserialize(parser, ctxt);\n\n\t\t// Check if the data 
type is a File\n\t\tif (metric.getDataType().equals(MetricDataType.File)) {\n\t\t\t// Perform the custom logic for File types by building up the File object.\n\t\t\tMetaData metaData = metric.getMetaData();\n\t\t\tString fileName = metaData == null ? null : metaData.getFileName();\n\t\t\tFile file = new File(fileName, Base64.getDecoder().decode((String) metric.getValue()));\n\t\t\tmetric.setValue(file);\n\t\t}\n\t\treturn metric;\n\t}\n\n\t@Override\n\tpublic void resolve(DeserializationContext ctxt) throws JsonMappingException {\n\t\t((ResolvableDeserializer) defaultDeserializer).resolve(ctxt);\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/json/PropertySetDeserializer.java",
    "content": "/********************************************************************************\n * Copyright (c) 2023 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\npackage org.eclipse.tahu.json;\n\nimport java.io.IOException;\n\nimport org.eclipse.tahu.message.model.PropertySet;\n\nimport com.fasterxml.jackson.core.JacksonException;\nimport com.fasterxml.jackson.core.JsonParser;\nimport com.fasterxml.jackson.databind.DeserializationContext;\nimport com.fasterxml.jackson.databind.JsonDeserializer;\nimport com.fasterxml.jackson.databind.JsonMappingException;\nimport com.fasterxml.jackson.databind.deser.ResolvableDeserializer;\nimport com.fasterxml.jackson.databind.deser.std.StdDeserializer;\n\n/**\n * Defines PropertySet deserializer\n */\npublic class PropertySetDeserializer extends StdDeserializer<PropertySet> implements ResolvableDeserializer {\n\n\tprivate static final long serialVersionUID = 1L;\n\n\tprivate final JsonDeserializer<?> defaultDeserializer;\n\n\t/**\n\t * Constructor.\n\t */\n\tprotected PropertySetDeserializer(JsonDeserializer<?> defaultDeserializer) {\n\t\tsuper(PropertySet.class);\n\t\tthis.defaultDeserializer = defaultDeserializer;\n\t}\n\n\t@Override\n\tpublic void resolve(DeserializationContext ctxt) throws JsonMappingException {\n\t\t((ResolvableDeserializer) defaultDeserializer).resolve(ctxt);\n\t}\n\n\t@Override\n\tpublic PropertySet deserialize(JsonParser parser, DeserializationContext ctxt)\n\t\t\tthrows IOException, JacksonException {\n\t\tPropertySet propSet = (PropertySet) defaultDeserializer.deserialize(parser, ctxt);\n\t\treturn propSet;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/json/TemplateDeserializer.java",
    "content": "/********************************************************************************\n * Copyright (c) 2023 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\npackage org.eclipse.tahu.json;\n\nimport java.io.IOException;\n\nimport org.eclipse.tahu.message.model.Template;\n\nimport com.fasterxml.jackson.core.JacksonException;\nimport com.fasterxml.jackson.core.JsonParser;\nimport com.fasterxml.jackson.databind.DeserializationContext;\nimport com.fasterxml.jackson.databind.JsonDeserializer;\nimport com.fasterxml.jackson.databind.JsonMappingException;\nimport com.fasterxml.jackson.databind.deser.ResolvableDeserializer;\nimport com.fasterxml.jackson.databind.deser.std.StdDeserializer;\n\n/**\n * Defines Template deserializer\n */\npublic class TemplateDeserializer extends StdDeserializer<Template> implements ResolvableDeserializer {\n\n\tprivate static final long serialVersionUID = 1L;\n\n\tprivate final JsonDeserializer<?> defaultDeserializer;\n\n\t/**\n\t * Constructor.\n\t */\n\tprotected TemplateDeserializer(JsonDeserializer<?> defaultDeserializer) {\n\t\tsuper(Template.class);\n\t\tthis.defaultDeserializer = defaultDeserializer;\n\t}\n\n\t@Override\n\tpublic void resolve(DeserializationContext ctxt) throws JsonMappingException {\n\t\t((ResolvableDeserializer) defaultDeserializer).resolve(ctxt);\n\t}\n\n\t@Override\n\tpublic Template deserialize(JsonParser parser, DeserializationContext ctxt) throws IOException, JacksonException {\n\t\tTemplate template = (Template) defaultDeserializer.deserialize(parser, ctxt);\n\t\treturn template;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/BdSeqManager.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message;\n\n/**\n * Manages bdSeq numbers to be used by a Sparkplug Edge Node application\n */\npublic interface BdSeqManager {\n\n\t/**\n\t * Gets the next sequential bdSeq number to be published by a Sparkplug Edge Node. This number MUST be one greater\n\t * than the previous value returned unless the previous number was 255. If the previous value returned by this\n\t * method was 255 the next value MUST be zero.\n\t *\n\t * @return a long value between 0 and 255 (inclusive) that is always one greater than the previous number returned\n\t *         by this method\n\t */\n\tpublic long getNextDeathBdSeqNum();\n\n\t/**\n\t * Stores the next bdSeq number. This MUST override any next bdSeq number the {@link BdSeqManager} may have\n\t * currently stored.\n\t *\n\t * @param bdSeqNum the bdSeq number to store in the {@link BdSeqManager}\n\t */\n\tpublic void storeNextDeathBdSeqNum(long bdSeqNum);\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/DefaultBdSeqManager.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message;\n\nimport java.io.File;\nimport java.nio.charset.Charset;\n\nimport org.apache.commons.io.FileUtils;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class DefaultBdSeqManager implements BdSeqManager {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(DefaultBdSeqManager.class.getName());\n\n\tprivate static final String SPARKPLUG_DIRNAME = \"Tahu_Temp_Dir\";\n\n\tprivate static final String TMP_DIR = System.getProperty(\"java.io.tmpdir\");\n\n\tprivate static final String FILE_SEPARATOR = System.getProperty(\"file.separator\");\n\n\tprivate static final String BD_SEQ_NUM_FILE_NAME_PREFIX =\n\t\t\tTMP_DIR + (TMP_DIR.endsWith(FILE_SEPARATOR) ? 
\"\" : FILE_SEPARATOR) + SPARKPLUG_DIRNAME + FILE_SEPARATOR;\n\n\tprivate final String bdSeqNumFileName;\n\n\tpublic DefaultBdSeqManager(String fileName) {\n\t\tbdSeqNumFileName = BD_SEQ_NUM_FILE_NAME_PREFIX + fileName;\n\t}\n\n\t@Override\n\tpublic long getNextDeathBdSeqNum() {\n\t\ttry {\n\t\t\tlogger.info(\"bdSeqNumFileName: {}\", bdSeqNumFileName);\n\t\t\tFile bdSeqNumFile = new File(bdSeqNumFileName);\n\t\t\tif (bdSeqNumFile.exists()) {\n\t\t\t\tint bdSeqNum =\n\t\t\t\t\t\tInteger.parseInt(FileUtils.readFileToString(bdSeqNumFile, Charset.defaultCharset().toString()));\n\t\t\t\tlogger.info(\"Next Death bdSeq number: {}\", bdSeqNum);\n\t\t\t\treturn bdSeqNum;\n\t\t\t} else {\n\t\t\t\tstoreNextDeathBdSeqNum(0);\n\t\t\t\treturn 0;\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to get the bdSeq number from the persistent directory\", e);\n\t\t\tstoreNextDeathBdSeqNum(0);\n\t\t\treturn 0;\n\t\t}\n\t}\n\n\t@Override\n\tpublic void storeNextDeathBdSeqNum(long bdSeqNum) {\n\t\ttry {\n\t\t\tFile bdSeqNumFile = new File(bdSeqNumFileName);\n\t\t\tFileUtils.writeStringToFile(bdSeqNumFile, Long.toString(bdSeqNum), Charset.defaultCharset().toString(),\n\t\t\t\t\tfalse);\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to write the bdSeq number to the persistent directory\", e);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/PayloadDecoder.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message;\n\nimport org.eclipse.tahu.model.MetricDataTypeMap;\n\n/**\n * An interface for decoding payloads.\n * \n * @param <P> the type of payload.\n */\npublic interface PayloadDecoder<P> {\n\n\t/**\n\t * Builds a payload from a supplied byte array.\n\t *\n\t * @param bytes the bytes representing the payload\n\t * @param metricDataTypeMap the {@link MetricDataTypeMap} to be used in decoding\n\t * @return a payload object built from the byte array\n\t * @throws Exception\n\t */\n\tpublic P buildFromByteArray(byte[] bytes, MetricDataTypeMap metricDataTypeMap) throws Exception;\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/PayloadEncoder.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message;\n\nimport java.io.IOException;\n\n/**\n * An interface for encoding payloads.\n * \n * @param <P> the type of payload.\n */\npublic interface PayloadEncoder<P> {\n\n\t/**\n\t * Converts a payload object into a byte array.\n\t * \n\t * @param payload a payload object\n\t * @param stripDataTypes whether to strip the metric data types from the encoded payload\n\t * @return the byte array representing the payload\n\t * @throws IOException if an error occurs while encoding the payload\n\t */\n\tpublic byte[] getBytes(P payload, boolean stripDataTypes) throws IOException;\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/SparkplugBPayloadDecoder.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message;\n\nimport java.math.BigInteger;\nimport java.nio.ByteBuffer;\nimport java.nio.ByteOrder;\nimport java.nio.charset.StandardCharsets;\nimport java.util.ArrayList;\nimport java.util.Collection;\nimport java.util.Date;\nimport java.util.HashMap;\nimport java.util.List;\nimport java.util.Map;\n\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\nimport org.eclipse.tahu.message.model.DataSet.DataSetBuilder;\nimport org.eclipse.tahu.message.model.DataSetDataType;\nimport org.eclipse.tahu.message.model.File;\nimport org.eclipse.tahu.message.model.MetaData.MetaDataBuilder;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.Parameter;\nimport org.eclipse.tahu.message.model.ParameterDataType;\nimport org.eclipse.tahu.message.model.PropertyDataType;\nimport org.eclipse.tahu.message.model.PropertySet;\nimport org.eclipse.tahu.message.model.PropertySet.PropertySetBuilder;\nimport org.eclipse.tahu.message.model.PropertyValue;\nimport org.eclipse.tahu.message.model.Row;\nimport org.eclipse.tahu.message.model.Row.RowBuilder;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.message.model.Template;\nimport 
org.eclipse.tahu.message.model.Template.TemplateBuilder;\nimport org.eclipse.tahu.message.model.Value;\nimport org.eclipse.tahu.model.MetricDataTypeMap;\nimport org.eclipse.tahu.protobuf.SparkplugBProto;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\n/**\n * A {@link PayloadDecoder} implementation for decoding Sparkplug B payloads.\n */\npublic class SparkplugBPayloadDecoder implements PayloadDecoder<SparkplugBPayload> {\n\n\tprivate static final Logger logger = LoggerFactory.getLogger(SparkplugBPayloadDecoder.class.getName());\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic SparkplugBPayloadDecoder() {\n\t\tsuper();\n\t}\n\n\t@Override\n\tpublic SparkplugBPayload buildFromByteArray(byte[] bytes, MetricDataTypeMap metricDataTypeMap) throws Exception {\n\t\tSparkplugBProto.Payload protoPayload = SparkplugBProto.Payload.parseFrom(bytes);\n\t\tSparkplugBPayloadBuilder builder = new SparkplugBPayloadBuilder();\n\n\t\t// Set the timestamp\n\t\tif (protoPayload.hasTimestamp()) {\n\t\t\tbuilder.setTimestamp(new Date(protoPayload.getTimestamp()));\n\t\t}\n\n\t\t// Set the sequence number\n\t\tif (protoPayload.hasSeq()) {\n\t\t\tbuilder.setSeq(protoPayload.getSeq());\n\t\t}\n\n\t\t// Set the Metrics\n\t\tfor (SparkplugBProto.Payload.Metric protoMetric : protoPayload.getMetricsList()) {\n\t\t\tbuilder.addMetric(convertMetric(protoMetric, metricDataTypeMap, null));\n\t\t}\n\n\t\t// Set the body\n\t\tif (protoPayload.hasBody()) {\n\t\t\tbuilder.setBody(protoPayload.getBody().toByteArray());\n\t\t}\n\n\t\t// Set the UUID\n\t\tif (protoPayload.hasUuid()) {\n\t\t\tbuilder.setUuid(protoPayload.getUuid());\n\t\t}\n\n\t\treturn builder.createPayload();\n\t}\n\n\tprivate Metric convertMetric(SparkplugBProto.Payload.Metric protoMetric, MetricDataTypeMap metricDataTypeMap,\n\t\t\tString prefix) throws Exception {\n\t\t// Convert the dataType\n\t\tMetricDataType dataType = MetricDataType.fromInteger(protoMetric.getDatatype());\n\t\tif (dataType == 
MetricDataType.Unknown) {\n\t\t\tif (metricDataTypeMap != null && !metricDataTypeMap.isEmpty()) {\n\t\t\t\tif (protoMetric.hasName()) {\n\t\t\t\t\tdataType = metricDataTypeMap\n\t\t\t\t\t\t\t.getMetricDataType(prefix != null ? prefix + protoMetric.getName() : protoMetric.getName());\n\t\t\t\t} else if (protoMetric.hasAlias()) {\n\t\t\t\t\tdataType = metricDataTypeMap.getMetricDataType(protoMetric.getAlias());\n\t\t\t\t} else {\n\t\t\t\t\tlogger.error(\"Failed to decode the payload on metric: {}\", protoMetric);\n\t\t\t\t\treturn null;\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tlogger.error(\"Failed to decode the payload on metric datatype: {}\", protoMetric);\n\t\t\t\treturn null;\n\t\t\t}\n\t\t}\n\n\t\t// Build and return the Metric\n\t\treturn new MetricBuilder(protoMetric.hasName() ? protoMetric.getName() : null, dataType,\n\t\t\t\tgetMetricValue(protoMetric, metricDataTypeMap, prefix))\n\t\t\t\t\t\t.isHistorical(protoMetric.hasIsHistorical() ? protoMetric.getIsHistorical() : null)\n\t\t\t\t\t\t.isTransient(\n\t\t\t\t\t\t\t\tprotoMetric.hasIsTransient() ? protoMetric.getIsTransient() : null)\n\t\t\t\t\t\t.timestamp(protoMetric\n\t\t\t\t\t\t\t\t.hasTimestamp() ? new Date(protoMetric.getTimestamp()) : null)\n\t\t\t\t\t\t.alias(protoMetric.hasAlias() ? protoMetric.getAlias() : null)\n\t\t\t\t\t\t.metaData(protoMetric.hasMetadata()\n\t\t\t\t\t\t\t\t? 
new MetaDataBuilder().contentType(protoMetric.getMetadata().getContentType())\n\t\t\t\t\t\t\t\t\t\t.size(protoMetric.getMetadata().getSize())\n\t\t\t\t\t\t\t\t\t\t.seq(protoMetric.getMetadata().getSeq())\n\t\t\t\t\t\t\t\t\t\t.fileName(protoMetric.getMetadata().getFileName())\n\t\t\t\t\t\t\t\t\t\t.fileType(protoMetric.getMetadata().getFileType())\n\t\t\t\t\t\t\t\t\t\t.md5(protoMetric.getMetadata().getMd5())\n\t\t\t\t\t\t\t\t\t\t.multiPart(protoMetric.getMetadata().getIsMultiPart())\n\t\t\t\t\t\t\t\t\t\t.description(protoMetric.getMetadata().getDescription()).createMetaData()\n\t\t\t\t\t\t\t\t: null)\n\t\t\t\t\t\t.properties(protoMetric.hasProperties()\n\t\t\t\t\t\t\t\t? new PropertySetBuilder().addProperties(convertProperties(protoMetric.getProperties()))\n\t\t\t\t\t\t\t\t\t\t.createPropertySet()\n\t\t\t\t\t\t\t\t: null)\n\t\t\t\t\t\t.createMetric();\n\t}\n\n\tprivate Map<String, PropertyValue> convertProperties(SparkplugBProto.Payload.PropertySet decodedPropSet)\n\t\t\tthrows SparkplugInvalidTypeException, Exception {\n\t\tMap<String, PropertyValue> map = new HashMap<String, PropertyValue>();\n\t\tList<String> keys = decodedPropSet.getKeysList();\n\t\tList<SparkplugBProto.Payload.PropertyValue> values = decodedPropSet.getValuesList();\n\t\tfor (int i = 0; i < keys.size(); i++) {\n\t\t\tSparkplugBProto.Payload.PropertyValue value = values.get(i);\n\t\t\tmap.put(keys.get(i),\n\t\t\t\t\tnew PropertyValue(PropertyDataType.fromInteger(value.getType()), getPropertyValue(value)));\n\t\t}\n\t\treturn map;\n\t}\n\n\tprivate Object getPropertyValue(SparkplugBProto.Payload.PropertyValue value) throws Exception {\n\t\tPropertyDataType type = PropertyDataType.fromInteger(value.getType());\n\t\tif (value.getIsNull()) {\n\t\t\treturn null;\n\t\t}\n\t\tswitch (type) {\n\t\t\tcase Boolean:\n\t\t\t\treturn value.getBooleanValue();\n\t\t\tcase DateTime:\n\t\t\t\treturn new Date(value.getLongValue());\n\t\t\tcase Float:\n\t\t\t\treturn value.getFloatValue();\n\t\t\tcase 
Double:\n\t\t\t\treturn value.getDoubleValue();\n\t\t\tcase Int8:\n\t\t\t\treturn (byte) value.getIntValue();\n\t\t\tcase Int16:\n\t\t\tcase UInt8:\n\t\t\t\treturn (short) value.getIntValue();\n\t\t\tcase Int32:\n\t\t\tcase UInt16:\n\t\t\t\treturn value.getIntValue();\n\t\t\tcase UInt32:\n\t\t\t\tif (value.hasIntValue()) {\n\t\t\t\t\treturn Integer.toUnsignedLong(value.getIntValue());\n\t\t\t\t} else if (value.hasLongValue()) {\n\t\t\t\t\treturn value.getLongValue();\n\t\t\t\t} else {\n\t\t\t\t\tlogger.error(\"Invalid value for UInt32 datatype\");\n\t\t\t\t\treturn null;\n\t\t\t\t}\n\t\t\tcase Int64:\n\t\t\t\treturn value.getLongValue();\n\t\t\tcase UInt64:\n\t\t\t\treturn new BigInteger(Long.toUnsignedString(value.getLongValue()));\n\t\t\tcase String:\n\t\t\tcase Text:\n\t\t\t\treturn value.getStringValue();\n\t\t\tcase PropertySet:\n\t\t\t\treturn new PropertySetBuilder().addProperties(convertProperties(value.getPropertysetValue()))\n\t\t\t\t\t\t.createPropertySet();\n\t\t\tcase PropertySetList:\n\t\t\t\tList<PropertySet> propertySetList = new ArrayList<PropertySet>();\n\t\t\t\tList<SparkplugBProto.Payload.PropertySet> list = value.getPropertysetsValue().getPropertysetList();\n\t\t\t\tfor (SparkplugBProto.Payload.PropertySet decodedPropSet : list) {\n\t\t\t\t\tpropertySetList.add(new PropertySetBuilder().addProperties(convertProperties(decodedPropSet))\n\t\t\t\t\t\t\t.createPropertySet());\n\t\t\t\t}\n\t\t\t\treturn propertySetList;\n\t\t\tcase Unknown:\n\t\t\tdefault:\n\t\t\t\tthrow new Exception(\"Failed to decode: Unknown PropertyDataType \" + type);\n\t\t}\n\t}\n\n\tprivate Object getMetricValue(SparkplugBProto.Payload.Metric protoMetric, MetricDataTypeMap metricDataTypeMap,\n\t\t\tString prefix) throws Exception {\n\t\t// Check if the null flag has been set indicating that the value is null\n\t\tif (protoMetric.getIsNull()) {\n\t\t\treturn null;\n\t\t}\n\n\t\t// Get the MetricDataType\n\t\tint metricType = protoMetric.getDatatype();\n\t\tif (metricType == 0) {\n\t\t\tif 
(metricDataTypeMap != null && !metricDataTypeMap.isEmpty()) {\n\t\t\t\tif (protoMetric.hasName()) {\n\t\t\t\t\tmetricType = metricDataTypeMap\n\t\t\t\t\t\t\t.getMetricDataType(prefix != null ? prefix + protoMetric.getName() : protoMetric.getName())\n\t\t\t\t\t\t\t.toIntValue();\n\t\t\t\t} else if (protoMetric.hasAlias()) {\n\t\t\t\t\tmetricType = metricDataTypeMap.getMetricDataType(protoMetric.getAlias()).toIntValue();\n\t\t\t\t} else {\n\t\t\t\t\tlogger.error(\"Failed to decode the payload on metric: {}\", protoMetric);\n\t\t\t\t\treturn null;\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tlogger.error(\"Failed to decode the payload on metric datatype: {}\", protoMetric);\n\t\t\t\treturn null;\n\t\t\t}\n\t\t}\n\n\t\tlogger.trace(\"For metricName={} and alias={} - handling metric type in decoder: {}\", protoMetric.getName(),\n\t\t\t\tprotoMetric.getAlias(), metricType);\n\t\tswitch (MetricDataType.fromInteger(metricType)) {\n\t\t\tcase Boolean:\n\t\t\t\treturn protoMetric.getBooleanValue();\n\t\t\tcase DateTime:\n\t\t\t\treturn new Date(protoMetric.getLongValue());\n\t\t\tcase File:\n\t\t\t\tString filename = protoMetric.getMetadata().getFileName();\n\t\t\t\tbyte[] fileBytes = protoMetric.getBytesValue().toByteArray();\n\t\t\t\treturn new File(filename, fileBytes);\n\t\t\tcase Float:\n\t\t\t\treturn protoMetric.getFloatValue();\n\t\t\tcase Double:\n\t\t\t\treturn protoMetric.getDoubleValue();\n\t\t\tcase Int8:\n\t\t\t\treturn (byte) protoMetric.getIntValue();\n\t\t\tcase Int16:\n\t\t\tcase UInt8:\n\t\t\t\treturn (short) protoMetric.getIntValue();\n\t\t\tcase Int32:\n\t\t\tcase UInt16:\n\t\t\t\treturn protoMetric.getIntValue();\n\t\t\tcase UInt32:\n\t\t\t\tif (protoMetric.hasIntValue()) {\n\t\t\t\t\treturn Integer.toUnsignedLong(protoMetric.getIntValue());\n\t\t\t\t} else if (protoMetric.hasLongValue()) {\n\t\t\t\t\treturn protoMetric.getLongValue();\n\t\t\t\t} else {\n\t\t\t\t\tlogger.error(\"Invalid value for UInt32 datatype\");\n\t\t\t\t\treturn null;\n\t\t\t\t}\n\t\t\tcase Int64:\n\t\t\t\treturn 
protoMetric.getLongValue();\n\t\t\tcase UInt64:\n\t\t\t\treturn new BigInteger(Long.toUnsignedString(protoMetric.getLongValue()));\n\t\t\tcase String:\n\t\t\tcase Text:\n\t\t\tcase UUID:\n\t\t\t\treturn protoMetric.getStringValue();\n\t\t\tcase Bytes:\n\t\t\t\treturn protoMetric.getBytesValue().toByteArray();\n\t\t\tcase DataSet:\n\t\t\t\tSparkplugBProto.Payload.DataSet protoDataSet = protoMetric.getDatasetValue();\n\t\t\t\t// Build and create the DataSet\n\t\t\t\treturn new DataSetBuilder(protoDataSet.getNumOfColumns()).addColumnNames(protoDataSet.getColumnsList())\n\t\t\t\t\t\t.addTypes(convertDataSetDataTypes(protoDataSet.getTypesList()))\n\t\t\t\t\t\t.addRows(convertDataSetRows(protoDataSet.getRowsList(), protoDataSet.getTypesList()))\n\t\t\t\t\t\t.createDataSet();\n\t\t\tcase Template:\n\t\t\t\tSparkplugBProto.Payload.Template protoTemplate = protoMetric.getTemplateValue();\n\t\t\t\tList<Metric> metrics = new ArrayList<Metric>();\n\t\t\t\tList<Parameter> parameters = new ArrayList<Parameter>();\n\n\t\t\t\tfor (SparkplugBProto.Payload.Template.Parameter protoParameter : protoTemplate.getParametersList()) {\n\t\t\t\t\tString name = protoParameter.getName();\n\t\t\t\t\tParameterDataType type = ParameterDataType.fromInteger(protoParameter.getType());\n\t\t\t\t\tObject value = getParameterValue(protoParameter);\n\t\t\t\t\tif (logger.isTraceEnabled()) {\n\t\t\t\t\t\tlogger.trace(\"Setting template parameter name: \" + name + \", type: \" + type + \", value: \"\n\t\t\t\t\t\t\t\t+ value + \", valueType: \" + value.getClass());\n\t\t\t\t\t}\n\n\t\t\t\t\tparameters.add(new Parameter(name, type, value));\n\t\t\t\t}\n\n\t\t\t\tfor (SparkplugBProto.Payload.Metric protoTemplateMetric : protoTemplate.getMetricsList()) {\n\t\t\t\t\tMetric templateMetric = convertMetric(protoTemplateMetric, metricDataTypeMap,\n\t\t\t\t\t\t\tprefix != null ? 
prefix + protoMetric.getName() + \"/\" : protoMetric.getName() + \"/\");\n\t\t\t\t\tif (logger.isTraceEnabled()) {\n\t\t\t\t\t\tlogger.trace(\"Setting template metric name: \" + templateMetric.getName() + \", type: \"\n\t\t\t\t\t\t\t\t+ templateMetric.getDataType() + \", value: \" + templateMetric.getValue());\n\t\t\t\t\t}\n\t\t\t\t\tmetrics.add(templateMetric);\n\t\t\t\t}\n\n\t\t\t\tTemplate template = new TemplateBuilder().version(protoTemplate.getVersion())\n\t\t\t\t\t\t.templateRef(protoTemplate.getTemplateRef()).definition(protoTemplate.getIsDefinition())\n\t\t\t\t\t\t.addMetrics(metrics).addParameters(parameters).createTemplate();\n\n\t\t\t\tif (logger.isTraceEnabled()) {\n\t\t\t\t\tlogger.trace(\n\t\t\t\t\t\t\t\"Setting template - name: \" + protoMetric.getName() + \", version: \" + template.getVersion()\n\t\t\t\t\t\t\t\t\t+ \", ref: \" + template.getTemplateRef() + \", isDef: \" + template.isDefinition()\n\t\t\t\t\t\t\t\t\t+ \", metrics: \" + metrics.size() + \", params: \" + parameters.size());\n\t\t\t\t}\n\n\t\t\t\treturn template;\n\t\t\tcase Int8Array:\n\t\t\t\tByteBuffer int8ByteBuffer = ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<Byte> int8List = new ArrayList<>();\n\t\t\t\tint8ByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\twhile (int8ByteBuffer.hasRemaining()) {\n\t\t\t\t\tbyte value = int8ByteBuffer.get();\n\t\t\t\t\tint8List.add(value);\n\t\t\t\t}\n\t\t\t\treturn int8List.toArray(new Byte[0]);\n\t\t\tcase Int16Array:\n\t\t\t\tByteBuffer int16ByteBuffer = ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<Short> int16List = new ArrayList<>();\n\t\t\t\tint16ByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\twhile (int16ByteBuffer.hasRemaining()) {\n\t\t\t\t\tshort value = int16ByteBuffer.getShort();\n\t\t\t\t\tint16List.add(value);\n\t\t\t\t}\n\t\t\t\treturn int16List.toArray(new Short[0]);\n\t\t\tcase Int32Array:\n\t\t\t\tByteBuffer int32ByteBuffer = 
ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<Integer> int32List = new ArrayList<>();\n\t\t\t\tint32ByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\twhile (int32ByteBuffer.hasRemaining()) {\n\t\t\t\t\tint value = int32ByteBuffer.getInt();\n\t\t\t\t\tint32List.add(value);\n\t\t\t\t}\n\t\t\t\treturn int32List.toArray(new Integer[0]);\n\t\t\tcase Int64Array:\n\t\t\t\tByteBuffer int64ByteBuffer = ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<Long> int64List = new ArrayList<>();\n\t\t\t\tint64ByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\twhile (int64ByteBuffer.hasRemaining()) {\n\t\t\t\t\tlong value = int64ByteBuffer.getLong();\n\t\t\t\t\tint64List.add(value);\n\t\t\t\t}\n\t\t\t\treturn int64List.toArray(new Long[0]);\n\t\t\tcase UInt8Array:\n\t\t\t\tByteBuffer uInt8ByteBuffer = ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<Short> uInt8List = new ArrayList<>();\n\t\t\t\tuInt8ByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\twhile (uInt8ByteBuffer.hasRemaining()) {\n\t\t\t\t\tbyte value = uInt8ByteBuffer.get();\n\t\t\t\t\tuInt8List.add(value >= 0 ? 
(short) value : (short) (0x100 + value));\n\t\t\t\t}\n\t\t\t\treturn uInt8List.toArray(new Short[0]);\n\t\t\tcase UInt16Array:\n\t\t\t\tByteBuffer uInt16ByteBuffer = ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<Integer> uInt16List = new ArrayList<>();\n\t\t\t\tuInt16ByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\twhile (uInt16ByteBuffer.hasRemaining()) {\n\t\t\t\t\tshort value = uInt16ByteBuffer.getShort();\n\t\t\t\t\tuInt16List.add(Short.toUnsignedInt(value));\n\t\t\t\t}\n\t\t\t\treturn uInt16List.toArray(new Integer[0]);\n\t\t\tcase UInt32Array:\n\t\t\t\tByteBuffer uInt32ByteBuffer = ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<Long> uInt32List = new ArrayList<>();\n\t\t\t\tuInt32ByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\twhile (uInt32ByteBuffer.hasRemaining()) {\n\t\t\t\t\tint value = uInt32ByteBuffer.getInt();\n\t\t\t\t\tuInt32List.add(Integer.toUnsignedLong(value));\n\t\t\t\t}\n\t\t\t\treturn uInt32List.toArray(new Long[0]);\n\t\t\tcase UInt64Array:\n\t\t\t\tByteBuffer uInt64ByteBuffer = ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<BigInteger> uInt64List = new ArrayList<>();\n\t\t\t\tuInt64ByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\twhile (uInt64ByteBuffer.hasRemaining()) {\n\t\t\t\t\tlong value = uInt64ByteBuffer.getLong();\n\t\t\t\t\tuInt64List.add(new BigInteger(Long.toUnsignedString(value)));\n\t\t\t\t}\n\t\t\t\treturn uInt64List.toArray(new BigInteger[0]);\n\t\t\tcase FloatArray:\n\t\t\t\tByteBuffer floatByteBuffer = ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<Float> floatList = new ArrayList<>();\n\t\t\t\tfloatByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\twhile (floatByteBuffer.hasRemaining()) {\n\t\t\t\t\tfloat value = floatByteBuffer.getFloat();\n\t\t\t\t\tfloatList.add(value);\n\t\t\t\t}\n\t\t\t\treturn floatList.toArray(new Float[0]);\n\t\t\tcase DoubleArray:\n\t\t\t\tByteBuffer doubleByteBuffer = 
ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<Double> doubleList = new ArrayList<>();\n\t\t\t\tdoubleByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\twhile (doubleByteBuffer.hasRemaining()) {\n\t\t\t\t\tdouble value = doubleByteBuffer.getDouble();\n\t\t\t\t\tdoubleList.add(value);\n\t\t\t\t}\n\t\t\t\treturn doubleList.toArray(new Double[0]);\n\t\t\tcase BooleanArray:\n\t\t\t\tByteBuffer booleanByteBuffer = ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<Boolean> booleanList = new ArrayList<>();\n\t\t\t\tbooleanByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\n\t\t\t\t// The first 4 bytes hold the number of booleans that follow\n\t\t\t\tint numberOfBooleans = booleanByteBuffer.getInt();\n\t\t\t\tint numberOfBytes = (int) Math.ceil((double) numberOfBooleans / 8);\n\n\t\t\t\tfor (int i = 0; i < numberOfBytes; i++) {\n\t\t\t\t\tbyte nextByte = booleanByteBuffer.get();\n\t\t\t\t\tfor (int j = 0; j < 8; j++) {\n\t\t\t\t\t\tif (i * 8 + j < numberOfBooleans) {\n\t\t\t\t\t\t\tif ((nextByte & (1 << (7 - j))) > 0) {\n\t\t\t\t\t\t\t\tbooleanList.add(true);\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\tbooleanList.add(false);\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\treturn booleanList.toArray(new Boolean[0]);\n\t\t\tcase StringArray:\n\t\t\t\tByteBuffer stringByteBuffer = ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<String> stringList = new ArrayList<>();\n\t\t\t\tstringByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\tByteBuffer subByteBuffer = ByteBuffer.allocate(protoMetric.getBytesValue().toByteArray().length);\n\t\t\t\twhile (stringByteBuffer.hasRemaining()) {\n\t\t\t\t\tbyte b = stringByteBuffer.get();\n\t\t\t\t\tif (b == (byte) 0) {\n\t\t\t\t\t\tString string = new String(subByteBuffer.array(), StandardCharsets.UTF_8);\n\t\t\t\t\t\tif (string != null && string.lastIndexOf(\"\\0\") == string.length() - 1) 
{\n\t\t\t\t\t\t\tstring = string.replace(\"\\0\", \"\");\n\t\t\t\t\t\t}\n\t\t\t\t\t\tstringList.add(string);\n\t\t\t\t\t\tsubByteBuffer = ByteBuffer.allocate(protoMetric.getBytesValue().toByteArray().length);\n\t\t\t\t\t} else {\n\t\t\t\t\t\tsubByteBuffer.put(b);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\treturn stringList.toArray(new String[0]);\n\t\t\tcase DateTimeArray:\n\t\t\t\tByteBuffer dateTimeByteBuffer = ByteBuffer.wrap(protoMetric.getBytesValue().toByteArray());\n\t\t\t\tList<Date> dateTimeList = new ArrayList<>();\n\t\t\t\tdateTimeByteBuffer.order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\twhile (dateTimeByteBuffer.hasRemaining()) {\n\t\t\t\t\tlong value = dateTimeByteBuffer.getLong();\n\t\t\t\t\tDate date = new Date(value);\n\t\t\t\t\tdateTimeList.add(date);\n\t\t\t\t}\n\t\t\t\treturn dateTimeList.toArray(new Date[0]);\n\t\t\tcase Unknown:\n\t\t\tdefault:\n\t\t\t\tthrow new Exception(\"Failed to decode: Unknown MetricDataType \" + metricType);\n\n\t\t}\n\t}\n\n\tprivate Collection<Row> convertDataSetRows(List<SparkplugBProto.Payload.DataSet.Row> protoRows,\n\t\t\tList<Integer> protoTypes) throws Exception {\n\t\tCollection<Row> rows = new ArrayList<Row>();\n\t\tif (protoRows != null) {\n\t\t\tfor (SparkplugBProto.Payload.DataSet.Row protoRow : protoRows) {\n\t\t\t\tList<SparkplugBProto.Payload.DataSet.DataSetValue> protoValues = protoRow.getElementsList();\n\t\t\t\tList<Value<?>> values = new ArrayList<Value<?>>();\n\t\t\t\tfor (int index = 0; index < protoRow.getElementsCount(); index++) {\n\t\t\t\t\tvalues.add(convertDataSetValue(protoTypes.get(index), protoValues.get(index)));\n\t\t\t\t}\n\t\t\t\t// Add the values to the row and the row to the rows\n\t\t\t\trows.add(new RowBuilder().addValues(values).createRow());\n\t\t\t}\n\t\t}\n\t\treturn rows;\n\t}\n\n\tprivate Collection<DataSetDataType> convertDataSetDataTypes(List<Integer> protoTypes) {\n\t\tList<DataSetDataType> types = new ArrayList<DataSetDataType>();\n\t\t// Build up a List of column types\n\t\tfor (int 
type : protoTypes) {\n\t\t\ttypes.add(DataSetDataType.fromInteger(type));\n\t\t}\n\t\treturn types;\n\t}\n\n\tprivate Object getParameterValue(SparkplugBProto.Payload.Template.Parameter protoParameter) throws Exception {\n\t\t// Convert the value based on the type\n\t\tint type = protoParameter.getType();\n\t\tswitch (MetricDataType.fromInteger(type)) {\n\t\t\tcase Boolean:\n\t\t\t\treturn protoParameter.getBooleanValue();\n\t\t\tcase DateTime:\n\t\t\t\treturn new Date(protoParameter.getLongValue());\n\t\t\tcase Float:\n\t\t\t\treturn protoParameter.getFloatValue();\n\t\t\tcase Double:\n\t\t\t\treturn protoParameter.getDoubleValue();\n\t\t\tcase Int8:\n\t\t\t\treturn (byte) protoParameter.getIntValue();\n\t\t\tcase Int16:\n\t\t\tcase UInt8:\n\t\t\t\treturn (short) protoParameter.getIntValue();\n\t\t\tcase Int32:\n\t\t\tcase UInt16:\n\t\t\t\treturn protoParameter.getIntValue();\n\t\t\tcase UInt32:\n\t\t\t\tif (protoParameter.hasIntValue()) {\n\t\t\t\t\treturn Integer.toUnsignedLong(protoParameter.getIntValue());\n\t\t\t\t} else if (protoParameter.hasLongValue()) {\n\t\t\t\t\treturn protoParameter.getLongValue();\n\t\t\t\t} else {\n\t\t\t\t\tlogger.error(\"Invalid value for UInt32 datatype\");\n\t\t\t\t\treturn null;\n\t\t\t\t}\n\t\t\tcase Int64:\n\t\t\t\treturn protoParameter.getLongValue();\n\t\t\tcase UInt64:\n\t\t\t\treturn new BigInteger(Long.toUnsignedString(protoParameter.getLongValue()));\n\t\t\tcase String:\n\t\t\tcase Text:\n\t\t\t\treturn protoParameter.getStringValue();\n\t\t\tcase Unknown:\n\t\t\tdefault:\n\t\t\t\tthrow new Exception(\"Failed to decode: Unknown Parameter Type \" + type);\n\t\t}\n\t}\n\n\tprivate Value<?> convertDataSetValue(int protoType, SparkplugBProto.Payload.DataSet.DataSetValue protoValue)\n\t\t\tthrows Exception {\n\n\t\tDataSetDataType type = DataSetDataType.fromInteger(protoType);\n\t\tswitch (type) {\n\t\t\tcase Boolean:\n\t\t\t\tif (protoValue.hasBooleanValue()) {\n\t\t\t\t\treturn new Value<Boolean>(type, 
protoValue.getBooleanValue());\n\t\t\t\t} else {\n\t\t\t\t\treturn new Value<Boolean>(type, null);\n\t\t\t\t}\n\t\t\tcase DateTime:\n\t\t\t\tif (protoValue.hasLongValue()) {\n\t\t\t\t\tif (protoValue.getLongValue() == -9223372036854775808L) {\n\t\t\t\t\t\treturn new Value<Date>(type, null);\n\t\t\t\t\t} else {\n\t\t\t\t\t\treturn new Value<Date>(type, new Date(protoValue.getLongValue()));\n\t\t\t\t\t}\n\t\t\t\t} else {\n\t\t\t\t\treturn new Value<Date>(type, null);\n\t\t\t\t}\n\t\t\tcase Float:\n\t\t\t\tif (protoValue.hasFloatValue()) {\n\t\t\t\t\treturn new Value<Float>(type, protoValue.getFloatValue());\n\t\t\t\t} else {\n\t\t\t\t\treturn new Value<Float>(type, null);\n\t\t\t\t}\n\t\t\tcase Double:\n\t\t\t\tif (protoValue.hasDoubleValue()) {\n\t\t\t\t\treturn new Value<Double>(type, protoValue.getDoubleValue());\n\t\t\t\t} else {\n\t\t\t\t\treturn new Value<Double>(type, null);\n\t\t\t\t}\n\t\t\tcase Int8:\n\t\t\t\tif (protoValue.hasIntValue()) {\n\t\t\t\t\treturn new Value<Byte>(type, (byte) protoValue.getIntValue());\n\t\t\t\t} else {\n\t\t\t\t\treturn new Value<Byte>(type, null);\n\t\t\t\t}\n\t\t\tcase UInt8:\n\t\t\tcase Int16:\n\t\t\t\tif (protoValue.hasIntValue()) {\n\t\t\t\t\treturn new Value<Short>(type, (short) protoValue.getIntValue());\n\t\t\t\t} else {\n\t\t\t\t\treturn new Value<Short>(type, null);\n\t\t\t\t}\n\t\t\tcase UInt16:\n\t\t\tcase Int32:\n\t\t\t\tif (protoValue.hasIntValue()) {\n\t\t\t\t\treturn new Value<Integer>(type, protoValue.getIntValue());\n\t\t\t\t} else {\n\t\t\t\t\treturn new Value<Integer>(type, null);\n\t\t\t\t}\n\t\t\tcase UInt32:\n\t\t\t\tif (protoValue.hasIntValue()) {\n\t\t\t\t\treturn new Value<Long>(type, Integer.toUnsignedLong(protoValue.getIntValue()));\n\t\t\t\t} else if (protoValue.hasLongValue()) {\n\t\t\t\t\treturn new Value<Long>(type, protoValue.getLongValue());\n\t\t\t\t} else {\n\t\t\t\t\treturn new Value<Long>(type, null);\n\t\t\t\t}\n\t\t\tcase Int64:\n\t\t\t\tif (protoValue.hasLongValue()) {\n\t\t\t\t\treturn 
new Value<Long>(type, protoValue.getLongValue());\n\t\t\t\t} else {\n\t\t\t\t\treturn new Value<Long>(type, null);\n\t\t\t\t}\n\t\t\tcase UInt64:\n\t\t\t\tif (protoValue.hasLongValue()) {\n\t\t\t\t\treturn new Value<BigInteger>(type,\n\t\t\t\t\t\t\tnew BigInteger(Long.toUnsignedString(protoValue.getLongValue())));\n\t\t\t\t} else {\n\t\t\t\t\treturn new Value<BigInteger>(type, null);\n\t\t\t\t}\n\t\t\tcase String:\n\t\t\tcase Text:\n\t\t\t\tif (protoValue.hasStringValue()) {\n\t\t\t\t\tif (protoValue.getStringValue().equals(\"null\")) {\n\t\t\t\t\t\treturn new Value<String>(type, null);\n\t\t\t\t\t} else {\n\t\t\t\t\t\treturn new Value<String>(type, protoValue.getStringValue());\n\t\t\t\t\t}\n\t\t\t\t} else {\n\t\t\t\t\treturn new Value<String>(type, null);\n\t\t\t\t}\n\t\t\tcase Unknown:\n\t\t\tdefault:\n\t\t\t\tlogger.error(\"Unknown DataSetDataType: \" + protoType);\n\t\t\t\tthrow new Exception(\"Failed to decode\");\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/SparkplugBPayloadEncoder.java",
"content": "/********************************************************************************\n * Copyright (c) 2014-2023 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message;\n\nimport java.io.IOException;\nimport java.math.BigInteger;\nimport java.nio.ByteBuffer;\nimport java.nio.ByteOrder;\nimport java.nio.charset.StandardCharsets;\nimport java.util.ArrayList;\nimport java.util.Date;\nimport java.util.List;\nimport java.util.Map;\n\nimport org.eclipse.tahu.message.model.DataSet;\nimport org.eclipse.tahu.message.model.DataSetDataType;\nimport org.eclipse.tahu.message.model.File;\nimport org.eclipse.tahu.message.model.MetaData;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Parameter;\nimport org.eclipse.tahu.message.model.ParameterDataType;\nimport org.eclipse.tahu.message.model.PropertyDataType;\nimport org.eclipse.tahu.message.model.PropertySet;\nimport org.eclipse.tahu.message.model.PropertyValue;\nimport org.eclipse.tahu.message.model.Row;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.Template;\nimport org.eclipse.tahu.message.model.Value;\nimport org.eclipse.tahu.protobuf.SparkplugBProto;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\nimport com.google.protobuf.ByteString;\n\n/**\n * A {@link PayloadEncoder} implementation for encoding Sparkplug B payloads.\n */\npublic class SparkplugBPayloadEncoder implements PayloadEncoder<SparkplugBPayload> {\n\n\tprivate static final Logger logger = 
LoggerFactory.getLogger(SparkplugBPayloadEncoder.class.getName());\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic SparkplugBPayloadEncoder() {\n\t\tsuper();\n\t}\n\n\t@Override\n\tpublic byte[] getBytes(SparkplugBPayload payload, boolean stripDataTypes) throws IOException {\n\n\t\tSparkplugBProto.Payload.Builder protoMsg = SparkplugBProto.Payload.newBuilder();\n\n\t\t// Set the timestamp\n\t\tif (payload.getTimestamp() != null) {\n\t\t\tprotoMsg.setTimestamp(payload.getTimestamp().getTime());\n\t\t}\n\n\t\t// Set the sequence number\n\t\tif (payload.getSeq() != null) {\n\t\t\tprotoMsg.setSeq(payload.getSeq());\n\t\t}\n\n\t\t// Set the UUID if defined\n\t\tif (payload.getUuid() != null) {\n\t\t\tprotoMsg.setUuid(payload.getUuid());\n\t\t}\n\n\t\t// Set the metrics\n\t\tfor (Metric metric : payload.getMetrics()) {\n\t\t\tif (metric == null) {\n\t\t\t\tlogger.warn(\"Not adding NULL metric\");\n\t\t\t\tcontinue;\n\t\t\t}\n\t\t\ttry {\n\t\t\t\tprotoMsg.addMetrics(convertMetric(metric, stripDataTypes));\n\t\t\t} catch (Exception e) {\n\t\t\t\tlogger.error(\"Failed to add metric: {}\", metric.getName(), e);\n\t\t\t\tthrow new RuntimeException(e);\n\t\t\t}\n\t\t}\n\n\t\t// Set the body\n\t\tif (payload.getBody() != null) {\n\t\t\tprotoMsg.setBody(ByteString.copyFrom(payload.getBody()));\n\t\t}\n\n\t\treturn protoMsg.build().toByteArray();\n\t}\n\n\tprivate SparkplugBProto.Payload.Metric.Builder convertMetric(Metric metric, boolean stripDataTypes)\n\t\t\tthrows Exception {\n\n\t\t// build a metric\n\t\tSparkplugBProto.Payload.Metric.Builder builder = SparkplugBProto.Payload.Metric.newBuilder();\n\n\t\t// set the basic parameters\n\t\tif (!stripDataTypes) {\n\t\t\tbuilder.setDatatype(metric.getDataType().toIntValue());\n\t\t}\n\t\tbuilder = setMetricValue(builder, metric, stripDataTypes);\n\n\t\t// Set the name, data type, and value\n\t\tif (metric.hasName()) {\n\t\t\tbuilder.setName(metric.getName());\n\t\t} else {\n\t\t\t// name is an empty String by default and must 
be cleared\n\t\t\tbuilder.clearName();\n\t\t}\n\n\t\t// Set the alias\n\t\tif (metric.hasAlias()) {\n\t\t\tbuilder.setAlias(metric.getAlias());\n\t\t}\n\n\t\t// Set the timestamp\n\t\tif (metric.getTimestamp() != null) {\n\t\t\tbuilder.setTimestamp(metric.getTimestamp().getTime());\n\t\t}\n\n\t\t// Set isHistorical\n\t\tif (metric.getIsHistorical() != null) {\n\t\t\tbuilder.setIsHistorical(metric.isHistorical());\n\t\t}\n\n\t\t// Set isTransient\n\t\tif (metric.getIsTransient() != null) {\n\t\t\tbuilder.setIsTransient(metric.isTransient());\n\t\t}\n\n\t\t// Set isNull\n\t\tif (metric.getIsNull() != null) {\n\t\t\tbuilder.setIsNull(metric.isNull());\n\t\t}\n\n\t\t// Set the metadata\n\t\tif (metric.getMetaData() != null) {\n\t\t\tbuilder = setMetaData(builder, metric);\n\t\t}\n\n\t\t// Set the property set\n\t\tif (metric.getProperties() != null) {\n\t\t\tbuilder.setProperties(convertPropertySet(metric.getProperties()));\n\t\t}\n\n\t\treturn builder;\n\t}\n\n\tprivate SparkplugBProto.Payload.Template.Parameter.Builder convertParameter(Parameter parameter) throws Exception {\n\n\t\t// build a metric\n\t\tSparkplugBProto.Payload.Template.Parameter.Builder builder =\n\t\t\t\tSparkplugBProto.Payload.Template.Parameter.newBuilder();\n\n\t\tif (logger.isTraceEnabled()) {\n\t\t\tlogger.trace(\"Adding parameter: {}\", parameter.getName());\n\t\t\tlogger.trace(\"            type: {}\", parameter.getType());\n\t\t}\n\n\t\t// Set the name\n\t\tbuilder.setName(parameter.getName());\n\n\t\t// Set the type and value\n\t\tbuilder = setParameterValue(builder, parameter);\n\n\t\treturn builder;\n\t}\n\n\tprivate SparkplugBProto.Payload.PropertySet.Builder convertPropertySet(PropertySet propertySet) throws Exception {\n\t\tSparkplugBProto.Payload.PropertySet.Builder setBuilder = SparkplugBProto.Payload.PropertySet.newBuilder();\n\n\t\tMap<String, PropertyValue> map = propertySet.getPropertyMap();\n\t\tfor (String key : map.keySet()) 
{\n\t\t\tSparkplugBProto.Payload.PropertyValue.Builder builder = SparkplugBProto.Payload.PropertyValue.newBuilder();\n\t\t\tPropertyValue value = map.get(key);\n\t\t\tPropertyDataType type = value.getType();\n\t\t\tbuilder.setType(type.toIntValue());\n\t\t\tif (value.getValue() == null) {\n\t\t\t\tbuilder.setIsNull(true);\n\t\t\t} else {\n\t\t\t\tswitch (type) {\n\t\t\t\t\tcase Boolean:\n\t\t\t\t\t\tbuilder.setBooleanValue((Boolean) value.getValue());\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase DateTime:\n\t\t\t\t\t\tbuilder.setLongValue(((Date) value.getValue()).getTime());\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase Double:\n\t\t\t\t\t\tbuilder.setDoubleValue((Double) value.getValue());\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase Float:\n\t\t\t\t\t\tbuilder.setFloatValue((Float) value.getValue());\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase Int8:\n\t\t\t\t\t\tbuilder.setIntValue((Byte) value.getValue());\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase Int16:\n\t\t\t\t\t\tbuilder.setIntValue((Short) value.getValue());\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase Int32:\n\t\t\t\t\t\tbuilder.setIntValue((Integer) value.getValue());\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase Int64:\n\t\t\t\t\t\tbuilder.setLongValue((Long) value.getValue());\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase UInt8:\n\t\t\t\t\t\tbuilder.setIntValue(Short.toUnsignedInt((Short) value.getValue()));\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase UInt16:\n\t\t\t\t\t\tbuilder.setIntValue((int) Integer.toUnsignedLong((Integer) value.getValue()));\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase UInt32:\n\t\t\t\t\t\tbuilder.setLongValue(Long.parseUnsignedLong(Long.toUnsignedString((Long) value.getValue())));\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase UInt64:\n\t\t\t\t\t\tbuilder.setLongValue(bigIntegerToUnsignedLong((BigInteger) value.getValue()));\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase String:\n\t\t\t\t\tcase Text:\n\t\t\t\t\t\tbuilder.setStringValue((String) value.getValue());\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase PropertySet:\n\t\t\t\t\t\tbuilder.setPropertysetValue(convertPropertySet((PropertySet) 
value.getValue()));\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase PropertySetList:\n\t\t\t\t\t\tList<?> setList = (List<?>) value.getValue();\n\t\t\t\t\t\tSparkplugBProto.Payload.PropertySetList.Builder listBuilder =\n\t\t\t\t\t\t\t\tSparkplugBProto.Payload.PropertySetList.newBuilder();\n\t\t\t\t\t\tfor (Object obj : setList) {\n\t\t\t\t\t\t\tlistBuilder.addPropertyset(convertPropertySet((PropertySet) obj));\n\t\t\t\t\t\t}\n\t\t\t\t\t\tbuilder.setPropertysetsValue(listBuilder);\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase Unknown:\n\t\t\t\t\tdefault:\n\t\t\t\t\t\tlogger.error(\"Unsupported PropertyDataType: '{}' for the '{}' property\", value.getType(), key);\n\t\t\t\t\t\tthrow new Exception(\"Failed to convert value \" + value.getType());\n\t\t\t\t}\n\t\t\t}\n\t\t\tsetBuilder.addKeys(key);\n\t\t\tsetBuilder.addValues(builder);\n\t\t}\n\t\treturn setBuilder;\n\t}\n\n\tprivate SparkplugBProto.Payload.Template.Parameter.Builder setParameterValue(\n\t\t\tSparkplugBProto.Payload.Template.Parameter.Builder builder, Parameter parameter) throws Exception {\n\t\tParameterDataType type = parameter.getType();\n\t\tbuilder.setType(type.toIntValue());\n\n\t\tObject value = parameter.getValue();\n\t\tvalue = type == ParameterDataType.String && value == null ? 
\"\" : value;\n\t\tif (value != null) {\n\t\t\tswitch (type) {\n\t\t\t\tcase Boolean:\n\t\t\t\t\tbuilder.setBooleanValue(toBoolean(value));\n\t\t\t\t\tbreak;\n\t\t\t\tcase DateTime:\n\t\t\t\t\tbuilder.setLongValue(((Date) value).getTime());\n\t\t\t\t\tbreak;\n\t\t\t\tcase Double:\n\t\t\t\t\tbuilder.setDoubleValue((Double) value);\n\t\t\t\t\tbreak;\n\t\t\t\tcase Float:\n\t\t\t\t\tbuilder.setFloatValue((Float) value);\n\t\t\t\t\tbreak;\n\t\t\t\tcase Int8:\n\t\t\t\t\tbuilder.setIntValue((Byte) value);\n\t\t\t\t\tbreak;\n\t\t\t\tcase Int16:\n\t\t\t\t\tbuilder.setIntValue((Short) value);\n\t\t\t\t\tbreak;\n\t\t\t\tcase Int32:\n\t\t\t\t\tbuilder.setIntValue((Integer) value);\n\t\t\t\t\tbreak;\n\t\t\t\tcase Int64:\n\t\t\t\t\tbuilder.setLongValue((Long) value);\n\t\t\t\t\tbreak;\n\t\t\t\tcase UInt8:\n\t\t\t\t\tbuilder.setIntValue(Short.toUnsignedInt((Short) value));\n\t\t\t\t\tbreak;\n\t\t\t\tcase UInt16:\n\t\t\t\t\tbuilder.setIntValue((int) Integer.toUnsignedLong((Integer) value));\n\t\t\t\t\tbreak;\n\t\t\t\tcase UInt32:\n\t\t\t\t\tbuilder.setLongValue(Long.valueOf(Long.toUnsignedString(((BigInteger) value).longValue())));\n\t\t\t\t\tbreak;\n\t\t\t\tcase UInt64:\n\t\t\t\t\tbuilder.setLongValue(bigIntegerToUnsignedLong((BigInteger) value));\n\t\t\t\t\tbreak;\n\t\t\t\tcase Text:\n\t\t\t\tcase String:\n\t\t\t\t\tbuilder.setStringValue((String) value);\n\t\t\t\t\tbreak;\n\t\t\t\tcase Unknown:\n\t\t\t\tdefault:\n\t\t\t\t\tlogger.error(\"Unknown Type: {}\", type);\n\t\t\t\t\tthrow new Exception(\"Failed to encode\");\n\n\t\t\t}\n\t\t}\n\t\treturn builder;\n\t}\n\n\tprivate SparkplugBProto.Payload.Metric.Builder setMetricValue(SparkplugBProto.Payload.Metric.Builder metricBuilder,\n\t\t\tMetric metric, boolean stripDataTypes) throws Exception {\n\n\t\t// Set the data type\n\t\tif (!stripDataTypes) {\n\t\t\tmetricBuilder.setDatatype(metric.getDataType().toIntValue());\n\t\t}\n\n\t\tif (metric.getValue() == null) {\n\t\t\tmetricBuilder.setIsNull(true);\n\t\t} else {\n\t\t\tswitch 
(metric.getDataType()) {\n\t\t\t\tcase Boolean:\n\t\t\t\t\tmetricBuilder.setBooleanValue(toBoolean(metric.getValue()));\n\t\t\t\t\tbreak;\n\t\t\t\tcase DateTime:\n\t\t\t\t\tmetricBuilder.setLongValue(((Date) metric.getValue()).getTime());\n\t\t\t\t\tbreak;\n\t\t\t\tcase File:\n\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(((File) metric.getValue()).getBytes()));\n\t\t\t\t\tSparkplugBProto.Payload.MetaData.Builder metaDataBuilder =\n\t\t\t\t\t\t\tSparkplugBProto.Payload.MetaData.newBuilder();\n\t\t\t\t\tmetaDataBuilder.setFileName(((File) metric.getValue()).getFileName());\n\t\t\t\t\tmetricBuilder.setMetadata(metaDataBuilder);\n\t\t\t\t\tbreak;\n\t\t\t\tcase Float:\n\t\t\t\t\tmetricBuilder.setFloatValue((Float) metric.getValue());\n\t\t\t\t\tbreak;\n\t\t\t\tcase Double:\n\t\t\t\t\tmetricBuilder.setDoubleValue((Double) metric.getValue());\n\t\t\t\t\tbreak;\n\t\t\t\tcase Int8:\n\t\t\t\t\tmetricBuilder.setIntValue((Byte) metric.getValue());\n\t\t\t\t\tbreak;\n\t\t\t\tcase Int16:\n\t\t\t\t\tmetricBuilder.setIntValue((Short) metric.getValue());\n\t\t\t\t\tbreak;\n\t\t\t\tcase Int32:\n\t\t\t\t\tmetricBuilder.setIntValue((Integer) metric.getValue());\n\t\t\t\t\tbreak;\n\t\t\t\tcase Int64:\n\t\t\t\t\tmetricBuilder.setLongValue((Long) metric.getValue());\n\t\t\t\t\tbreak;\n\t\t\t\tcase UInt8:\n\t\t\t\t\tmetricBuilder.setIntValue(Short.toUnsignedInt((Short) metric.getValue()));\n\t\t\t\t\tbreak;\n\t\t\t\tcase UInt16:\n\t\t\t\t\tmetricBuilder.setIntValue((int) Integer.toUnsignedLong((Integer) metric.getValue()));\n\t\t\t\t\tbreak;\n\t\t\t\tcase UInt32:\n\t\t\t\t\tmetricBuilder.setLongValue(Long.parseUnsignedLong(Long.toUnsignedString((Long) metric.getValue())));\n\t\t\t\t\tbreak;\n\t\t\t\tcase UInt64:\n\t\t\t\t\tmetricBuilder.setLongValue(bigIntegerToUnsignedLong((BigInteger) metric.getValue()));\n\t\t\t\t\tbreak;\n\t\t\t\tcase String:\n\t\t\t\tcase Text:\n\t\t\t\tcase UUID:\n\t\t\t\t\tmetricBuilder.setStringValue((String) 
metric.getValue());\n\t\t\t\t\tbreak;\n\t\t\t\tcase Bytes:\n\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom((byte[]) metric.getValue()));\n\t\t\t\t\tbreak;\n\t\t\t\tcase DataSet:\n\t\t\t\t\tDataSet dataSet = (DataSet) metric.getValue();\n\t\t\t\t\tSparkplugBProto.Payload.DataSet.Builder dataSetBuilder =\n\t\t\t\t\t\t\tSparkplugBProto.Payload.DataSet.newBuilder();\n\n\t\t\t\t\tdataSetBuilder.setNumOfColumns(dataSet.getNumOfColumns());\n\n\t\t\t\t\t// Column names\n\t\t\t\t\tList<String> columnNames = dataSet.getColumnNames();\n\t\t\t\t\tif (columnNames != null && !columnNames.isEmpty()) {\n\t\t\t\t\t\tfor (String name : columnNames) {\n\t\t\t\t\t\t\t// Add the column name\n\t\t\t\t\t\t\tdataSetBuilder.addColumns(name);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\n\t\t\t\t\t// Column types\n\t\t\t\t\tList<DataSetDataType> columnTypes = dataSet.getTypes();\n\t\t\t\t\tif (columnTypes != null && !columnTypes.isEmpty()) {\n\t\t\t\t\t\tfor (DataSetDataType type : columnTypes) {\n\t\t\t\t\t\t\t// Add the column type\n\t\t\t\t\t\t\tdataSetBuilder.addTypes(type.toIntValue());\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\n\t\t\t\t\t// Dataset rows\n\t\t\t\t\tList<Row> rows = dataSet.getRows();\n\t\t\t\t\tif (rows != null && !rows.isEmpty()) {\n\t\t\t\t\t\tfor (Row row : rows) {\n\t\t\t\t\t\t\tSparkplugBProto.Payload.DataSet.Row.Builder protoRowBuilder =\n\t\t\t\t\t\t\t\t\tSparkplugBProto.Payload.DataSet.Row.newBuilder();\n\t\t\t\t\t\t\tList<Value<?>> values = row.getValues();\n\t\t\t\t\t\t\tif (values != null && !values.isEmpty()) {\n\t\t\t\t\t\t\t\tfor (Value<?> value : values) {\n\t\t\t\t\t\t\t\t\t// Add the converted element\n\t\t\t\t\t\t\t\t\tprotoRowBuilder.addElements(convertDataSetValue(value));\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\tdataSetBuilder.addRows(protoRowBuilder);\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\n\t\t\t\t\t// Finally add the dataset\n\t\t\t\t\tmetricBuilder.setDatasetValue(dataSetBuilder);\n\t\t\t\t\tbreak;\n\t\t\t\tcase Template:\n\t\t\t\t\tTemplate template = 
(Template) metric.getValue();\n\t\t\t\t\tSparkplugBProto.Payload.Template.Builder templateBuilder =\n\t\t\t\t\t\t\tSparkplugBProto.Payload.Template.newBuilder();\n\n\t\t\t\t\t// Set isDefinition\n\t\t\t\t\ttemplateBuilder.setIsDefinition(template.isDefinition());\n\n\t\t\t\t\t// Set Version\n\t\t\t\t\tif (template.getVersion() != null) {\n\t\t\t\t\t\ttemplateBuilder.setVersion(template.getVersion());\n\t\t\t\t\t}\n\n\t\t\t\t\t// Set Template Reference\n\t\t\t\t\tif (template.getTemplateRef() != null) {\n\t\t\t\t\t\ttemplateBuilder.setTemplateRef(template.getTemplateRef());\n\t\t\t\t\t}\n\n\t\t\t\t\t// Set the template metrics\n\t\t\t\t\tif (template.getMetrics() != null) {\n\t\t\t\t\t\tfor (Metric templateMetric : template.getMetrics()) {\n\t\t\t\t\t\t\ttemplateBuilder.addMetrics(convertMetric(templateMetric, stripDataTypes));\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\n\t\t\t\t\t// Set the template parameters\n\t\t\t\t\tif (template.getParameters() != null) {\n\t\t\t\t\t\tfor (Parameter parameter : template.getParameters()) {\n\t\t\t\t\t\t\ttemplateBuilder.addParameters(convertParameter(parameter));\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\n\t\t\t\t\t// Add the template to the metric\n\t\t\t\t\tmetricBuilder.setTemplateValue(templateBuilder);\n\t\t\t\t\tbreak;\n\t\t\t\tcase Int8Array:\n\t\t\t\t\tByte[] int8ArrayValue = (Byte[]) metric.getValue();\n\t\t\t\t\tByteBuffer int8ByteBuffer =\n\t\t\t\t\t\t\tByteBuffer.allocate(int8ArrayValue.length).order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\t\tboolean hasNullInt8ArrayElements = false;\n\t\t\t\t\tfor (Byte value : int8ArrayValue) {\n\t\t\t\t\t\tif (value != null) {\n\t\t\t\t\t\t\tint8ByteBuffer.put(value);\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thasNullInt8ArrayElements = true;\n\t\t\t\t\t\t\tint8ByteBuffer.put((byte) 0);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullInt8ArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} Int8Array. 
All such elements will be set to 0.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tif (int8ByteBuffer.hasArray()) {\n\t\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(int8ByteBuffer.array()));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\tcase Int16Array:\n\t\t\t\t\tShort[] int16ArrayValue = (Short[]) metric.getValue();\n\t\t\t\t\tByteBuffer int16ByteBuffer =\n\t\t\t\t\t\t\tByteBuffer.allocate(int16ArrayValue.length * 2).order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\t\tboolean hasNullInt16ArrayElements = false;\n\t\t\t\t\tfor (Short value : int16ArrayValue) {\n\t\t\t\t\t\tif (value != null) {\n\t\t\t\t\t\t\tint16ByteBuffer.putShort(value);\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thasNullInt16ArrayElements = true;\n\t\t\t\t\t\t\tint16ByteBuffer.putShort((short) 0);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullInt16ArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} Int16Array. All such elements will be set to 0.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tif (int16ByteBuffer.hasArray()) {\n\t\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(int16ByteBuffer.array()));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\tcase Int32Array:\n\t\t\t\t\tInteger[] int32ArrayValue = (Integer[]) metric.getValue();\n\t\t\t\t\tByteBuffer int32ByteBuffer =\n\t\t\t\t\t\t\tByteBuffer.allocate(int32ArrayValue.length * 4).order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\t\tboolean hasNullInt32ArrayElements = false;\n\t\t\t\t\tfor (Integer value : int32ArrayValue) {\n\t\t\t\t\t\tif (value != null) {\n\t\t\t\t\t\t\tint32ByteBuffer.putInt(value);\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thasNullInt32ArrayElements = true;\n\t\t\t\t\t\t\tint32ByteBuffer.putInt(0);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullInt32ArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} Int32Array. 
All such elements will be set to 0.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tif (int32ByteBuffer.hasArray()) {\n\t\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(int32ByteBuffer.array()));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\tcase Int64Array:\n\t\t\t\t\tLong[] int64ArrayValue = (Long[]) metric.getValue();\n\t\t\t\t\tByteBuffer int64ByteBuffer =\n\t\t\t\t\t\t\tByteBuffer.allocate(int64ArrayValue.length * 8).order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\t\tboolean hasNullInt64ArrayElements = false;\n\t\t\t\t\tfor (Long value : int64ArrayValue) {\n\t\t\t\t\t\tif (value != null) {\n\t\t\t\t\t\t\tint64ByteBuffer.putLong(value);\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thasNullInt64ArrayElements = true;\n\t\t\t\t\t\t\tint64ByteBuffer.putLong(0L);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullInt64ArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} Int64Array. All such elements will be set to 0.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tif (int64ByteBuffer.hasArray()) {\n\t\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(int64ByteBuffer.array()));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\tcase UInt8Array:\n\t\t\t\t\tShort[] uInt8ArrayValue = (Short[]) metric.getValue();\n\t\t\t\t\tByteBuffer uInt8ByteBuffer =\n\t\t\t\t\t\t\tByteBuffer.allocate(uInt8ArrayValue.length).order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\t\tboolean hasNullUnt8ArrayElements = false;\n\t\t\t\t\tfor (Short value : uInt8ArrayValue) {\n\t\t\t\t\t\tif (value != null) {\n\t\t\t\t\t\t\tuInt8ByteBuffer.put((byte) (value & 0xffff));\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thasNullUnt8ArrayElements = true;\n\t\t\t\t\t\t\tuInt8ByteBuffer.put((byte) 0);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullUnt8ArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} UInt8Array. 
All such elements will be set to 0.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tif (uInt8ByteBuffer.hasArray()) {\n\t\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(uInt8ByteBuffer.array()));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\tcase UInt16Array:\n\t\t\t\t\tInteger[] uInt16ArrayValue = (Integer[]) metric.getValue();\n\t\t\t\t\tByteBuffer uInt16ByteBuffer =\n\t\t\t\t\t\t\tByteBuffer.allocate(uInt16ArrayValue.length * 2).order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\t\tboolean hasNullUnt16ArrayElements = false;\n\t\t\t\t\tfor (Integer value : uInt16ArrayValue) {\n\t\t\t\t\t\tif (value != null) {\n\t\t\t\t\t\t\tuInt16ByteBuffer.putShort((short) (value & 0xffffffff));\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thasNullUnt16ArrayElements = true;\n\t\t\t\t\t\t\tuInt16ByteBuffer.putShort((short) 0);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullUnt16ArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} UInt16Array. All such elements will be set to 0.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tif (uInt16ByteBuffer.hasArray()) {\n\t\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(uInt16ByteBuffer.array()));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\tcase UInt32Array:\n\t\t\t\t\tLong[] uInt32ArrayValue = (Long[]) metric.getValue();\n\t\t\t\t\tByteBuffer uInt32ByteBuffer =\n\t\t\t\t\t\t\tByteBuffer.allocate(uInt32ArrayValue.length * 4).order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\t\tboolean hasNullUnt32ArrayElements = false;\n\t\t\t\t\tfor (Long value : uInt32ArrayValue) {\n\t\t\t\t\t\tif (value != null) {\n\t\t\t\t\t\t\tuInt32ByteBuffer.putInt((int) (value & 0xffffffffffffffffL));\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thasNullUnt32ArrayElements = true;\n\t\t\t\t\t\t\tuInt32ByteBuffer.putInt(0);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullUnt32ArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} UInt32Array. 
All such elements will be set to 0.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tif (uInt32ByteBuffer.hasArray()) {\n\t\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(uInt32ByteBuffer.array()));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\tcase UInt64Array:\n\t\t\t\t\tBigInteger[] uInt64ArrayValue = (BigInteger[]) metric.getValue();\n\t\t\t\t\tByteBuffer uInt64ByteBuffer =\n\t\t\t\t\t\t\tByteBuffer.allocate(uInt64ArrayValue.length * 8).order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\t\tboolean hasNullUnt64ArrayElements = false;\n\t\t\t\t\tfor (BigInteger value : uInt64ArrayValue) {\n\t\t\t\t\t\tif (value != null) {\n\t\t\t\t\t\t\tuInt64ByteBuffer.putLong(bigIntegerToUnsignedLong(value));\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thasNullUnt64ArrayElements = true;\n\t\t\t\t\t\t\tuInt64ByteBuffer.putLong(0L);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullUnt64ArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} UInt64Array. All such elements will be set to 0.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tif (uInt64ByteBuffer.hasArray()) {\n\t\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(uInt64ByteBuffer.array()));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\tcase FloatArray:\n\t\t\t\t\tFloat[] floatArrayValue = (Float[]) metric.getValue();\n\t\t\t\t\tByteBuffer floatByteBuffer =\n\t\t\t\t\t\t\tByteBuffer.allocate(floatArrayValue.length * 4).order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\t\tboolean hasNullFloatArrayElements = false;\n\t\t\t\t\tfor (Float value : floatArrayValue) {\n\t\t\t\t\t\tif (value != null) {\n\t\t\t\t\t\t\tfloatByteBuffer.putFloat(value);\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thasNullFloatArrayElements = true;\n\t\t\t\t\t\t\tfloatByteBuffer.putFloat(0);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullFloatArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} FloatArray. 
All such elements will be set to 0.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tif (floatByteBuffer.hasArray()) {\n\t\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(floatByteBuffer.array()));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\tcase DoubleArray:\n\t\t\t\t\tDouble[] doubleArrayValue = (Double[]) metric.getValue();\n\t\t\t\t\tByteBuffer doubleByteBuffer =\n\t\t\t\t\t\t\tByteBuffer.allocate(doubleArrayValue.length * 8).order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\t\tboolean hasNullDoubleArrayElements = false;\n\t\t\t\t\tfor (Double value : doubleArrayValue) {\n\t\t\t\t\t\tif (value != null) {\n\t\t\t\t\t\t\tdoubleByteBuffer.putDouble(value);\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thasNullDoubleArrayElements = true;\n\t\t\t\t\t\t\tdoubleByteBuffer.putDouble(0);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullDoubleArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} DoubleArray. All such elements will be set to 0.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tif (doubleByteBuffer.hasArray()) {\n\t\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(doubleByteBuffer.array()));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\tcase BooleanArray:\n\t\t\t\t\tBoolean[] booleanArrayValue = (Boolean[]) metric.getValue();\n\t\t\t\t\tint numberOfBytes = (int) Math.ceil((double) booleanArrayValue.length / 8);\n\t\t\t\t\tByteBuffer booleanByteBuffer =\n\t\t\t\t\t\t\tByteBuffer.allocate(4 + numberOfBytes).order(ByteOrder.LITTLE_ENDIAN);\n\n\t\t\t\t\t// The first 4 bytes is the number of booleans in the array\n\t\t\t\t\tbooleanByteBuffer.putInt(booleanArrayValue.length);\n\n\t\t\t\t\t// Get the remaining bytes\n\t\t\t\t\tboolean hasNullBooleanArrayElements = false;\n\t\t\t\t\tfor (int i = 0; i < numberOfBytes; i++) {\n\t\t\t\t\t\tbyte nextByte = 0;\n\t\t\t\t\t\tfor (int bit = 0; bit < 8; bit++) {\n\t\t\t\t\t\t\tint index = i * 8 + bit;\n\t\t\t\t\t\t\tif (index < 
booleanArrayValue.length) {\n\t\t\t\t\t\t\t\tBoolean value = booleanArrayValue[index];\n\t\t\t\t\t\t\t\tif (value == null) {\n\t\t\t\t\t\t\t\t\thasNullBooleanArrayElements = true;\n\t\t\t\t\t\t\t\t\tvalue = Boolean.valueOf(false);\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\tif (value.booleanValue()) {\n\t\t\t\t\t\t\t\t\tnextByte |= (128 >> bit);\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t\tbooleanByteBuffer.put(nextByte);\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullBooleanArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} BooleanArray. All such elements will be set to 'false'.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(booleanByteBuffer.array()));\n\t\t\t\t\tbreak;\n\t\t\t\tcase StringArray:\n\t\t\t\t\tString[] stringArrayValue = (String[]) metric.getValue();\n\n\t\t\t\t\tint size = 0;\n\t\t\t\t\tList<byte[]> bytesArrays = new ArrayList<>();\n\t\t\t\t\tboolean hasNullStringArrayElements = false;\n\t\t\t\t\tfor (String string : stringArrayValue) {\n\t\t\t\t\t\tbyte[] stringBytes = null;\n\t\t\t\t\t\tif (string != null) {\n\t\t\t\t\t\t\tstringBytes = string.getBytes(StandardCharsets.UTF_8);\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thasNullStringArrayElements = true;\n\t\t\t\t\t\t\tstringBytes = new byte[0];\n\t\t\t\t\t\t}\n\t\t\t\t\t\tsize = size + stringBytes.length + 1;\n\t\t\t\t\t\tbytesArrays.add(stringBytes);\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullStringArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} StringArray. 
All such elements will be set to an empty string.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tByteBuffer stringByteBuffer = ByteBuffer.allocate(size).order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\t\tfor (byte[] bytesArray : bytesArrays) {\n\t\t\t\t\t\tstringByteBuffer.put(bytesArray);\n\t\t\t\t\t\tstringByteBuffer.put((byte) 0);\n\t\t\t\t\t}\n\t\t\t\t\tif (stringByteBuffer.hasArray()) {\n\t\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(stringByteBuffer.array()));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\tcase DateTimeArray:\n\t\t\t\t\tDate[] dateTimeArrayValue = (Date[]) metric.getValue();\n\t\t\t\t\tByteBuffer dateTimeByteBuffer =\n\t\t\t\t\t\t\tByteBuffer.allocate(dateTimeArrayValue.length * 8).order(ByteOrder.LITTLE_ENDIAN);\n\t\t\t\t\tboolean hasNullDateTimeArrayElements = false;\n\t\t\t\t\tfor (Date value : dateTimeArrayValue) {\n\t\t\t\t\t\tif (value != null) {\n\t\t\t\t\t\t\tdateTimeByteBuffer.putLong(value.getTime());\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thasNullDateTimeArrayElements = true;\n\t\t\t\t\t\t\tdateTimeByteBuffer.putLong(new Date(0L).getTime());\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\tif (hasNullDateTimeArrayElements) {\n\t\t\t\t\t\tlogger.warn(\n\t\t\t\t\t\t\t\t\"SparkplugB doesn't support 'null' elements in the {} DateTimeArray. 
All such elements will be set to start of epoch.\",\n\t\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\t}\n\t\t\t\t\tif (dateTimeByteBuffer.hasArray()) {\n\t\t\t\t\t\tmetricBuilder.setBytesValue(ByteString.copyFrom(dateTimeByteBuffer.array()));\n\t\t\t\t\t}\n\t\t\t\t\tbreak;\n\t\t\t\tcase Unknown:\n\t\t\t\tdefault:\n\t\t\t\t\tlogger.error(\"Unsupported MetricDataType: {} for the {} metric\", metric.getDataType(),\n\t\t\t\t\t\t\tmetric.getName());\n\t\t\t\t\tthrow new Exception(\"Failed to encode\");\n\n\t\t\t}\n\t\t}\n\t\treturn metricBuilder;\n\t}\n\n\tprivate SparkplugBProto.Payload.Metric.Builder setMetaData(SparkplugBProto.Payload.Metric.Builder metricBuilder,\n\t\t\tMetric metric) throws Exception {\n\t\t// If the builder has been built already - use it\n\t\tSparkplugBProto.Payload.MetaData.Builder metaDataBuilder = metricBuilder.getMetadataBuilder() != null\n\t\t\t\t? metricBuilder.getMetadataBuilder()\n\t\t\t\t: SparkplugBProto.Payload.MetaData.newBuilder();\n\n\t\tMetaData metaData = metric.getMetaData();\n\t\tif (metaData.getContentType() != null) {\n\t\t\tmetaDataBuilder.setContentType(metaData.getContentType());\n\t\t}\n\t\tif (metaData.getSize() != null) {\n\t\t\tmetaDataBuilder.setSize(metaData.getSize());\n\t\t}\n\t\tif (metaData.getSeq() != null) {\n\t\t\tmetaDataBuilder.setSeq(metaData.getSeq());\n\t\t}\n\t\tif (metaData.getFileName() != null) {\n\t\t\tmetaDataBuilder.setFileName(metaData.getFileName());\n\t\t}\n\t\tif (metaData.getFileType() != null) {\n\t\t\tmetaDataBuilder.setFileType(metaData.getFileType());\n\t\t}\n\t\tif (metaData.getMd5() != null) {\n\t\t\tmetaDataBuilder.setMd5(metaData.getMd5());\n\t\t}\n\t\tif (metaData.isMultiPart() != null) {\n\t\t\tmetaDataBuilder.setIsMultiPart(metaData.isMultiPart());\n\t\t}\n\t\tif (metaData.getDescription() != null) {\n\t\t\tmetaDataBuilder.setDescription(metaData.getDescription());\n\t\t}\n\t\tmetricBuilder.setMetadata(metaDataBuilder);\n\n\t\treturn metricBuilder;\n\t}\n\n\tprivate 
SparkplugBProto.Payload.DataSet.DataSetValue.Builder convertDataSetValue(Value<?> value) throws Exception {\n\t\tSparkplugBProto.Payload.DataSet.DataSetValue.Builder protoValueBuilder =\n\t\t\t\tSparkplugBProto.Payload.DataSet.DataSetValue.newBuilder();\n\n\t\t// Set the value\n\t\tDataSetDataType type = value.getType();\n\n\t\tswitch (type) {\n\t\t\tcase Int8:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setIntValue((Byte) value.getValue());\n\t\t\t\tbreak;\n\t\t\tcase Int16:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setIntValue((Short) value.getValue());\n\t\t\t\tbreak;\n\t\t\tcase Int32:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setIntValue((Integer) value.getValue());\n\t\t\t\tbreak;\n\t\t\tcase Int64:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setLongValue((Long) value.getValue());\n\t\t\t\tbreak;\n\t\t\tcase UInt8:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setIntValue(Short.toUnsignedInt((Short) value.getValue()));\n\t\t\t\tbreak;\n\t\t\tcase UInt16:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setIntValue((int) Integer.toUnsignedLong((Integer) value.getValue()));\n\t\t\t\tbreak;\n\t\t\tcase UInt32:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setLongValue(Long.parseUnsignedLong(Long.toUnsignedString((Long) value.getValue())));\n\t\t\t\tbreak;\n\t\t\tcase UInt64:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn 
protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setLongValue(bigIntegerToUnsignedLong((BigInteger) value.getValue()));\n\t\t\t\tbreak;\n\t\t\tcase Float:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setFloatValue((Float) value.getValue());\n\t\t\t\tbreak;\n\t\t\tcase Double:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setDoubleValue((Double) value.getValue());\n\t\t\t\tbreak;\n\t\t\tcase String:\n\t\t\tcase Text:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setStringValue((String) value.getValue());\n\t\t\t\tbreak;\n\t\t\tcase Boolean:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setBooleanValue(toBoolean(value.getValue()));\n\t\t\t\tbreak;\n\t\t\tcase DateTime:\n\t\t\t\tif (value == null || value.getValue() == null) {\n\t\t\t\t\treturn protoValueBuilder;\n\t\t\t\t}\n\t\t\t\tprotoValueBuilder.setLongValue(((Date) value.getValue()).getTime());\n\t\t\t\tbreak;\n\t\t\tdefault:\n\t\t\t\tlogger.error(\"Unknown DataSetDataType DataType: \" + value.getType());\n\t\t\t\tthrow new Exception(\"Failed to convert value \" + value.getType());\n\t\t}\n\n\t\treturn protoValueBuilder;\n\t}\n\n\tprivate Boolean toBoolean(Object value) {\n\t\tif (value == null) {\n\t\t\treturn null;\n\t\t}\n\t\tif (value instanceof Integer) {\n\t\t\treturn ((Integer) value).intValue() == 0 ? Boolean.FALSE : Boolean.TRUE;\n\t\t} else if (value instanceof Long) {\n\t\t\treturn ((Long) value).longValue() == 0 ? Boolean.FALSE : Boolean.TRUE;\n\t\t} else if (value instanceof Float) {\n\t\t\treturn ((Float) value).floatValue() == 0 ? 
Boolean.FALSE : Boolean.TRUE;\n\t\t} else if (value instanceof Double) {\n\t\t\treturn ((Double) value).doubleValue() == 0 ? Boolean.FALSE : Boolean.TRUE;\n\t\t} else if (value instanceof Short) {\n\t\t\treturn ((Short) value).shortValue() == 0 ? Boolean.FALSE : Boolean.TRUE;\n\t\t} else if (value instanceof Byte) {\n\t\t\treturn ((Byte) value).byteValue() == 0 ? Boolean.FALSE : Boolean.TRUE;\n\t\t} else if (value instanceof String) {\n\t\t\treturn Boolean.parseBoolean(value.toString());\n\t\t}\n\t\treturn (Boolean) value;\n\t}\n\n\tprivate long bigIntegerToUnsignedLong(BigInteger bigInteger) {\n\t\tBigInteger bref = BigInteger.ONE.shiftLeft(64);\n\t\tif (bigInteger.compareTo(BigInteger.ZERO) < 0)\n\t\t\tbigInteger = bigInteger.add(bref);\n\t\tif (bigInteger.compareTo(bref) >= 0 || bigInteger.compareTo(BigInteger.ZERO) < 0)\n\t\t\tthrow new RuntimeException(\"Out of range: \" + bigInteger);\n\t\treturn bigInteger.longValue();\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/DataSet.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.util.ArrayList;\nimport java.util.Collection;\nimport java.util.List;\n\nimport org.eclipse.tahu.SparkplugException;\nimport org.eclipse.tahu.json.DataSetDeserializer;\nimport org.eclipse.tahu.message.model.Row.RowBuilder;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\nimport com.fasterxml.jackson.annotation.JsonGetter;\nimport com.fasterxml.jackson.annotation.JsonProperty;\nimport com.fasterxml.jackson.databind.annotation.JsonDeserialize;\n\n/**\n * A data set that represents a table of data.\n */\n@JsonDeserialize(\n\t\tusing = DataSetDeserializer.class)\npublic class DataSet {\n\n\tprivate static final Logger logger = LoggerFactory.getLogger(DataSet.class.getName());\n\n\t/**\n\t * The number of columns\n\t */\n\t@JsonProperty(\"numberOfColumns\")\n\tprivate long numOfColumns;\n\n\t/**\n\t * A list containing the names of each column\n\t */\n\t@JsonProperty(\"columnNames\")\n\tprivate List<String> columnNames;\n\n\t/**\n\t * A list containing the data types of each column\n\t */\n\t@JsonProperty(\"types\")\n\tprivate List<DataSetDataType> types;\n\n\t/**\n\t * A list containing the rows in the data set\n\t */\n\tprivate List<Row> rows;\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic DataSet() {\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param numOfColumns the number of columns in the {@link DataSet}\n\t * @param columnNames a {@link List} of 
column names in the {@link DataSet}\n\t * @param types a {@link List} of {@link DataSetDataType}s for the columns\n\t * @param rows a {@link List} of {@link Row}s in the {@link DataSet}\n\t */\n\tpublic DataSet(long numOfColumns, List<String> columnNames, List<DataSetDataType> types, List<Row> rows) {\n\t\tthis.numOfColumns = numOfColumns;\n\t\tthis.columnNames = columnNames;\n\t\tthis.types = types;\n\t\tthis.rows = rows;\n\t}\n\n\t/**\n\t * Returns the number of columns in the {@link DataSet}\n\t *\n\t * @return the number of columns in the {@link DataSet}\n\t */\n\tpublic long getNumOfColumns() {\n\t\treturn numOfColumns;\n\t}\n\n\t/**\n\t * Sets the number of columns in the {@link DataSet}\n\t *\n\t * @param numOfColumns the number of columns to set in the {@link DataSet}\n\t */\n\tpublic void setNumOfColumns(long numOfColumns) {\n\t\tthis.numOfColumns = numOfColumns;\n\t}\n\n\t/**\n\t * Gets a {@link List} of the column names in the {@link DataSet}\n\t *\n\t * @return a {@link List} of the column names in the {@link DataSet}\n\t */\n\tpublic List<String> getColumnNames() {\n\t\treturn columnNames;\n\t}\n\n\t/**\n\t * Sets a {@link List} of column names in the {@link DataSet}\n\t *\n\t * @param columnNames a {@link List} of column names to set in the {@link DataSet}\n\t */\n\tpublic void setColumnNames(List<String> columnNames) {\n\t\tthis.columnNames = columnNames;\n\t}\n\n\t/**\n\t * Adds a column name to the {@link List} of column names in the {@link DataSet}\n\t *\n\t * @param columnName the column name to add to the {@link List} of column names in the {@link DataSet}\n\t */\n\tpublic void addColumnName(String columnName) {\n\t\tthis.columnNames.add(columnName);\n\t}\n\n\t/**\n\t * Gets a {@link List} of {@link Row}s for the {@link DataSet}\n\t *\n\t * @return a {@link List} of {@link Row}s for the {@link DataSet}\n\t */\n\tpublic List<Row> getRows() {\n\t\treturn rows;\n\t}\n\n\t/**\n\t * Gets a {@link List} of {@link List}s of {@link Object}s 
representing the rows in the {@link DataSet}\n\t *\n\t * @return a {@link List} of {@link List}s of {@link Object}s representing the rows in the {@link DataSet}\n\t */\n\t@JsonGetter(\"rows\")\n\tpublic List<List<Object>> getRowsAsLists() {\n\t\tList<List<Object>> list = new ArrayList<List<Object>>(getRows().size());\n\t\tfor (Row row : getRows()) {\n\t\t\tlist.add(Row.toValues(row));\n\t\t}\n\t\treturn list;\n\t}\n\n\t/**\n\t * Adds a {@link Row} to the {@link DataSet}\n\t *\n\t * @param row a {@link Row} to add to the {@link DataSet}\n\t */\n\tpublic void addRow(Row row) {\n\t\tthis.rows.add(row);\n\t}\n\n\t/**\n\t * Adds a {@link Row} to the {@link DataSet} at a specified index\n\t *\n\t * @param index the index to add the {@link Row} in the {@link List} of {@link Row}s\n\t * @param row the {@link Row} to add\n\t */\n\tpublic void addRow(int index, Row row) {\n\t\tthis.rows.add(index, row);\n\t}\n\n\t/**\n\t * Removes a {@link Row} at a specified index\n\t *\n\t * @param index the index to use when removing the {@link Row}\n\t *\n\t * @return the removed {@link Row}\n\t */\n\tpublic Row removeRow(int index) {\n\t\treturn rows.remove(index);\n\t}\n\n\t/**\n\t * Removes a {@link Row} by equality to another {@link Row}\n\t *\n\t * @param row the {@link Row} to remove\n\t *\n\t * @return true if the {@link Row} was removed, otherwise false\n\t */\n\tpublic boolean removeRow(Row row) {\n\t\treturn rows.remove(row);\n\t}\n\n\t/**\n\t * Sets the {@link List} of {@link Row}s to set for the {@link DataSet}\n\t *\n\t * @param rows the {@link List} of {@link Row}s to set for the {@link DataSet}\n\t */\n\tpublic void setRows(List<Row> rows) {\n\t\tthis.rows = rows;\n\t}\n\n\t/**\n\t * Gets a {@link List} of {@link DataSetDataType}s for the {@link DataSet}\n\t *\n\t * @return a {@link List} of {@link DataSetDataType}s for the {@link DataSet}\n\t */\n\tpublic List<DataSetDataType> getTypes() {\n\t\treturn types;\n\t}\n\n\t/**\n\t * Sets the {@link List} of {@link 
DataSetDataType}s for the {@link DataSet}\n\t *\n\t * @param types the {@link List} of {@link DataSetDataType}s to set for the {@link DataSet}\n\t */\n\tpublic void setTypes(List<DataSetDataType> types) {\n\t\tthis.types = types;\n\t}\n\n\t/**\n\t * Adds a {@link DataSetDataType} to the end of the {@link List} of {@link DataSetDataType}s in the {@link DataSet}\n\t *\n\t * @param type the {@link DataSetDataType} to add to the end of the {@link DataSet}\n\t */\n\tpublic void addType(DataSetDataType type) {\n\t\tthis.types.add(type);\n\t}\n\n\t/**\n\t * Adds a {@link DataSetDataType} at the specified index to the {@link List} of {@link DataSetDataType}s in\n\t * the {@link DataSet}\n\t *\n\t * @param index the index at which to add the new {@link DataSetDataType}\n\t * @param type the {@link DataSetDataType} to add to the {@link DataSet}\n\t */\n\tpublic void addType(int index, DataSetDataType type) {\n\t\tthis.types.add(index, type);\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"DataSet [numOfColumns=\");\n\t\tbuilder.append(numOfColumns);\n\t\tbuilder.append(\", columnNames=\");\n\t\tbuilder.append(columnNames);\n\t\tbuilder.append(\", types=\");\n\t\tbuilder.append(types);\n\t\tbuilder.append(\", rows=\");\n\t\tbuilder.append(rows);\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n\n\t/**\n\t * A builder for creating a {@link DataSet} instance.\n\t */\n\tpublic static class DataSetBuilder {\n\n\t\tprivate long numOfColumns;\n\t\tprivate List<String> columnNames;\n\t\tprivate List<DataSetDataType> types;\n\t\tprivate List<Row> rows;\n\n\t\tpublic DataSetBuilder(long numOfColumns) {\n\t\t\tthis.numOfColumns = numOfColumns;\n\t\t\tthis.columnNames = new ArrayList<String>();\n\t\t\tthis.types = new ArrayList<DataSetDataType>();\n\t\t\tthis.rows = new ArrayList<Row>();\n\t\t}\n\n\t\tpublic DataSetBuilder(DataSet dataSet) {\n\t\t\tthis.numOfColumns = 
dataSet.getNumOfColumns();\n\t\t\tthis.columnNames = new ArrayList<String>(dataSet.getColumnNames());\n\t\t\tthis.types = new ArrayList<DataSetDataType>(dataSet.getTypes());\n\t\t\tthis.rows = new ArrayList<Row>(dataSet.getRows().size());\n\t\t\tfor (Row row : dataSet.getRows()) {\n\t\t\t\trows.add(new RowBuilder(row).createRow());\n\t\t\t}\n\t\t}\n\n\t\tpublic DataSetBuilder addColumnNames(Collection<String> columnNames) {\n\t\t\tthis.columnNames.addAll(columnNames);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic DataSetBuilder addColumnName(String columnName) {\n\t\t\tthis.columnNames.add(columnName);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic DataSetBuilder addType(DataSetDataType type) {\n\t\t\tthis.types.add(type);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic DataSetBuilder addTypes(Collection<DataSetDataType> types) {\n\t\t\tthis.types.addAll(types);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic DataSetBuilder addRow(Row row) {\n\t\t\tthis.rows.add(row);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic DataSetBuilder addRows(Collection<Row> rows) {\n\t\t\tthis.rows.addAll(rows);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic DataSet createDataSet() throws SparkplugException {\n\t\t\tlogger.trace(\"Number of columns: \" + numOfColumns);\n\t\t\tfor (String columnName : columnNames) {\n\t\t\t\tlogger.trace(\"\\tcolumnName: \" + columnName);\n\t\t\t}\n\t\t\tfor (DataSetDataType type : types) {\n\t\t\t\tlogger.trace(\"\\ttypes: \" + type);\n\t\t\t}\n\t\t\tfor (Row row : rows) {\n\t\t\t\tlogger.trace(\"\\t\\trow: \" + row);\n\t\t\t}\n\n\t\t\tvalidate();\n\t\t\treturn new DataSet(numOfColumns, columnNames, types, rows);\n\t\t}\n\n\t\tpublic void validate() throws SparkplugException {\n\t\t\tif (columnNames.size() != numOfColumns) {\n\t\t\t\tthrow new SparkplugException(\"Invalid number of columns in data set column names: \" + columnNames.size()\n\t\t\t\t\t\t+ \" vs expected \" + numOfColumns);\n\t\t\t}\n\t\t\tif (types.size() != numOfColumns) {\n\t\t\t\tthrow new SparkplugException(\"Invalid 
number of columns in data set types: \" + types.size()\n\t\t\t\t\t\t+ \" vs expected: \" + numOfColumns);\n\t\t\t}\n\t\t\tfor (int i = 0; i < types.size(); i++) {\n\t\t\t\tfor (Row row : rows) {\n\t\t\t\t\tList<Value<?>> values = row.getValues();\n\t\t\t\t\tif (values.size() != numOfColumns) {\n\t\t\t\t\t\tthrow new SparkplugException(\"Invalid number of columns in data set row: \" + values.size()\n\t\t\t\t\t\t\t\t+ \" vs expected: \" + numOfColumns);\n\t\t\t\t\t}\n\t\t\t\t\ttypes.get(i).checkType(row.getValues().get(i).getValue());\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/DataSetDataType.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.math.BigInteger;\nimport java.util.Date;\n\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\n\n/**\n * An enumeration of data types of values in a {@link DataSet}\n */\npublic enum DataSetDataType {\n\n\t// Basic Types\n\tInt8(1, Byte.class),\n\tInt16(2, Short.class),\n\tInt32(3, Integer.class),\n\tInt64(4, Long.class),\n\tUInt8(5, Short.class),\n\tUInt16(6, Integer.class),\n\tUInt32(7, Long.class),\n\tUInt64(8, BigInteger.class),\n\tFloat(9, Float.class),\n\tDouble(10, Double.class),\n\tBoolean(11, Boolean.class),\n\tString(12, String.class),\n\tDateTime(13, Date.class),\n\tText(14, String.class),\n\n\t// Unknown\n\tUnknown(0, Object.class);\n\n\tprivate Class<?> clazz = null;\n\tprivate int intValue = 0;\n\n\tprivate DataSetDataType(int intValue, Class<?> clazz) {\n\t\tthis.intValue = intValue;\n\t\tthis.clazz = clazz;\n\t}\n\n\tpublic void checkType(Object value) throws SparkplugInvalidTypeException {\n\t\tif (value != null && !clazz.isAssignableFrom(value.getClass())) {\n\t\t\tthrow new SparkplugInvalidTypeException(value.getClass());\n\t\t}\n\t}\n\n\t/**\n\t * Returns an integer representation of the data type.\n\t * \n\t * @return an integer representation of the data type.\n\t */\n\tpublic int toIntValue() {\n\t\treturn this.intValue;\n\t}\n\n\t/**\n\t * Converts the integer representation of the data type into a {@link DataSetDataType} instance.\n\t * \n\t 
* @param i the integer representation of the data type.\n\t * @return a {@link DataSetDataType} instance.\n\t */\n\tpublic static DataSetDataType fromInteger(int i) {\n\t\tswitch (i) {\n\t\t\tcase 1:\n\t\t\t\treturn Int8;\n\t\t\tcase 2:\n\t\t\t\treturn Int16;\n\t\t\tcase 3:\n\t\t\t\treturn Int32;\n\t\t\tcase 4:\n\t\t\t\treturn Int64;\n\t\t\tcase 5:\n\t\t\t\treturn UInt8;\n\t\t\tcase 6:\n\t\t\t\treturn UInt16;\n\t\t\tcase 7:\n\t\t\t\treturn UInt32;\n\t\t\tcase 8:\n\t\t\t\treturn UInt64;\n\t\t\tcase 9:\n\t\t\t\treturn Float;\n\t\t\tcase 10:\n\t\t\t\treturn Double;\n\t\t\tcase 11:\n\t\t\t\treturn Boolean;\n\t\t\tcase 12:\n\t\t\t\treturn String;\n\t\t\tcase 13:\n\t\t\t\treturn DateTime;\n\t\t\tcase 14:\n\t\t\t\treturn Text;\n\t\t\tdefault:\n\t\t\t\treturn Unknown;\n\t\t}\n\t}\n\n\t/**\n\t * Returns the class type for this DataType\n\t * \n\t * @return the class type for this DataType\n\t */\n\tpublic Class<?> getClazz() {\n\t\treturn clazz;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/DeviceDescriptor.java",
    "content": "/********************************************************************************\n * Copyright (c) 2020-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\npublic class DeviceDescriptor extends EdgeNodeDescriptor {\n\n\tprivate final String deviceId;\n\tprivate final String descriptorString;\n\n\t/**\n\t * Constructor\n\t *\n\t * @param groupId the Sparkplug Group ID associated with this {@link DeviceDescriptor}\n\t * @param edgeNodeId the Sparkplug Edge Node ID associated with this {@link DeviceDescriptor}\n\t * @param deviceId the Sparkplug Device ID associated with this {@link DeviceDescriptor}\n\t */\n\tpublic DeviceDescriptor(String groupId, String edgeNodeId, String deviceId) {\n\t\tsuper(groupId, edgeNodeId);\n\t\tthis.deviceId = deviceId;\n\t\tthis.descriptorString = groupId + \"/\" + edgeNodeId + \"/\" + deviceId;\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param descriptorString a {@link String} representing the Sparkplug Device Descriptor and MUST be of the form\n\t *            group_id/edge_node_id/device_id\n\t */\n\tpublic DeviceDescriptor(String descriptorString) {\n\t\tsuper(descriptorString.substring(0, descriptorString.lastIndexOf(\"/\")));\n\t\tthis.deviceId = descriptorString.substring(descriptorString.lastIndexOf(\"/\") + 1);\n\t\tthis.descriptorString = descriptorString;\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param edgeNodeDescriptor an {@link EdgeNodeDescriptor} that is the parent Edge Node of this Device\n\t *\n\t * @param deviceId the Sparkplug Device ID associated with this {@link DeviceDescriptor}\n\t 
*/\n\tpublic DeviceDescriptor(EdgeNodeDescriptor edgeNodeDescriptor, String deviceId) {\n\t\tsuper(edgeNodeDescriptor.getGroupId(), edgeNodeDescriptor.getEdgeNodeId());\n\t\tthis.deviceId = deviceId;\n\t\tthis.descriptorString = edgeNodeDescriptor.getDescriptorString() + \"/\" + deviceId;\n\t}\n\n\t/**\n\t * Returns true because this is a {@link DeviceDescriptor}\n\t *\n\t * @return true because this is a {@link DeviceDescriptor}\n\t */\n\t@Override\n\tpublic boolean isDeviceDescriptor() {\n\t\treturn true;\n\t}\n\n\t/**\n\t * Gets the Sparkplug Device ID associated with this {@link DeviceDescriptor}\n\t *\n\t * @return the Sparkplug Device ID associated with this {@link DeviceDescriptor}\n\t */\n\t@Override\n\tpublic String getDeviceId() {\n\t\treturn deviceId;\n\t}\n\n\t/**\n\t * Returns a {@link String} representing the Device's Descriptor of the form:\n\t * \"<groupName>/<edgeNodeName>/<deviceId>\".\n\t *\n\t * @return a {@link String} representing the Device's Descriptor.\n\t */\n\t@Override\n\tpublic String getDescriptorString() {\n\t\treturn descriptorString;\n\t}\n\n\t/**\n\t * Returns the {@link EdgeNodeDescriptor} associated with this DeviceDescriptor\n\t *\n\t * @return a {@link EdgeNodeDescriptor} representing the Device's parent Edge Node Descriptor.\n\t */\n\tpublic EdgeNodeDescriptor getEdgeNodeDescriptor() {\n\t\treturn super.getEdgeNodeDescriptor();\n\t}\n\n\t/**\n\t * Returns a {@link String} representing the Device's parent Edge Node Descriptor of the form:\n\t * \"<groupName>/<edgeNodeName>\".\n\t *\n\t * @return a {@link String} representing the Device's parent Edge Node Descriptor.\n\t */\n\tpublic String getEdgeNodeDescriptorString() {\n\t\treturn super.getDescriptorString();\n\t}\n\n\t@Override\n\tpublic int hashCode() {\n\t\treturn this.getDescriptorString().hashCode();\n\t}\n\n\t@Override\n\tpublic boolean equals(Object object) {\n\t\tif (object instanceof DeviceDescriptor) {\n\t\t\treturn 
this.getDescriptorString().equals(((DeviceDescriptor) object).getDescriptorString());\n\t\t}\n\t\treturn this.getDescriptorString().equals(object);\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\treturn getDescriptorString();\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/EdgeNodeDescriptor.java",
    "content": "/********************************************************************************\n * Copyright (c) 2017-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport com.fasterxml.jackson.annotation.JsonValue;\n\n/**\n * An Edge Node Identifier\n */\npublic class EdgeNodeDescriptor implements SparkplugDescriptor {\n\n\tprivate final String groupId;\n\tprivate final String edgeNodeId;\n\tprivate final String descriptorString;\n\n\t/**\n\t * Constructor\n\t *\n\t * @param groupId the Sparkplug Group ID associated with this {@link EdgeNodeDescriptor}\n\t * @param edgeNodeId the Sparkplug Edge Node ID associated with this {@link EdgeNodeDescriptor}\n\t */\n\tpublic EdgeNodeDescriptor(String groupId, String edgeNodeId) {\n\t\tthis.groupId = groupId;\n\t\tthis.edgeNodeId = edgeNodeId;\n\t\tthis.descriptorString = groupId + \"/\" + edgeNodeId;\n\t}\n\n\t/**\n\t * Creates an EdgeNodeDescriptor from a {@link String} of the form group_name/edge_node_name\n\t * \n\t * @param descriptorString the {@link String} representation of an EdgeNodeDescriptor\n\t */\n\tpublic EdgeNodeDescriptor(String descriptorString) {\n\t\tString[] tokens = descriptorString.split(\"/\");\n\t\tthis.groupId = tokens[0];\n\t\tthis.edgeNodeId = tokens[1];\n\t\tthis.descriptorString = descriptorString;\n\t}\n\n\t/**\n\t * Gets the Sparkplug Group ID for this {@link EdgeNodeDescriptor}\n\t *\n\t * @return the Sparkplug Group ID associated with this {@link EdgeNodeDescriptor}\n\t */\n\t@Override\n\tpublic String getGroupId() {\n\t\treturn groupId;\n\t}\n\n\t/**\n\t * Gets the 
Sparkplug Edge Node ID for this {@link EdgeNodeDescriptor}\n\t *\n\t * @return the Sparkplug Edge Node ID associated with this {@link EdgeNodeDescriptor}\n\t */\n\t@Override\n\tpublic String getEdgeNodeId() {\n\t\treturn edgeNodeId;\n\t}\n\n\t/**\n\t * Gets the Sparkplug Device ID for this {@link EdgeNodeDescriptor}. It is always null for an\n\t * {@link EdgeNodeDescriptor}.\n\t *\n\t * @return null\n\t */\n\t@Override\n\tpublic String getDeviceId() {\n\t\treturn null;\n\t}\n\n\t/**\n\t * Always returns false for an {@link EdgeNodeDescriptor}\n\t */\n\t@Override\n\tpublic boolean isDeviceDescriptor() {\n\t\treturn false;\n\t}\n\n\t/**\n\t * Returns the {@link EdgeNodeDescriptor}\n\t *\n\t * @return the {@link EdgeNodeDescriptor}\n\t */\n\tprotected EdgeNodeDescriptor getEdgeNodeDescriptor() {\n\t\treturn new EdgeNodeDescriptor(groupId, edgeNodeId);\n\t}\n\n\t/**\n\t * Returns a {@link String} representing the Edge Node's Descriptor of the form: \"<groupId>/<edgeNodeId>\".\n\t *\n\t * @return a {@link String} representing the Edge Node's Descriptor.\n\t */\n\t@Override\n\tpublic String getDescriptorString() {\n\t\treturn descriptorString;\n\t}\n\n\t@Override\n\tpublic int hashCode() {\n\t\treturn this.getDescriptorString().hashCode();\n\t}\n\n\t@Override\n\tpublic boolean equals(Object object) {\n\t\tif (object instanceof EdgeNodeDescriptor) {\n\t\t\treturn this.getDescriptorString().equals(((EdgeNodeDescriptor) object).getDescriptorString());\n\t\t}\n\t\treturn this.getDescriptorString().equals(object);\n\t}\n\n\t@Override\n\t@JsonValue\n\tpublic String toString() {\n\t\treturn getDescriptorString();\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/File.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.util.Arrays;\n\nimport org.eclipse.tahu.json.FileSerializer;\n\nimport com.fasterxml.jackson.annotation.JsonIgnoreProperties;\nimport com.fasterxml.jackson.databind.annotation.JsonSerialize;\n\n@JsonIgnoreProperties(\n\t\tvalue = { \"fileName\" })\n@JsonSerialize(\n\t\tusing = FileSerializer.class)\npublic class File {\n\n\tprivate String fileName;\n\tprivate byte[] bytes;\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic File() {\n\t\tsuper();\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param fileName the full file name path\n\t * @param bytes the array of bytes that represent the contents of the file\n\t */\n\tpublic File(String fileName, byte[] bytes) {\n\t\tsuper();\n\t\tthis.fileName = fileName == null\n\t\t\t\t? 
null\n\t\t\t\t: fileName.replace(\"/\", System.getProperty(\"file.separator\")).replace(\"\\\\\",\n\t\t\t\t\t\tSystem.getProperty(\"file.separator\"));\n\t\tthis.bytes = Arrays.copyOf(bytes, bytes.length);\n\t}\n\n\t/**\n\t * Gets the full filename path\n\t *\n\t * @return the full filename path\n\t */\n\tpublic String getFileName() {\n\t\treturn fileName;\n\t}\n\n\t/**\n\t * Sets the full filename path\n\t *\n\t * @param fileName the full filename path\n\t */\n\tpublic void setFileName(String fileName) {\n\t\tthis.fileName = fileName;\n\t}\n\n\t/**\n\t * Gets the bytes that represent the contents of the file\n\t *\n\t * @return the bytes that represent the contents of the file\n\t */\n\tpublic byte[] getBytes() {\n\t\treturn bytes;\n\t}\n\n\t/**\n\t * Sets the bytes that represent the contents of the file\n\t *\n\t * @param bytes the bytes that represent the contents of the file\n\t */\n\tpublic void setBytes(byte[] bytes) {\n\t\tthis.bytes = bytes;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"File [fileName=\");\n\t\tbuilder.append(fileName);\n\t\tbuilder.append(\", bytes=\");\n\t\tbuilder.append(Arrays.toString(bytes));\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/Message.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\n/**\n * A class to represent a Message.\n */\npublic class Message {\n\n\tprivate Topic topic;\n\n\tprivate SparkplugBPayload payload;\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic Message() {\n\t}\n\n\t/**\n\t * Copy Constructor\n\t *\n\t * @param message the {@link Message} to copy\n\t */\n\tpublic Message(Message message) {\n\t\tthis.topic = message.getTopic();\n\t\tthis.payload = new SparkplugBPayload(message.getPayload());\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param topic the {@link Topic} associated with the {@link Message}\n\t * @param payload the {@link SparkplugBPayload} associated with the {@link Message}\n\t */\n\tprivate Message(Topic topic, SparkplugBPayload payload) {\n\t\tsuper();\n\t\tthis.topic = topic;\n\t\tthis.payload = payload;\n\t}\n\n\t/**\n\t * Gets the {@link Topic} of this {@link Message}\n\t *\n\t * @return the {@link Topic} of this {@link Message}\n\t */\n\tpublic Topic getTopic() {\n\t\treturn topic;\n\t}\n\n\t/**\n\t * Gets the {@link SparkplugBPayload} of this {@link Message}\n\t *\n\t * @return the {@link SparkplugBPayload} of this {@link Message}\n\t */\n\tpublic SparkplugBPayload getPayload() {\n\t\treturn payload;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"Message [topic=\");\n\t\tbuilder.append(topic);\n\t\tbuilder.append(\", 
payload=\");\n\t\tbuilder.append(payload);\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n\n\t/**\n\t * A builder for creating a {@link Message} instance.\n\t */\n\tpublic static class MessageBuilder {\n\n\t\tprivate Topic topic;\n\n\t\tprivate SparkplugBPayload payload;\n\n\t\tpublic MessageBuilder(Topic topic, SparkplugBPayload payload) {\n\t\t\tthis.topic = topic;\n\t\t\tthis.payload = payload;\n\t\t}\n\n\t\tpublic MessageBuilder() {\n\t\t}\n\n\t\tpublic MessageBuilder topic(Topic topic) {\n\t\t\tthis.topic = topic;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MessageBuilder payload(SparkplugBPayload payload) {\n\t\t\tthis.payload = payload;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic Message build() {\n\t\t\treturn new Message(this.topic, this.payload);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/MessageType.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport org.eclipse.tahu.SparkplugParsingException;\n\n/**\n * An enumeration of Sparkplug MQTT message types. The type provides an indication as to what the MQTT Payload of the\n * message will contain.\n */\npublic enum MessageType {\n\n\t/**\n\t * Birth certificate for MQTT Edge of Network (EoN) Nodes.\n\t */\n\tNBIRTH,\n\n\t/**\n\t * Death certificate for MQTT Edge of Network (EoN) Nodes.\n\t */\n\tNDEATH,\n\n\t/**\n\t * Birth certificate for MQTT Devices.\n\t */\n\tDBIRTH,\n\n\t/**\n\t * Death certificate for MQTT Devices.\n\t */\n\tDDEATH,\n\n\t/**\n\t * Edge of Network (EoN) Node data message.\n\t */\n\tNDATA,\n\n\t/**\n\t * Device data message.\n\t */\n\tDDATA,\n\n\t/**\n\t * Edge of Network (EoN) Node command message.\n\t */\n\tNCMD,\n\n\t/**\n\t * Device command message.\n\t */\n\tDCMD,\n\n\t/**\n\t * Critical application state message.\n\t */\n\tSTATE,\n\n\t/**\n\t * Device record message.\n\t */\n\tDRECORD,\n\n\t/**\n\t * Edge of Network (EoN) Node record message.\n\t */\n\tNRECORD;\n\n\t/**\n\t * Parses a Sparkplug message type which MUST be one of the valid {@link MessageType}s\n\t *\n\t * @param type the {@link String} representing the potential {@link MessageType}\n\t *\n\t * @return the {@link MessageType} that represents the {@link String} type argument\n\t *\n\t * @throws SparkplugParsingException if the incoming {@link String} type does not represent a {@link 
MessageType}\n\t */\n\tpublic static MessageType parseMessageType(String type) throws SparkplugParsingException {\n\t\tfor (MessageType messageType : MessageType.values()) {\n\t\t\tif (messageType.name().equals(type)) {\n\t\t\t\treturn messageType;\n\t\t\t}\n\t\t}\n\t\tthrow new SparkplugParsingException(\"Invalid message type: \" + type);\n\t}\n\n\t/**\n\t * Whether or not this is an NDEATH or DDEATH\n\t *\n\t * @return true if this {@link MessageType} is an NDEATH or DDEATH\n\t */\n\tpublic boolean isDeath() {\n\t\treturn this.equals(DDEATH) || this.equals(NDEATH);\n\t}\n\n\t/**\n\t * Whether or not this is an NCMD or DCMD\n\t *\n\t * @return true if this {@link MessageType} is an NCMD or DCMD\n\t */\n\tpublic boolean isCommand() {\n\t\treturn this.equals(DCMD) || this.equals(NCMD);\n\t}\n\n\t/**\n\t * Whether or not this is an NDATA or DDATA\n\t *\n\t * @return true if this {@link MessageType} is an NDATA or DDATA\n\t */\n\tpublic boolean isData() {\n\t\treturn this.equals(DDATA) || this.equals(NDATA);\n\t}\n\n\t/**\n\t * Whether or not this is an NBIRTH or DBIRTH\n\t *\n\t * @return true if this {@link MessageType} is an NBIRTH or DBIRTH\n\t */\n\tpublic boolean isBirth() {\n\t\treturn this.equals(DBIRTH) || this.equals(NBIRTH);\n\t}\n\n\t/**\n\t * Whether or not this is an NRECORD or DRECORD\n\t *\n\t * @return true if this {@link MessageType} is an NRECORD or DRECORD\n\t */\n\tpublic boolean isRecord() {\n\t\treturn this.equals(DRECORD) || this.equals(NRECORD);\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/MetaData.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.util.Objects;\n\nimport com.fasterxml.jackson.annotation.JsonInclude;\nimport com.fasterxml.jackson.annotation.JsonInclude.Include;\n\n/**\n * A class to represent the meta data associated with a metric.\n */\n@JsonInclude(Include.NON_NULL)\npublic class MetaData {\n\n\t/**\n\t * Indicates if the metric represents one of multiple parts.\n\t */\n\tprivate Boolean isMultiPart;\n\n\t/**\n\t * A content type associated with the metric.\n\t */\n\tprivate String contentType;\n\n\t/**\n\t * A size associated with the metric.\n\t */\n\tprivate Long size;\n\n\t/**\n\t * A sequence associated with the metric.\n\t */\n\tprivate Long seq;\n\n\t/**\n\t * A file name associated with the metric.\n\t */\n\tprivate String fileName;\n\n\t/**\n\t * A file type associated with the metric.\n\t */\n\tprivate String fileType;\n\n\t/**\n\t * A MD5 sum associated with the metric.\n\t */\n\tprivate String md5;\n\n\t/**\n\t * A description associated with the metric.\n\t */\n\tprivate String description;\n\n\t/**\n\t * Default no-arg constructor.\n\t */\n\tpublic MetaData() {\n\t}\n\n\t/**\n\t * Constructor with fields.\n\t * \n\t * @param isMultiPart if the metric represents one of multiple parts.\n\t * @param contentType a content type associated with the metric.\n\t * @param size a size associated with the metric.\n\t * @param seq a sequence associated with the metric.\n\t * @param fileName a 
file name associated with the metric.\n\t * @param fileType a file type associated with the metric.\n\t * @param md5 an MD5 sum associated with the metric.\n\t * @param description a description associated with the metric.\n\t */\n\tpublic MetaData(Boolean isMultiPart, String contentType, Long size, Long seq, String fileName, String fileType,\n\t\t\tString md5, String description) {\n\t\tthis.isMultiPart = isMultiPart;\n\t\tthis.contentType = contentType;\n\t\tthis.size = size;\n\t\tthis.seq = seq;\n\t\tthis.fileName = fileName;\n\t\tthis.fileType = fileType;\n\t\tthis.md5 = md5;\n\t\tthis.description = description;\n\t}\n\n\t/**\n\t * Copy Constructor\n\t *\n\t * @param metaData the {@link MetaData} to copy\n\t */\n\tpublic MetaData(MetaData metaData) {\n\t\tthis(metaData.isMultiPart(), metaData.getContentType(), metaData.getSize(), metaData.getSeq(),\n\t\t\t\tmetaData.getFileName(), metaData.getFileType(), metaData.getMd5(), metaData.getDescription());\n\t}\n\n\t/**\n\t * Whether or not this is a multi-part {@link MetaData}\n\t *\n\t * @return true if this is multi-part {@link MetaData}, otherwise false\n\t */\n\tpublic Boolean isMultiPart() {\n\t\treturn isMultiPart;\n\t}\n\n\t/**\n\t * Sets whether or not this is multi-part {@link MetaData}\n\t *\n\t * @param isMultiPart whether or not this is multi-part {@link MetaData}\n\t *\n\t * @return the {@link MetaData} that was just modified\n\t */\n\tpublic MetaData setMultiPart(Boolean isMultiPart) {\n\t\tthis.isMultiPart = isMultiPart;\n\t\treturn this;\n\t}\n\n\t/**\n\t * Gets the ContentType of this {@link MetaData}\n\t *\n\t * @return the ContentType of this {@link MetaData}\n\t */\n\tpublic String getContentType() {\n\t\treturn contentType;\n\t}\n\n\t/**\n\t * Sets the ContentType of this {@link MetaData}\n\t *\n\t * @param contentType the ContentType of this {@link MetaData}\n\t *\n\t * @return the {@link MetaData} that was just modified\n\t */\n\tpublic MetaData setContentType(String contentType) 
{\n\t\tthis.contentType = contentType;\n\t\treturn this;\n\t}\n\n\t/**\n\t * Gets the size of this {@link MetaData}\n\t *\n\t * @return the size of this {@link MetaData}\n\t */\n\tpublic Long getSize() {\n\t\treturn size;\n\t}\n\n\t/**\n\t * Sets the size of this {@link MetaData}\n\t *\n\t * @param size the size of this {@link MetaData}\n\t *\n\t * @return the {@link MetaData} that was just modified\n\t */\n\tpublic MetaData setSize(Long size) {\n\t\tthis.size = size;\n\t\treturn this;\n\t}\n\n\t/**\n\t * Gets the sequence number of this {@link MetaData}\n\t *\n\t * @return the sequence number of this {@link MetaData}\n\t */\n\tpublic Long getSeq() {\n\t\treturn seq;\n\t}\n\n\t/**\n\t * Sets the sequence number of this {@link MetaData}\n\t *\n\t * @param seq the sequence number of this {@link MetaData}\n\t *\n\t * @return the {@link MetaData} that was just modified\n\t */\n\tpublic MetaData setSeq(Long seq) {\n\t\tthis.seq = seq;\n\t\treturn this;\n\t}\n\n\t/**\n\t * Gets the filename of this {@link MetaData}\n\t *\n\t * @return the filename of this {@link MetaData}\n\t */\n\tpublic String getFileName() {\n\t\treturn fileName;\n\t}\n\n\t/**\n\t * Sets the filename of this {@link MetaData}\n\t *\n\t * @param fileName the filename of this {@link MetaData}\n\t *\n\t * @return the {@link MetaData} that was just modified\n\t */\n\tpublic MetaData setFileName(String fileName) {\n\t\tthis.fileName = fileName;\n\t\treturn this;\n\t}\n\n\t/**\n\t * Gets the file type of this {@link MetaData}\n\t *\n\t * @return the file type of this {@link MetaData}\n\t */\n\tpublic String getFileType() {\n\t\treturn fileType;\n\t}\n\n\t/**\n\t * Sets the file type of this {@link MetaData}\n\t *\n\t * @param fileType the file type of this {@link MetaData}\n\t *\n\t * @return the {@link MetaData} that was just modified\n\t */\n\tpublic MetaData setFileType(String fileType) {\n\t\tthis.fileType = fileType;\n\t\treturn this;\n\t}\n\n\t/**\n\t * Gets the MD5 sum of this {@link MetaData}\n\t 
*\n\t * @return the MD5 sum of this {@link MetaData}\n\t */\n\tpublic String getMd5() {\n\t\treturn md5;\n\t}\n\n\t/**\n\t * Sets the MD5 sum of this {@link MetaData}\n\t *\n\t * @param md5 the MD5 sum of this {@link MetaData}\n\t *\n\t * @return the {@link MetaData} that was just modified\n\t */\n\tpublic MetaData setMd5(String md5) {\n\t\tthis.md5 = md5;\n\t\treturn this;\n\t}\n\n\t/**\n\t * Gets the description of this {@link MetaData}\n\t *\n\t * @return the description of this {@link MetaData}\n\t */\n\tpublic String getDescription() {\n\t\treturn description;\n\t}\n\n\t/**\n\t * Sets the description of this {@link MetaData}\n\t *\n\t * @param description the description of this {@link MetaData}\n\t *\n\t * @return the {@link MetaData} that was just modified\n\t */\n\tpublic MetaData setDescription(String description) {\n\t\tthis.description = description;\n\t\treturn this;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"MetaData [isMultiPart=\");\n\t\tbuilder.append(isMultiPart);\n\t\tbuilder.append(\", contentType=\");\n\t\tbuilder.append(contentType);\n\t\tbuilder.append(\", size=\");\n\t\tbuilder.append(size);\n\t\tbuilder.append(\", seq=\");\n\t\tbuilder.append(seq);\n\t\tbuilder.append(\", fileName=\");\n\t\tbuilder.append(fileName);\n\t\tbuilder.append(\", fileType=\");\n\t\tbuilder.append(fileType);\n\t\tbuilder.append(\", md5=\");\n\t\tbuilder.append(md5);\n\t\tbuilder.append(\", description=\");\n\t\tbuilder.append(description);\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n\n\t@Override\n\tpublic boolean equals(Object object) {\n\t\tif (this == object) {\n\t\t\treturn true;\n\t\t}\n\t\tif (object == null || this.getClass() != object.getClass()) {\n\t\t\treturn false;\n\t\t}\n\t\tMetaData meta = (MetaData) object;\n\t\treturn Objects.equals(isMultiPart, meta.isMultiPart()) && Objects.equals(contentType, meta.getContentType())\n\t\t\t\t&& Objects.equals(size, 
meta.getSize()) && Objects.equals(seq, meta.getSeq())\n\t\t\t\t&& Objects.equals(fileName, meta.getFileName()) && Objects.equals(fileType, meta.getFileType())\n\t\t\t\t&& Objects.equals(md5, meta.getMd5()) && Objects.equals(description, meta.getDescription());\n\t}\n\n\t/**\n\t * A Builder for a MetaData instance.\n\t */\n\tpublic static class MetaDataBuilder {\n\n\t\tprivate Boolean isMultiPart;\n\t\tprivate String contentType;\n\t\tprivate Long size;\n\t\tprivate Long seq;\n\t\tprivate String fileName;\n\t\tprivate String fileType;\n\t\tprivate String md5;\n\t\tprivate String description;\n\n\t\tpublic MetaDataBuilder() {\n\t\t};\n\n\t\tpublic MetaDataBuilder(MetaData metaData) {\n\t\t\tthis.isMultiPart = metaData.isMultiPart();\n\t\t\tthis.contentType = metaData.getContentType();\n\t\t\tthis.size = metaData.getSize();\n\t\t\tthis.seq = metaData.getSeq();\n\t\t\tthis.fileName = metaData.getFileName();\n\t\t\tthis.fileType = metaData.getFileType();\n\t\t\tthis.md5 = metaData.getMd5();\n\t\t\tthis.description = metaData.getDescription();\n\t\t}\n\n\t\tpublic MetaDataBuilder multiPart(Boolean isMultiPart) {\n\t\t\tthis.isMultiPart = isMultiPart;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetaDataBuilder contentType(String contentType) {\n\t\t\tthis.contentType = contentType;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetaDataBuilder size(Long size) {\n\t\t\tthis.size = size;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetaDataBuilder seq(Long seq) {\n\t\t\tthis.seq = seq;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetaDataBuilder fileName(String fileName) {\n\t\t\tthis.fileName = fileName;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetaDataBuilder fileType(String fileType) {\n\t\t\tthis.fileType = fileType;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetaDataBuilder md5(String md5) {\n\t\t\tthis.md5 = md5;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetaDataBuilder description(String description) {\n\t\t\tthis.description = description;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic 
MetaData createMetaData() {\n\t\t\treturn new MetaData(isMultiPart, contentType, size, seq, fileName, fileType, md5, description);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/Metric.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.math.BigInteger;\nimport java.util.Arrays;\nimport java.util.Date;\n\nimport org.eclipse.tahu.SparkplugException;\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\nimport org.eclipse.tahu.message.model.DataSet.DataSetBuilder;\nimport org.eclipse.tahu.message.model.MetaData.MetaDataBuilder;\nimport org.eclipse.tahu.message.model.PropertySet.PropertySetBuilder;\nimport org.eclipse.tahu.message.model.Template.TemplateBuilder;\nimport org.eclipse.tahu.protobuf.SparkplugBProto.DataType;\n\nimport com.fasterxml.jackson.annotation.JsonGetter;\nimport com.fasterxml.jackson.annotation.JsonIgnore;\nimport com.fasterxml.jackson.annotation.JsonIgnoreProperties;\nimport com.fasterxml.jackson.annotation.JsonInclude;\nimport com.fasterxml.jackson.annotation.JsonInclude.Include;\nimport com.fasterxml.jackson.annotation.JsonProperty;\nimport com.fasterxml.jackson.annotation.JsonSetter;\n\n/**\n * A metric of a Sparkplug Payload.\n */\n@JsonIgnoreProperties(\n\t\tvalue = { \"isNull\" })\n@JsonInclude(Include.NON_NULL)\npublic class Metric {\n\n\t@JsonProperty(\"name\")\n\tprivate String name;\n\n\t@JsonProperty(\"alias\")\n\tprivate Long alias;\n\n\t@JsonProperty(\"timestamp\")\n\tprivate Date timestamp;\n\n\t@JsonProperty(\"dataType\")\n\tprivate MetricDataType dataType;\n\n\t@JsonProperty(\"isHistorical\")\n\tprivate Boolean 
isHistorical;\n\n\t@JsonProperty(\"isTransient\")\n\tprivate Boolean isTransient;\n\n\t@JsonProperty(\"metaData\")\n\tprivate MetaData metaData;\n\n\t@JsonProperty(\"properties\")\n\t@JsonInclude(Include.NON_EMPTY)\n\tprivate PropertySet properties;\n\n\t@JsonProperty(\"value\")\n\t@JsonInclude(Include.NON_EMPTY)\n\tprivate Object value;\n\n\tprivate Boolean isNull = null;\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic Metric() {\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param name the name of the {@link Metric}\n\t * @param alias the alias of the {@link Metric}\n\t * @param timestamp the timestamp of the {@link Metric} representing the time at which the {@link Metric} changed in\n\t *            UTC time\n\t * @param dataType the {@link MetricDataType} of the {@link Metric}\n\t * @param isHistorical whether or not this {@link Metric} is a historical value\n\t * @param isTransient whether or not this {@link Metric} is a transient value\n\t * @param metaData the {@link MetaData} associated with this {@link Metric}\n\t * @param properties the {@link PropertySet} associated with this {@link Metric}\n\t * @param value the {@link Object} value of this {@link Metric} that must be an {@link Object} type for the\n\t *            {@link MetricDataType}\n\t *\n\t * @throws SparkplugInvalidTypeException if the value is not a valid {@link Object} type for the supplied\n\t *             {@link MetricDataType}\n\t */\n\tpublic Metric(String name, Long alias, Date timestamp, MetricDataType dataType, Boolean isHistorical,\n\t\t\tBoolean isTransient, MetaData metaData, PropertySet properties, Object value)\n\t\t\tthrows SparkplugInvalidTypeException {\n\t\tsuper();\n\t\tthis.name = name;\n\t\tthis.alias = alias;\n\t\tthis.timestamp = timestamp;\n\t\tthis.dataType = dataType;\n\t\tthis.isHistorical = isHistorical;\n\t\tthis.isTransient = isTransient;\n\t\tisNull = (value == null) ? 
true : false;\n\t\tthis.metaData = metaData;\n\t\tthis.properties = properties;\n\t\tthis.value = value;\n\t\tthis.dataType.checkType(value);\n\t}\n\n\t/**\n\t * Copy Constructor\n\t *\n\t * @param metric the {@link Metric} to copy\n\t * @throws SparkplugInvalidTypeException if the {@link Metric} can not be copied due to an invalid {@link DataType}\n\t */\n\tpublic Metric(Metric metric) throws SparkplugInvalidTypeException {\n\t\tthis(metric.getName(), metric.getAlias(), metric.getTimestamp(), metric.getDataType(), metric.getIsHistorical(),\n\t\t\t\tmetric.getIsTransient(), metric.getMetaData() != null ? new MetaData(metric.getMetaData()) : null,\n\t\t\t\tmetric.getProperties() != null ? new PropertySet(metric.getProperties()) : null, metric.getValue());\n\t}\n\n\t/**\n\t * Gets the name of the {@link Metric}\n\t *\n\t * @return the name of the {@link Metric}\n\t */\n\tpublic String getName() {\n\t\treturn name;\n\t}\n\n\t/**\n\t * Sets the name of the {@link Metric}\n\t *\n\t * @param name the name of the {@link Metric}\n\t */\n\tpublic void setName(String name) {\n\t\tthis.name = name;\n\t}\n\n\t/**\n\t * Whether or not this {@link Metric} has a name\n\t *\n\t * @return true if the name is not null, otherwise false\n\t */\n\tpublic boolean hasName() {\n\t\treturn !(name == null);\n\t}\n\n\t/**\n\t * Whether or not this {@link Metric} has an alias\n\t *\n\t * @return true if the {@link Metric} has an alias, otherwise false\n\t */\n\tpublic boolean hasAlias() {\n\t\treturn !(alias == null);\n\t}\n\n\t/**\n\t * Gets the alias associated with the {@link Metric}\n\t *\n\t * @return the alias associated with the {@link Metric}\n\t */\n\tpublic Long getAlias() {\n\t\treturn alias;\n\t}\n\n\t/**\n\t * Sets the alias for the {@link Metric}\n\t *\n\t * @param alias the alias to set for the {@link Metric}\n\t */\n\tpublic void setAlias(long alias) {\n\t\tthis.alias = alias;\n\t}\n\n\t/**\n\t * Gets the timestamp associated with the {@link Metric}\n\t *\n\t * @return the 
timestamp associated with the {@link Metric}\n\t */\n\tpublic Date getTimestamp() {\n\t\treturn timestamp;\n\t}\n\n\t/**\n\t * Sets the timestamp associated with the {@link Metric}\n\t *\n\t * @param timestamp the timestamp associated with the {@link Metric}\n\t */\n\tpublic void setTimestamp(Date timestamp) {\n\t\tthis.timestamp = timestamp;\n\t}\n\n\t/**\n\t * Gets the {@link MetricDataType} associated with the {@link Metric}\n\t *\n\t * @return the {@link MetricDataType} associated with the {@link Metric}\n\t */\n\tpublic MetricDataType getDataType() {\n\t\treturn dataType;\n\t}\n\n\t/**\n\t * Sets the {@link MetricDataType} associated with the {@link Metric}\n\t *\n\t * @param dataType the {@link MetricDataType} associated with the {@link Metric}\n\t */\n\tpublic void setDataType(MetricDataType dataType) {\n\t\tthis.dataType = dataType;\n\t}\n\n\t/**\n\t * Gets the {@link MetaData} associated with the {@link Metric}\n\t *\n\t * @return the {@link MetaData} associated with the {@link Metric}\n\t */\n\t@JsonGetter(\"metaData\")\n\tpublic MetaData getMetaData() {\n\t\treturn metaData;\n\t}\n\n\t/**\n\t * Sets the {@link MetaData} associated with the {@link Metric}\n\t *\n\t * @param metaData the {@link MetaData} associated with the {@link Metric}\n\t */\n\t@JsonSetter(\"metaData\")\n\tpublic void setMetaData(MetaData metaData) {\n\t\tthis.metaData = metaData;\n\t}\n\n\t/**\n\t * Gets the {@link Object} value associated with the {@link Metric}\n\t *\n\t * @return the {@link Object} value associated with the {@link Metric}\n\t */\n\tpublic Object getValue() {\n\t\treturn value;\n\t}\n\n\t/**\n\t * Sets the {@link Object} value associated with the {@link Metric}\n\t *\n\t * @param value the {@link Object} value associated with the {@link Metric}\n\t */\n\tpublic void setValue(Object value) {\n\t\tthis.value = value;\n\t\tisNull = (value == null);\n\t}\n\n\t/**\n\t * Gets the {@link PropertySet} associated with the {@link Metric}\n\t *\n\t * @return the {@link 
PropertySet} associated with the {@link Metric}\n\t */\n\tpublic PropertySet getProperties() {\n\t\treturn this.properties;\n\t}\n\n\t/**\n\t * Sets the {@link PropertySet} associated with the {@link Metric}\n\t *\n\t * @param properties the {@link PropertySet} associated with the {@link Metric}\n\t */\n\tpublic void setProperties(PropertySet properties) {\n\t\tthis.properties = properties;\n\t}\n\n\t/**\n\t * Whether or not this {@link Metric} is historical\n\t *\n\t * @return true if this is a historical {@link Metric}, otherwise false\n\t */\n\t@JsonIgnore\n\tpublic Boolean isHistorical() {\n\t\treturn isHistorical == null ? false : isHistorical;\n\t}\n\n\t/**\n\t * Whether or not this {@link Metric} is historical\n\t *\n\t * @return true if this is a historical {@link Metric}, otherwise false\n\t */\n\t@JsonGetter(\"isHistorical\")\n\tpublic Boolean getIsHistorical() {\n\t\treturn isHistorical;\n\t}\n\n\t/**\n\t * Sets the historical flag for this {@link Metric}\n\t *\n\t * @param isHistorical true if this is a historical {@link Metric}, otherwise false\n\t */\n\t@JsonSetter(\"isHistorical\")\n\tpublic void setHistorical(Boolean isHistorical) {\n\t\tthis.isHistorical = isHistorical;\n\t}\n\n\t/**\n\t * Whether or not this {@link Metric} is transient\n\t *\n\t * @return true if this is a transient {@link Metric}, otherwise false\n\t */\n\t@JsonIgnore\n\tpublic Boolean isTransient() {\n\t\treturn isTransient == null ? 
false : isTransient;\n\t}\n\n\t/**\n\t * Whether or not this {@link Metric} is transient\n\t *\n\t * @return true if this is a transient {@link Metric}, otherwise false\n\t */\n\t@JsonGetter(\"isTransient\")\n\tpublic Boolean getIsTransient() {\n\t\treturn isTransient;\n\t}\n\n\t/**\n\t * Sets the transient flag for this {@link Metric}\n\t *\n\t * @param isTransient true if this is a transient {@link Metric}, otherwise false\n\t */\n\t@JsonSetter(\"isTransient\")\n\tpublic void setTransient(Boolean isTransient) {\n\t\tthis.isTransient = isTransient;\n\t}\n\n\t/**\n\t * Return true if this value is null, otherwise false\n\t *\n\t * @return true if this value is null, otherwise false\n\t */\n\t@JsonIgnore\n\tpublic Boolean isNull() {\n\t\treturn isNull == null ? false : isNull;\n\t}\n\n\t/**\n\t * Return true if this value is null, otherwise false\n\t *\n\t * @return true if this value is null, otherwise false\n\t */\n\t@JsonIgnore\n\tpublic Boolean getIsNull() {\n\t\treturn isNull;\n\t}\n\n\t@Override\n\tpublic int hashCode() {\n\t\tfinal int prime = 31;\n\t\tint result = 1;\n\t\tresult = prime * result + ((alias == null) ? 0 : alias.hashCode());\n\t\tresult = prime * result + ((dataType == null) ? 0 : dataType.hashCode());\n\t\tresult = prime * result + ((isHistorical == null) ? 0 : isHistorical.hashCode());\n\t\tresult = prime * result + ((isNull == null) ? 0 : isNull.hashCode());\n\t\tresult = prime * result + ((isTransient == null) ? 0 : isTransient.hashCode());\n\t\tresult = prime * result + ((metaData == null) ? 0 : metaData.hashCode());\n\t\tresult = prime * result + ((name == null) ? 0 : name.hashCode());\n\t\tresult = prime * result + ((properties == null) ? 0 : properties.hashCode());\n\t\tresult = prime * result + ((timestamp == null) ? 0 : timestamp.hashCode());\n\t\tresult = prime * result + ((value == null) ? 
0 : value.hashCode());\n\t\treturn result;\n\t}\n\n\t@Override\n\tpublic boolean equals(Object obj) {\n\t\tif (this == obj)\n\t\t\treturn true;\n\t\tif (obj == null)\n\t\t\treturn false;\n\t\tif (getClass() != obj.getClass())\n\t\t\treturn false;\n\t\tMetric other = (Metric) obj;\n\t\tif (alias == null) {\n\t\t\tif (other.alias != null)\n\t\t\t\treturn false;\n\t\t} else if (!alias.equals(other.alias))\n\t\t\treturn false;\n\t\tif (dataType != other.dataType)\n\t\t\treturn false;\n\t\tif (isHistorical == null) {\n\t\t\tif (other.isHistorical != null)\n\t\t\t\treturn false;\n\t\t} else if (!isHistorical.equals(other.isHistorical))\n\t\t\treturn false;\n\t\tif (isNull == null) {\n\t\t\tif (other.isNull != null)\n\t\t\t\treturn false;\n\t\t} else if (!isNull.equals(other.isNull))\n\t\t\treturn false;\n\t\tif (isTransient == null) {\n\t\t\tif (other.isTransient != null)\n\t\t\t\treturn false;\n\t\t} else if (!isTransient.equals(other.isTransient))\n\t\t\treturn false;\n\t\tif (metaData == null) {\n\t\t\tif (other.metaData != null)\n\t\t\t\treturn false;\n\t\t} else if (!metaData.equals(other.metaData))\n\t\t\treturn false;\n\t\tif (name == null) {\n\t\t\tif (other.name != null)\n\t\t\t\treturn false;\n\t\t} else if (!name.equals(other.name))\n\t\t\treturn false;\n\t\tif (properties == null) {\n\t\t\tif (other.properties != null)\n\t\t\t\treturn false;\n\t\t} else if (!properties.equals(other.properties))\n\t\t\treturn false;\n\t\tif (timestamp == null) {\n\t\t\tif (other.timestamp != null)\n\t\t\t\treturn false;\n\t\t} else if (!timestamp.equals(other.timestamp))\n\t\t\treturn false;\n\t\tif (value == null) {\n\t\t\tif (other.value != null)\n\t\t\t\treturn false;\n\t\t} else if (!value.equals(other.value))\n\t\t\treturn false;\n\t\treturn true;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"Metric [name=\");\n\t\tbuilder.append(name);\n\t\tbuilder.append(\", 
alias=\");\n\t\tbuilder.append(alias);\n\t\tbuilder.append(\", timestamp=\");\n\t\tbuilder.append(timestamp != null ? timestamp.getTime() : \"null\");\n\t\tbuilder.append(\", dataType=\");\n\t\tbuilder.append(dataType);\n\t\tbuilder.append(\", isHistorical=\");\n\t\tbuilder.append(isHistorical);\n\t\tbuilder.append(\", isTransient=\");\n\t\tbuilder.append(isTransient);\n\t\tbuilder.append(\", metaData=\");\n\t\tbuilder.append(metaData);\n\t\tbuilder.append(\", properties=\");\n\t\tbuilder.append(properties);\n\t\tbuilder.append(\", value=\");\n\t\tif (dataType == MetricDataType.BooleanArray) {\n\t\t\tbuilder.append(Arrays.toString((Boolean[]) value));\n\t\t} else if (dataType == MetricDataType.DateTimeArray) {\n\t\t\tbuilder.append(Arrays.toString((Date[]) value));\n\t\t} else if (dataType == MetricDataType.DoubleArray) {\n\t\t\tbuilder.append(Arrays.toString((Double[]) value));\n\t\t} else if (dataType == MetricDataType.FloatArray) {\n\t\t\tbuilder.append(Arrays.toString((Float[]) value));\n\t\t} else if (dataType == MetricDataType.Int8Array) {\n\t\t\tbuilder.append(Arrays.toString((Byte[]) value));\n\t\t} else if (dataType == MetricDataType.Int16Array) {\n\t\t\tbuilder.append(Arrays.toString((Short[]) value));\n\t\t} else if (dataType == MetricDataType.Int32Array) {\n\t\t\tbuilder.append(Arrays.toString((Integer[]) value));\n\t\t} else if (dataType == MetricDataType.Int64Array) {\n\t\t\tbuilder.append(Arrays.toString((Long[]) value));\n\t\t} else if (dataType == MetricDataType.StringArray) {\n\t\t\tbuilder.append(Arrays.toString((String[]) value));\n\t\t} else if (dataType == MetricDataType.UInt8Array) {\n\t\t\tbuilder.append(Arrays.toString((Short[]) value));\n\t\t} else if (dataType == MetricDataType.UInt16Array) {\n\t\t\tbuilder.append(Arrays.toString((Integer[]) value));\n\t\t} else if (dataType == MetricDataType.UInt32Array) {\n\t\t\tbuilder.append(Arrays.toString((Long[]) value));\n\t\t} else if (dataType == MetricDataType.UInt64Array) 
{\n\t\t\tbuilder.append(Arrays.toString((BigInteger[]) value));\n\t\t} else {\n\t\t\tbuilder.append(value);\n\t\t}\n\t\tbuilder.append(\", isNull=\");\n\t\tbuilder.append(isNull);\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n\n\t/**\n\t * A builder for creating a {@link Metric} instance.\n\t */\n\tpublic static class MetricBuilder {\n\n\t\tprivate String name;\n\t\tprivate Long alias;\n\t\tprivate Date timestamp;\n\t\tprivate MetricDataType dataType;\n\t\tprivate Boolean isHistorical;\n\t\tprivate Boolean isTransient;\n\t\tprivate MetaData metaData = null;\n\t\tprivate PropertySet properties = null;\n\t\tprivate Object value;\n\n\t\tpublic MetricBuilder(String name, MetricDataType dataType, Object value) {\n\t\t\tthis.name = name;\n\t\t\tthis.timestamp = new Date();\n\t\t\tthis.dataType = dataType;\n\t\t\tthis.value = value;\n\t\t}\n\n\t\tpublic MetricBuilder(Long alias, MetricDataType dataType, Object value) {\n\t\t\tthis.alias = alias;\n\t\t\tthis.timestamp = new Date();\n\t\t\tthis.dataType = dataType;\n\t\t\tthis.value = value;\n\t\t}\n\n\t\tpublic MetricBuilder(Metric metric) throws SparkplugException {\n\t\t\tthis.name = metric.getName();\n\t\t\tthis.alias = metric.getAlias();\n\t\t\tthis.timestamp = metric.getTimestamp();\n\t\t\tthis.dataType = metric.getDataType();\n\t\t\tthis.isHistorical = metric.isHistorical();\n\t\t\tthis.isTransient = metric.isTransient();\n\t\t\tthis.metaData =\n\t\t\t\t\tmetric.getMetaData() != null ? new MetaDataBuilder(metric.getMetaData()).createMetaData() : null;\n\t\t\tthis.properties = metric.getProperties() != null\n\t\t\t\t\t? new PropertySetBuilder(metric.getProperties()).createPropertySet()\n\t\t\t\t\t: null;\n\t\t\tswitch (dataType) {\n\t\t\t\tcase DataSet:\n\t\t\t\t\tthis.value = metric.getValue() != null\n\t\t\t\t\t\t\t? 
new DataSetBuilder((DataSet) metric.getValue()).createDataSet()\n\t\t\t\t\t\t\t: null;\n\t\t\t\t\tbreak;\n\t\t\t\tcase Template:\n\t\t\t\t\tthis.value = metric.getValue() != null\n\t\t\t\t\t\t\t? new TemplateBuilder((Template) metric.getValue()).createTemplate()\n\t\t\t\t\t\t\t: null;\n\t\t\t\t\tbreak;\n\t\t\t\tdefault:\n\t\t\t\t\tthis.value = metric.getValue();\n\t\t\t}\n\t\t}\n\n\t\tpublic MetricBuilder name(String name) {\n\t\t\tthis.name = name;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetricBuilder alias(Long alias) {\n\t\t\tthis.alias = alias;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetricBuilder timestamp(Date timestamp) {\n\t\t\tthis.timestamp = timestamp;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetricBuilder dataType(MetricDataType dataType) {\n\t\t\tthis.dataType = dataType;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetricBuilder isHistorical(Boolean isHistorical) {\n\t\t\tthis.isHistorical = isHistorical;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetricBuilder isTransient(Boolean isTransient) {\n\t\t\tthis.isTransient = isTransient;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetricBuilder metaData(MetaData metaData) {\n\t\t\tthis.metaData = metaData;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetricBuilder properties(PropertySet properties) {\n\t\t\tthis.properties = properties;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic MetricBuilder value(Object value) {\n\t\t\tthis.value = value;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic Metric createMetric() throws SparkplugInvalidTypeException {\n\t\t\treturn new Metric(name, alias, timestamp, dataType, isHistorical, isTransient, metaData, properties, value);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/MetricDataType.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.math.BigInteger;\nimport java.util.Date;\n\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\n/**\n * An enumeration of data types associated with the value of a {@link Metric}\n */\npublic enum MetricDataType {\n\n\t// Basic Types\n\tInt8(1, Byte.class),\n\tInt16(2, Short.class),\n\tInt32(3, Integer.class),\n\tInt64(4, Long.class),\n\tUInt8(5, Short.class),\n\tUInt16(6, Integer.class),\n\tUInt32(7, Long.class),\n\tUInt64(8, BigInteger.class),\n\tFloat(9, Float.class),\n\tDouble(10, Double.class),\n\tBoolean(11, Boolean.class),\n\tString(12, String.class),\n\tDateTime(13, Date.class),\n\tText(14, String.class),\n\n\t// Custom Types for Metrics\n\tUUID(15, String.class),\n\tDataSet(16, DataSet.class),\n\tBytes(17, byte[].class),\n\tFile(18, File.class),\n\tTemplate(19, Template.class),\n\n\t// PropertyValue Types (20 and 21) are NOT metric datatypes\n\n\t// Array Types\n\tInt8Array(22, Byte[].class),\n\tInt16Array(23, Short[].class),\n\tInt32Array(24, Integer[].class),\n\tInt64Array(25, Long[].class),\n\tUInt8Array(26, Short[].class),\n\tUInt16Array(27, Integer[].class),\n\tUInt32Array(28, Long[].class),\n\tUInt64Array(29, BigInteger[].class),\n\tFloatArray(30, Float[].class),\n\tDoubleArray(31, Double[].class),\n\tBooleanArray(32, Boolean[].class),\n\tStringArray(33, String[].class),\n\tDateTimeArray(34, 
Date[].class),\n\n\t// Unknown\n\tUnknown(0, Object.class);\n\n\tprivate static final Logger logger = LoggerFactory.getLogger(MetricDataType.class.getName());\n\n\tprivate Class<?> clazz = null;\n\tprivate int intValue = 0;\n\n\t/**\n\t * Constructor\n\t *\n\t * @param intValue the integer value of this {@link MetricDataType}\n\t *\n\t * @param clazz the {@link Class} type associated with this {@link MetricDataType}\n\t */\n\tprivate MetricDataType(int intValue, Class<?> clazz) {\n\t\tthis.intValue = intValue;\n\t\tthis.clazz = clazz;\n\t}\n\n\t/**\n\t * Checks the type of a specified value against the specified {@link MetricDataType}\n\t *\n\t * @param value the {@link Object} value to check against the {@link MetricDataType}\n\t *\n\t * @throws SparkplugInvalidTypeException if the value is not a valid type for the given {@link MetricDataType}\n\t */\n\tpublic void checkType(Object value) throws SparkplugInvalidTypeException {\n\t\tif (value != null && !clazz.isAssignableFrom(value.getClass())) {\n\t\t\tlogger.warn(\n\t\t\t\t\t\"Failed type check - \" + clazz + \" != \" + ((value != null) ? 
value.getClass().toString() : \"null\"));\n\t\t\tthrow new SparkplugInvalidTypeException(value.getClass());\n\t\t}\n\t}\n\n\t/**\n\t * Returns an integer representation of the data type.\n\t * \n\t * @return an integer representation of the data type.\n\t */\n\tpublic int toIntValue() {\n\t\treturn this.intValue;\n\t}\n\n\t/**\n\t * Converts the integer representation of the data type into a {@link MetricDataType} instance.\n\t * \n\t * @param i the integer representation of the data type.\n\t * @return a {@link MetricDataType} instance.\n\t */\n\tpublic static MetricDataType fromInteger(int i) {\n\t\tswitch (i) {\n\t\t\tcase 1:\n\t\t\t\treturn Int8;\n\t\t\tcase 2:\n\t\t\t\treturn Int16;\n\t\t\tcase 3:\n\t\t\t\treturn Int32;\n\t\t\tcase 4:\n\t\t\t\treturn Int64;\n\t\t\tcase 5:\n\t\t\t\treturn UInt8;\n\t\t\tcase 6:\n\t\t\t\treturn UInt16;\n\t\t\tcase 7:\n\t\t\t\treturn UInt32;\n\t\t\tcase 8:\n\t\t\t\treturn UInt64;\n\t\t\tcase 9:\n\t\t\t\treturn Float;\n\t\t\tcase 10:\n\t\t\t\treturn Double;\n\t\t\tcase 11:\n\t\t\t\treturn Boolean;\n\t\t\tcase 12:\n\t\t\t\treturn String;\n\t\t\tcase 13:\n\t\t\t\treturn DateTime;\n\t\t\tcase 14:\n\t\t\t\treturn Text;\n\t\t\tcase 15:\n\t\t\t\treturn UUID;\n\t\t\tcase 16:\n\t\t\t\treturn DataSet;\n\t\t\tcase 17:\n\t\t\t\treturn Bytes;\n\t\t\tcase 18:\n\t\t\t\treturn File;\n\t\t\tcase 19:\n\t\t\t\treturn Template;\n\t\t\tcase 22:\n\t\t\t\treturn Int8Array;\n\t\t\tcase 23:\n\t\t\t\treturn Int16Array;\n\t\t\tcase 24:\n\t\t\t\treturn Int32Array;\n\t\t\tcase 25:\n\t\t\t\treturn Int64Array;\n\t\t\tcase 26:\n\t\t\t\treturn UInt8Array;\n\t\t\tcase 27:\n\t\t\t\treturn UInt16Array;\n\t\t\tcase 28:\n\t\t\t\treturn UInt32Array;\n\t\t\tcase 29:\n\t\t\t\treturn UInt64Array;\n\t\t\tcase 30:\n\t\t\t\treturn FloatArray;\n\t\t\tcase 31:\n\t\t\t\treturn DoubleArray;\n\t\t\tcase 32:\n\t\t\t\treturn BooleanArray;\n\t\t\tcase 33:\n\t\t\t\treturn StringArray;\n\t\t\tcase 34:\n\t\t\t\treturn DateTimeArray;\n\t\t\tdefault:\n\t\t\t\treturn 
Unknown;\n\t\t}\n\t}\n\n\t/**\n\t * Returns the class type for this DataType\n\t * \n\t * @return the class type for this DataType\n\t */\n\tpublic Class<?> getClazz() {\n\t\treturn clazz;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/Parameter.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.util.Objects;\n\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\n\nimport com.fasterxml.jackson.annotation.JsonGetter;\nimport com.fasterxml.jackson.annotation.JsonInclude;\nimport com.fasterxml.jackson.annotation.JsonProperty;\nimport com.fasterxml.jackson.annotation.JsonSetter;\nimport com.fasterxml.jackson.annotation.JsonInclude.Include;\n\n/**\n * A class to represent a parameter associated with a template.\n */\n@JsonInclude(Include.NON_NULL)\npublic class Parameter {\n\n\t/**\n\t * The name of the parameter\n\t */\n\t@JsonProperty(\"name\")\n\tprivate String name;\n\n\t/**\n\t * The data type of the parameter\n\t */\n\t@JsonProperty(\"type\")\n\tprivate ParameterDataType type;\n\n\t/**\n\t * The value of the parameter\n\t */\n\t@JsonProperty(\"value\")\n\t@JsonInclude(Include.NON_EMPTY)\n\tprivate Object value;\n\n\tpublic Parameter() {\n\t}\n\n\t/**\n\t * Constructs a Parameter instance.\n\t * \n\t * @param name The name of the parameter.\n\t * @param type The type of the parameter.\n\t * @param value The value of the parameter.\n\t * @throws SparkplugInvalidTypeException\n\t */\n\tpublic Parameter(String name, ParameterDataType type, Object value) throws SparkplugInvalidTypeException {\n\t\tthis.name = name;\n\t\tthis.type = type;\n\t\tthis.value = value;\n\t\tif (value != null) {\n\t\t\tthis.type.checkType(value);\n\t\t}\n\t}\n\n\t/**\n\t * Gets the name 
of this {@link Parameter}\n\t *\n\t * @return the name of this {@link Parameter}\n\t */\n\t@JsonGetter(\"name\")\n\tpublic String getName() {\n\t\treturn name;\n\t}\n\n\t/**\n\t * Sets the name of this {@link Parameter}\n\t *\n\t * @param name the name of this {@link Parameter}\n\t */\n\t@JsonSetter(\"name\")\n\tpublic void setName(String name) {\n\t\tthis.name = name;\n\t}\n\n\t/**\n\t * Gets the {@link ParameterDataType} of this {@link Parameter}\n\t *\n\t * @return the {@link ParameterDataType} of this {@link Parameter}\n\t */\n\tpublic ParameterDataType getType() {\n\t\treturn type;\n\t}\n\n\t/**\n\t * Sets the {@link ParameterDataType} of this {@link Parameter}\n\t *\n\t * @param type the {@link ParameterDataType} of this {@link Parameter}\n\t */\n\tpublic void setType(ParameterDataType type) {\n\t\tthis.type = type;\n\t}\n\n\t/**\n\t * Gets the {@link Object} value of this {@link Parameter}\n\t *\n\t * @return the {@link Object} value of this {@link Parameter}\n\t */\n\tpublic Object getValue() {\n\t\treturn value;\n\t}\n\n\t/**\n\t * Sets the {@link Object} value of this {@link Parameter}\n\t *\n\t * @param value the {@link Object} value of this {@link Parameter}\n\t */\n\tpublic void setValue(Object value) {\n\t\tthis.value = value;\n\t}\n\n\t@Override\n\tpublic boolean equals(Object object) {\n\t\tif (this == object) {\n\t\t\treturn true;\n\t\t}\n\t\tif (object == null || this.getClass() != object.getClass()) {\n\t\t\treturn false;\n\t\t}\n\t\tParameter param = (Parameter) object;\n\t\treturn Objects.equals(name, param.getName()) && Objects.equals(type, param.getType())\n\t\t\t\t&& Objects.equals(value, param.getValue());\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"Parameter [name=\");\n\t\tbuilder.append(name);\n\t\tbuilder.append(\", type=\");\n\t\tbuilder.append(type);\n\t\tbuilder.append(\", value=\");\n\t\tbuilder.append(value);\n\t\tbuilder.append(\"]\");\n\t\treturn 
builder.toString();\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/ParameterDataType.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.math.BigInteger;\nimport java.util.Date;\n\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\n/**\n * An enumeration of data types for the value of a {@link Parameter} for a {@link Template}\n */\npublic enum ParameterDataType {\n\n\t// Basic Types\n\tInt8(1, Byte.class),\n\tInt16(2, Short.class),\n\tInt32(3, Integer.class),\n\tInt64(4, Long.class),\n\tUInt8(5, Short.class),\n\tUInt16(6, Integer.class),\n\tUInt32(7, Long.class),\n\tUInt64(8, BigInteger.class),\n\tFloat(9, Float.class),\n\tDouble(10, Double.class),\n\tBoolean(11, Boolean.class),\n\tString(12, String.class),\n\tDateTime(13, Date.class),\n\tText(14, String.class),\n\n\t// Unknown\n\tUnknown(0, Object.class);\n\n\tprivate static final Logger logger = LoggerFactory.getLogger(ParameterDataType.class.getName());\n\n\tprivate Class<?> clazz = null;\n\tprivate int intValue = 0;\n\n\t/**\n\t * Constructor\n\t *\n\t * @param intValue the integer representation of this {@link ParameterDataType}\n\t *\n\t * @param clazz the {@link Class} type of this {@link ParameterDataType}\n\t */\n\tprivate ParameterDataType(int intValue, Class<?> clazz) {\n\t\tthis.intValue = intValue;\n\t\tthis.clazz = clazz;\n\t}\n\n\t/**\n\t * Checks the type of this {@link ParameterDataType} against an {@link Object} value\n\t *\n\t * @param value the {@link Object} value to 
validate against the {@link ParameterDataType}\n\t *\n\t * @throws SparkplugInvalidTypeException if the validation of the {@link Object} value against the\n\t *             {@link ParameterDataType} fails\n\t */\n\tpublic void checkType(Object value) throws SparkplugInvalidTypeException {\n\t\tif (value != null && !clazz.isAssignableFrom(value.getClass())) {\n\t\t\tlogger.warn(\"Failed type check - \" + clazz + \" != \" + value.getClass().toString());\n\t\t\tthrow new SparkplugInvalidTypeException(value.getClass());\n\t\t}\n\t}\n\n\t/**\n\t * Returns an integer representation of the data type.\n\t * \n\t * @return an integer representation of the data type.\n\t */\n\tpublic int toIntValue() {\n\t\treturn this.intValue;\n\t}\n\n\t/**\n\t * Converts the integer representation of the data type into a {@link ParameterDataType} instance.\n\t * \n\t * @param i the integer representation of the data type.\n\t * @return a {@link ParameterDataType} instance.\n\t */\n\tpublic static ParameterDataType fromInteger(int i) {\n\t\tswitch (i) {\n\t\t\tcase 1:\n\t\t\t\treturn Int8;\n\t\t\tcase 2:\n\t\t\t\treturn Int16;\n\t\t\tcase 3:\n\t\t\t\treturn Int32;\n\t\t\tcase 4:\n\t\t\t\treturn Int64;\n\t\t\tcase 5:\n\t\t\t\treturn UInt8;\n\t\t\tcase 6:\n\t\t\t\treturn UInt16;\n\t\t\tcase 7:\n\t\t\t\treturn UInt32;\n\t\t\tcase 8:\n\t\t\t\treturn UInt64;\n\t\t\tcase 9:\n\t\t\t\treturn Float;\n\t\t\tcase 10:\n\t\t\t\treturn Double;\n\t\t\tcase 11:\n\t\t\t\treturn Boolean;\n\t\t\tcase 12:\n\t\t\t\treturn String;\n\t\t\tcase 13:\n\t\t\t\treturn DateTime;\n\t\t\tcase 14:\n\t\t\t\treturn Text;\n\t\t\tdefault:\n\t\t\t\treturn Unknown;\n\t\t}\n\t}\n\n\t/**\n\t * Returns the class type for this DataType\n\t * \n\t * @return the class type for this DataType\n\t */\n\tpublic Class<?> getClazz() {\n\t\treturn clazz;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/Property.java",
    "content": "/*\n * Licensed Materials - Property of Cirrus Link Solutions\n * Copyright (c) 2022 Cirrus Link Solutions LLC - All Rights Reserved\n * Unauthorized copying of this file, via any medium is strictly prohibited\n * Proprietary and confidential\n */\npackage org.eclipse.tahu.message.model;\n\npublic class Property<T> {\n\n\tprivate final String name;\n\n\tprivate final T defaultValue;\n\n\tprivate T value;\n\n\tpublic Property(String name, T defaultValue) {\n\t\tthis.name = name;\n\t\tthis.defaultValue = defaultValue;\n\t}\n\n\tpublic Property(String name, T defaultValue, T value) {\n\t\tthis(name, defaultValue);\n\t\tthis.value = value;\n\t}\n\n\tpublic T getValue() {\n\t\treturn value;\n\t}\n\n\tpublic void setValue(T value) {\n\t\tthis.value = value;\n\t}\n\n\tpublic String getName() {\n\t\treturn name;\n\t}\n\n\tpublic T getDefaultValue() {\n\t\treturn defaultValue;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"Property [name=\");\n\t\tbuilder.append(name);\n\t\tbuilder.append(\", defaultValue=\");\n\t\tbuilder.append(defaultValue);\n\t\tbuilder.append(\", value=\");\n\t\tbuilder.append(value);\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/PropertyDataType.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.math.BigInteger;\nimport java.util.Date;\nimport java.util.List;\n\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\n/**\n * An enumeration of data types for values of a {@link PropertySet}\n */\npublic enum PropertyDataType {\n\n\t// Basic Types\n\tInt8(1, Byte.class),\n\tInt16(2, Short.class),\n\tInt32(3, Integer.class),\n\tInt64(4, Long.class),\n\tUInt8(5, Short.class),\n\tUInt16(6, Integer.class),\n\tUInt32(7, Long.class),\n\tUInt64(8, BigInteger.class),\n\tFloat(9, Float.class),\n\tDouble(10, Double.class),\n\tBoolean(11, Boolean.class),\n\tString(12, String.class),\n\tDateTime(13, Date.class),\n\tText(14, String.class),\n\n\t// Custom Types for PropertySets\n\tPropertySet(20, PropertySet.class),\n\tPropertySetList(21, List.class),\n\n\t// Unknown\n\tUnknown(0, Object.class);\n\n\tprivate static final Logger logger = LoggerFactory.getLogger(PropertyDataType.class.getName());\n\n\tprivate Class<?> clazz = null;\n\tprivate int intValue = 0;\n\n\t/**\n\t * Constructor\n\t *\n\t * @param intValue the integer representation of this {@link PropertyDataType}\n\t *\n\t * @param clazz the {@link Class} type of this {@link PropertyDataType}\n\t */\n\tprivate PropertyDataType(int intValue, Class<?> clazz) {\n\t\tthis.intValue = intValue;\n\t\tthis.clazz = clazz;\n\t}\n\n\t/**\n\t * Checks the type of this 
{@link PropertyDataType} against an {@link Object} value\n\t *\n\t * @param value the {@link Object} value to validate against the {@link PropertyDataType}\n\t *\n\t * @throws SparkplugInvalidTypeException if the validation of the {@link Object} value against the\n\t *             {@link PropertyDataType} fails\n\t */\n\tpublic void checkType(Object value) throws SparkplugInvalidTypeException {\n\t\tif (value != null && !clazz.isAssignableFrom(value.getClass())) {\n\t\t\t// Allow List subclasses for PropertySetList values\n\t\t\tif (!(clazz == List.class && value instanceof List)) {\n\t\t\t\tlogger.warn(\"Failed type check - \" + clazz + \" != \" + value.getClass().toString());\n\t\t\t\tthrow new SparkplugInvalidTypeException(value.getClass());\n\t\t\t}\n\t\t}\n\t}\n\n\t/**\n\t * Returns an integer representation of the data type.\n\t * \n\t * @return an integer representation of the data type.\n\t */\n\tpublic int toIntValue() {\n\t\treturn this.intValue;\n\t}\n\n\t/**\n\t * Converts the integer representation of the data type into a {@link PropertyDataType} instance.\n\t * \n\t * @param i the integer representation of the data type.\n\t * @return a {@link PropertyDataType} instance.\n\t */\n\tpublic static PropertyDataType fromInteger(int i) {\n\t\tswitch (i) {\n\t\t\tcase 1:\n\t\t\t\treturn Int8;\n\t\t\tcase 2:\n\t\t\t\treturn Int16;\n\t\t\tcase 3:\n\t\t\t\treturn Int32;\n\t\t\tcase 4:\n\t\t\t\treturn Int64;\n\t\t\tcase 5:\n\t\t\t\treturn UInt8;\n\t\t\tcase 6:\n\t\t\t\treturn UInt16;\n\t\t\tcase 7:\n\t\t\t\treturn UInt32;\n\t\t\tcase 8:\n\t\t\t\treturn UInt64;\n\t\t\tcase 9:\n\t\t\t\treturn Float;\n\t\t\tcase 10:\n\t\t\t\treturn Double;\n\t\t\tcase 11:\n\t\t\t\treturn Boolean;\n\t\t\tcase 12:\n\t\t\t\treturn String;\n\t\t\tcase 13:\n\t\t\t\treturn DateTime;\n\t\t\tcase 14:\n\t\t\t\treturn Text;\n\t\t\tcase 20:\n\t\t\t\treturn PropertySet;\n\t\t\tcase 21:\n\t\t\t\treturn PropertySetList;\n\t\t\tdefault:\n\t\t\t\treturn Unknown;\n\t\t}\n\t}\n\n\t/**\n\t * Returns the class 
type for this DataType\n\t * \n\t * @return the class type for this DataType\n\t */\n\tpublic Class<?> getClazz() {\n\t\treturn clazz;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/PropertySet.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.util.Collection;\nimport java.util.HashMap;\nimport java.util.Map;\nimport java.util.Set;\n\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\n\nimport com.fasterxml.jackson.annotation.JsonIgnore;\n\n/**\n * A class that maintains a set of properties associated with a {@link Metric}.\n */\npublic class PropertySet implements Map<String, PropertyValue> {\n\n\t@JsonIgnore\n\tprivate Map<String, PropertyValue> map;\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic PropertySet() {\n\t\tthis.map = new HashMap<>();\n\t}\n\n\t/**\n\t * Copy Constructor\n\t *\n\t * @param propertySet the {@link PropertySet} to copy\n\t */\n\tpublic PropertySet(PropertySet propertySet) {\n\t\t// Copy the backing map so the new instance does not share state with the original\n\t\tthis.map = new HashMap<>(propertySet.getPropertyMap());\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param propertyMap the {@link Map} of {@link String}s to {@link PropertyValue}s\n\t */\n\tprivate PropertySet(Map<String, PropertyValue> propertyMap) {\n\t\tthis.map = propertyMap;\n\t}\n\n\t/**\n\t * Gets the {@link PropertyValue} associated with a given Property name\n\t *\n\t * @param name the name of the Property\n\t *\n\t * @return the {@link PropertyValue} associated with the name\n\t */\n\t@JsonIgnore\n\tpublic PropertyValue getPropertyValue(String name) {\n\t\treturn this.map.get(name);\n\t}\n\n\t/**\n\t * Sets the {@link PropertyValue} for a given property name\n\t *\n\t * @param name the name of the 
property\n\t * @param value the {@link PropertyValue} associated with the property name\n\t */\n\t@JsonIgnore\n\tpublic void setProperty(String name, PropertyValue value) {\n\t\tthis.map.put(name, value);\n\t}\n\n\t/**\n\t * Removes a property based on the property name\n\t *\n\t * @param name the name of the property to remove\n\t */\n\t@JsonIgnore\n\tpublic void removeProperty(String name) {\n\t\tthis.map.remove(name);\n\t}\n\n\t/**\n\t * Clears all properties and values from the {@link PropertySet}\n\t */\n\t@JsonIgnore\n\tpublic void clear() {\n\t\tthis.map.clear();\n\t}\n\n\t/**\n\t * Gets the names of the {@link PropertySet}\n\t *\n\t * @return the names of the {@link PropertySet}\n\t */\n\t@JsonIgnore\n\tpublic Set<String> getNames() {\n\t\treturn map.keySet();\n\t}\n\n\t/**\n\t * Gets the values of the {@link PropertySet}\n\t *\n\t * @return the values of the {@link PropertySet}\n\t */\n\t@JsonIgnore\n\tpublic Collection<PropertyValue> getValues() {\n\t\treturn map.values();\n\t}\n\n\t/**\n\t * Gets the {@link Map} of {@link String} property names to {@link PropertyValue}s\n\t *\n\t * @return the {@link Map} of {@link String} property names to {@link PropertyValue}s\n\t */\n\t@JsonIgnore\n\tpublic Map<String, PropertyValue> getPropertyMap() {\n\t\treturn map;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\treturn \"PropertySet [propertyMap=\" + map + \"]\";\n\t}\n\n\t@Override\n\tpublic int size() {\n\t\treturn map.size();\n\t}\n\n\t@Override\n\tpublic boolean isEmpty() {\n\t\treturn map.isEmpty();\n\t}\n\n\t@Override\n\tpublic boolean containsKey(Object key) {\n\t\treturn map.containsKey(key);\n\t}\n\n\t@Override\n\tpublic boolean containsValue(Object value) {\n\t\treturn map.containsValue(value);\n\t}\n\n\t@Override\n\tpublic PropertyValue get(Object key) {\n\t\treturn map.get(key);\n\t}\n\n\t@Override\n\tpublic PropertyValue put(String key, PropertyValue value) {\n\t\treturn map.put(key, value);\n\t}\n\n\t@Override\n\tpublic PropertyValue 
remove(Object key) {\n\t\treturn map.remove(key);\n\t}\n\n\t@Override\n\tpublic void putAll(Map<? extends String, ? extends PropertyValue> m) {\n\t\tmap.putAll(m);\n\t}\n\n\t@Override\n\tpublic Set<String> keySet() {\n\t\treturn map.keySet();\n\t}\n\n\t@Override\n\tpublic Collection<PropertyValue> values() {\n\t\treturn map.values();\n\t}\n\n\t@Override\n\tpublic Set<java.util.Map.Entry<String, PropertyValue>> entrySet() {\n\t\treturn map.entrySet();\n\t}\n\n\t/**\n\t * A builder for a PropertySet instance\n\t */\n\tpublic static class PropertySetBuilder {\n\n\t\tprivate Map<String, PropertyValue> propertyMap;\n\n\t\tpublic PropertySetBuilder() {\n\t\t\tthis.propertyMap = new HashMap<>();\n\t\t}\n\n\t\tpublic PropertySetBuilder(Map<String, PropertyValue> propertyMap) {\n\t\t\tthis.propertyMap = propertyMap;\n\t\t}\n\n\t\tpublic PropertySetBuilder(PropertySet propertySet) throws SparkplugInvalidTypeException {\n\t\t\tthis.propertyMap = new HashMap<>();\n\t\t\tfor (String name : propertySet.getNames()) {\n\t\t\t\tPropertyValue value = propertySet.getPropertyValue(name);\n\t\t\t\tpropertyMap.put(name, new PropertyValue(value.getType(), value.getValue()));\n\t\t\t}\n\t\t}\n\n\t\tpublic PropertySetBuilder addProperty(String name, PropertyValue value) {\n\t\t\tthis.propertyMap.put(name, value);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic PropertySetBuilder addProperties(Map<String, PropertyValue> properties) {\n\t\t\tthis.propertyMap.putAll(properties);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic PropertySet createPropertySet() {\n\t\t\treturn new PropertySet(propertyMap);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/PropertyValue.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.util.Objects;\n\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\n\nimport com.fasterxml.jackson.annotation.JsonIgnore;\n\n/**\n * The value of a property in a {@link PropertySet}.\n */\npublic class PropertyValue {\n\n\tprivate PropertyDataType type;\n\tprivate Object value;\n\tprivate Boolean isNull = null;\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic PropertyValue() {\n\t}\n\n\t/**\n\t * Constructor.\n\t * \n\t * @param type the property type\n\t * @param value the property value\n\t * @throws SparkplugInvalidTypeException\n\t */\n\tpublic PropertyValue(PropertyDataType type, Object value) throws SparkplugInvalidTypeException {\n\t\tthis.type = type;\n\t\tthis.value = value;\n\t\tisNull = (value == null) ? 
true : false;\n\t\ttype.checkType(value);\n\t}\n\n\t/**\n\t * Gets the {@link PropertyDataType} of the {@link PropertyValue}\n\t *\n\t * @return the {@link PropertyDataType} of the {@link PropertyValue}\n\t */\n\tpublic PropertyDataType getType() {\n\t\treturn type;\n\t}\n\n\t/**\n\t * Sets the {@link PropertyDataType} of this {@link PropertyValue}\n\t *\n\t * @param type the {@link PropertyDataType} of this {@link PropertyValue}\n\t */\n\tpublic void setType(PropertyDataType type) {\n\t\tthis.type = type;\n\t}\n\n\t/**\n\t * Gets the {@link Object} value of the {@link PropertyValue}\n\t *\n\t * @return the {@link Object} value of the {@link PropertyValue}\n\t */\n\tpublic Object getValue() {\n\t\treturn value;\n\t}\n\n\t/**\n\t * Sets the {@link Object} value of this {@link PropertyValue}\n\t *\n\t * @param value the {@link Object} value of this {@link PropertyValue}\n\t */\n\tpublic void setValue(Object value) {\n\t\tthis.value = value;\n\t\tisNull = (value == null) ? true : false;\n\t}\n\n\t/**\n\t * Whether or not this {@link PropertyValue} is null\n\t *\n\t * @return true if this {@link PropertyValue} is null, otherwise false\n\t */\n\t@JsonIgnore\n\tpublic Boolean isNull() {\n\t\treturn isNull;\n\t}\n\n\t@Override\n\tpublic boolean equals(Object object) {\n\t\tif (this == object) {\n\t\t\treturn true;\n\t\t}\n\t\tif (object == null || this.getClass() != object.getClass()) {\n\t\t\treturn false;\n\t\t}\n\t\tPropertyValue propValue = (PropertyValue) object;\n\t\treturn Objects.equals(type, propValue.getType()) && Objects.equals(value, propValue.getValue());\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"PropertyValue [type=\");\n\t\tbuilder.append(type);\n\t\tbuilder.append(\", value=\");\n\t\tbuilder.append(value);\n\t\tbuilder.append(\", isNull=\");\n\t\tbuilder.append(isNull);\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/Quality.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\n/**\n * The Sparkplug quality value associated with a {@link Metric}\n */\npublic class Quality {\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic Quality() {\n\t\t// TODO Auto-generated constructor stub\n\t}\n\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/Row.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.util.ArrayList;\nimport java.util.Collection;\nimport java.util.List;\n\n/**\n * A class for representing a row of a data set.\n */\npublic class Row {\n\n\tprivate List<Value<?>> values;\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic Row() {\n\t\tthis.values = new ArrayList<>();\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param values\n\t */\n\tpublic Row(List<Value<?>> values) {\n\t\tthis.values = values;\n\t}\n\n\t/**\n\t * Gets a {@link List} of {@link Value}s in the {@link Row}\n\t *\n\t * @return a {@link List} of {@link Value}s in the {@link Row}\n\t */\n\tpublic List<Value<?>> getValues() {\n\t\treturn values;\n\t}\n\n\t/**\n\t * Sets a {@link List} of {@link Value}s for the {@link Row}\n\t *\n\t * @param values a {@link List} of {@link Value}s to set for the {@link Row}\n\t */\n\tpublic void setValues(List<Value<?>> values) {\n\t\tthis.values = values;\n\t}\n\n\t/**\n\t * Adds a {@link Value} to the end of the {@link Row}\n\t *\n\t * @param value a {@link Value} to the end of the {@link Row}\n\t */\n\tpublic void addValue(Value<?> value) {\n\t\tthis.values.add(value);\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\treturn \"Row [values=\" + values + \"]\";\n\t}\n\n\t@Override\n\tpublic int hashCode() {\n\t\tfinal int prime = 31;\n\t\tint result = 1;\n\t\tresult = prime * result + ((values == null) ? 
0 : values.hashCode());\n\t\treturn result;\n\t}\n\n\t@Override\n\tpublic boolean equals(Object obj) {\n\t\tif (this == obj)\n\t\t\treturn true;\n\t\tif (obj == null)\n\t\t\treturn false;\n\t\tif (getClass() != obj.getClass())\n\t\t\treturn false;\n\t\tRow other = (Row) obj;\n\t\tif (values == null) {\n\t\t\tif (other.values != null)\n\t\t\t\treturn false;\n\t\t} else if (!values.equals(other.values))\n\t\t\treturn false;\n\t\treturn true;\n\t}\n\n\t/**\n\t * Converts a {@link Row} instance to a {@link List} of Objects representing the values.\n\t * \n\t * @param row a {@link Row} instance.\n\t * @return a {@link List} of Objects.\n\t */\n\tpublic static List<Object> toValues(Row row) {\n\t\tList<Object> list = new ArrayList<Object>(row.getValues().size());\n\t\tfor (Value<?> value : row.getValues()) {\n\t\t\tlist.add(value.getValue());\n\t\t}\n\t\treturn list;\n\t}\n\n\t/**\n\t * A builder for creating a {@link Row} instance.\n\t */\n\tpublic static class RowBuilder {\n\n\t\tprivate List<Value<?>> values;\n\n\t\tpublic RowBuilder() {\n\t\t\tthis.values = new ArrayList<Value<?>>();\n\t\t}\n\n\t\tpublic RowBuilder(Row row) {\n\t\t\tthis.values = new ArrayList<Value<?>>(row.getValues());\n\t\t}\n\n\t\tpublic RowBuilder addValue(Value<?> value) {\n\t\t\tthis.values.add(value);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic RowBuilder addValues(Collection<Value<?>> values) {\n\t\t\tthis.values.addAll(values);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic Row createRow() {\n\t\t\treturn new Row(values);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/SparkplugBPayload.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.util.ArrayList;\nimport java.util.Arrays;\nimport java.util.Collection;\nimport java.util.Date;\nimport java.util.List;\n\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\nimport com.fasterxml.jackson.annotation.JsonIgnore;\nimport com.fasterxml.jackson.annotation.JsonInclude;\nimport com.fasterxml.jackson.annotation.JsonInclude.Include;\n\n/**\n * A class representing a Sparkplug B payload\n */\n@JsonInclude(Include.NON_NULL)\npublic class SparkplugBPayload {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(SparkplugBPayload.class.getName());\n\n\tprivate Date timestamp;\n\tprivate List<Metric> metrics;\n\tprivate Long seq = null;\n\tprivate String uuid;\n\tprivate byte[] body;\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic SparkplugBPayload() {\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param timestamp the overall {@link Date} timestamp of the {@link SparkplugBPayload}\n\t * @param metrics a {@link List} of {@link Metric}s in the {@link SparkplugBPayload}\n\t * @param seq the Sparkplug sequence number for the {@link SparkplugBPayload}\n\t * @param uuid a UUID for the {@link SparkplugBPayload}\n\t * @param body an array of bytes for the {@link SparkplugBPayload}\n\t */\n\tpublic SparkplugBPayload(Date timestamp, List<Metric> metrics, Long seq, String uuid, byte[] body) {\n\t\tthis(timestamp, metrics, seq);\n\t\tthis.uuid = 
uuid;\n\t\tthis.body = body;\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param timestamp the overall {@link Date} timestamp of the {@link SparkplugBPayload}\n\t * @param metrics a {@link List} of {@link Metric}s in the {@link SparkplugBPayload}\n\t * @param seq the Sparkplug sequence number for the {@link SparkplugBPayload}\n\t */\n\tpublic SparkplugBPayload(Date timestamp, List<Metric> metrics, Long seq) {\n\t\tthis(timestamp, metrics);\n\t\tthis.seq = seq;\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param timestamp the overall {@link Date} timestamp of the {@link SparkplugBPayload}\n\t * @param metrics a {@link List} of {@link Metric}s in the {@link SparkplugBPayload}\n\t */\n\tpublic SparkplugBPayload(Date timestamp, List<Metric> metrics) {\n\t\tthis.timestamp = timestamp;\n\t\tthis.metrics = metrics;\n\t}\n\n\t/**\n\t * Copy Constructor\n\t *\n\t * @param payload the {@link SparkplugBPayload} to copy\n\t */\n\tpublic SparkplugBPayload(SparkplugBPayload payload) {\n\t\tthis.timestamp = payload.getTimestamp();\n\t\tif (payload.getMetrics() != null) {\n\t\t\tmetrics = new ArrayList<>();\n\t\t\tfor (Metric metric : payload.getMetrics()) {\n\t\t\t\ttry {\n\t\t\t\t\tmetrics.add(new Metric(metric));\n\t\t\t\t} catch (Exception e) {\n\t\t\t\t\tlogger.error(\"Failed to copy metric: {}\", metric, e);\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t\tthis.seq = payload.getSeq();\n\t\tthis.uuid = payload.getUuid();\n\t\tthis.body = payload.getBody();\n\t}\n\n\t/**\n\t * Gets the timestamp of the {@link SparkplugBPayload} as a {@link Date}\n\t *\n\t * @return a {@link Date} representing the timestamp of the {@link SparkplugBPayload}\n\t */\n\tpublic Date getTimestamp() {\n\t\treturn timestamp;\n\t}\n\n\t/**\n\t * Sets the timestamp of the {@link SparkplugBPayload}\n\t *\n\t * @param timestamp the {@link Date} timestamp to set for the {@link SparkplugBPayload}\n\t */\n\tpublic void setTimestamp(Date timestamp) {\n\t\tthis.timestamp = timestamp;\n\t}\n\n\t/**\n\t * Adds a {@link Metric} to 
the end of the {@link List} of Sparkplug metrics\n\t *\n\t * @param metric a {@link Metric} to add to the end of the {@link List} of Sparkplug metrics\n\t */\n\tpublic void addMetric(Metric metric) {\n\t\tmetrics.add(metric);\n\t}\n\n\t/**\n\t * Adds a {@link Metric} at the specified index to the {@link List} of Sparkplug metrics\n\t *\n\t * @param index the index to use in the {@link List} of {@link Metric}s when adding the {@link Metric}\n\t * @param metric a {@link Metric} to add at the specified index to the {@link List} of Sparkplug metrics\n\t */\n\tpublic void addMetric(int index, Metric metric) {\n\t\tmetrics.add(index, metric);\n\t}\n\n\t/**\n\t * Adds a {@link List} of {@link Metric}s to the end of the {@link List} of Sparkplug metrics\n\t *\n\t * @param metrics the {@link List} of {@link Metric}s to add to the {@link SparkplugBPayload}\n\t */\n\tpublic void addMetrics(List<Metric> metrics) {\n\t\tthis.metrics.addAll(metrics);\n\t}\n\n\t/**\n\t * Removes a {@link Metric} from the {@link List} of {@link Metric}s in the {@link SparkplugBPayload}\n\t *\n\t * @param index the index to use when removing the {@link Metric}\n\t * @return the {@link Metric} that was removed\n\t */\n\tpublic Metric removeMetric(int index) {\n\t\treturn metrics.remove(index);\n\t}\n\n\t/**\n\t * Removes a {@link Metric} by equality to a {@link Metric} in the {@link List} of metrics\n\t *\n\t * @param metric the {@link Metric} to remove\n\t * @return true if the {@link Metric} was removed, otherwise false\n\t */\n\tpublic boolean removeMetric(Metric metric) {\n\t\treturn metrics.remove(metric);\n\t}\n\n\t/**\n\t * Gets the {@link List} of {@link Metric}s associated with the {@link SparkplugBPayload}\n\t *\n\t * @return the {@link List} of {@link Metric}s associated with the {@link SparkplugBPayload}\n\t */\n\tpublic List<Metric> getMetrics() {\n\t\treturn metrics;\n\t}\n\n\t/**\n\t * Gets the number of {@link Metric}s in this {@link SparkplugBPayload}\n\t *\n\t * @return the number of {@link 
Metric}s in this {@link SparkplugBPayload}\n\t */\n\t@JsonIgnore\n\tpublic Integer getMetricCount() {\n\t\treturn metrics.size();\n\t}\n\n\t/**\n\t * Sets the {@link List} of {@link Metric}s for this {@link SparkplugBPayload}\n\t *\n\t * @param metrics the {@link List} of {@link Metric}s to set for this {@link SparkplugBPayload}\n\t */\n\tpublic void setMetrics(List<Metric> metrics) {\n\t\tthis.metrics = metrics;\n\t}\n\n\t/**\n\t * Gets the sequence number for this {@link SparkplugBPayload}\n\t *\n\t * @return the sequence number for this {@link SparkplugBPayload}\n\t */\n\tpublic Long getSeq() {\n\t\treturn seq;\n\t}\n\n\t/**\n\t * Sets the sequence number for this {@link SparkplugBPayload}\n\t *\n\t * @param seq the sequence number to set for this {@link SparkplugBPayload}\n\t */\n\tpublic void setSeq(Long seq) {\n\t\tthis.seq = seq;\n\t}\n\n\t/**\n\t * Gets the UUID for this {@link SparkplugBPayload}\n\t *\n\t * @return the UUID for this {@link SparkplugBPayload}\n\t */\n\tpublic String getUuid() {\n\t\treturn uuid;\n\t}\n\n\t/**\n\t * Sets the UUID for this {@link SparkplugBPayload}\n\t *\n\t * @param uuid the UUID to set for this {@link SparkplugBPayload}\n\t */\n\tpublic void setUuid(String uuid) {\n\t\tthis.uuid = uuid;\n\t}\n\n\t/**\n\t * Gets the body for this {@link SparkplugBPayload}\n\t *\n\t * @return the body for this {@link SparkplugBPayload}\n\t */\n\tpublic byte[] getBody() {\n\t\treturn body;\n\t}\n\n\t/**\n\t * Sets the body for this {@link SparkplugBPayload}\n\t *\n\t * @param body the body to set for this {@link SparkplugBPayload}\n\t */\n\tpublic void setBody(byte[] body) {\n\t\tthis.body = body;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"SparkplugBPayload [timestamp=\");\n\t\tbuilder.append(timestamp != null ? 
timestamp.getTime() : \"null\");\n\t\tbuilder.append(\", metrics=\");\n\t\tbuilder.append(metrics);\n\t\tbuilder.append(\", seq=\");\n\t\tbuilder.append(seq != null ? seq : \"null\");\n\t\tbuilder.append(\", uuid=\");\n\t\tbuilder.append(uuid);\n\t\tbuilder.append(\", body=\");\n\t\tbuilder.append(Arrays.toString(body));\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n\n\t/**\n\t * A builder for creating a {@link SparkplugBPayload} instance.\n\t */\n\tpublic static class SparkplugBPayloadBuilder {\n\n\t\tprivate Date timestamp;\n\t\tprivate List<Metric> metrics;\n\t\tprivate Long seq = null;\n\t\tprivate String uuid;\n\t\tprivate byte[] body;\n\n\t\tpublic SparkplugBPayloadBuilder(Long sequenceNumber) {\n\t\t\tthis.seq = sequenceNumber;\n\t\t\tmetrics = new ArrayList<Metric>();\n\t\t}\n\n\t\tpublic SparkplugBPayloadBuilder() {\n\t\t\tmetrics = new ArrayList<Metric>();\n\t\t}\n\n\t\tpublic SparkplugBPayloadBuilder addMetric(Metric metric) {\n\t\t\tthis.metrics.add(metric);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic SparkplugBPayloadBuilder addMetrics(Collection<Metric> metrics) {\n\t\t\tthis.metrics.addAll(metrics);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic SparkplugBPayloadBuilder setTimestamp(Date timestamp) {\n\t\t\tthis.timestamp = timestamp;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic SparkplugBPayloadBuilder setSeq(Long seq) {\n\t\t\tthis.seq = seq;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic SparkplugBPayloadBuilder setUuid(String uuid) {\n\t\t\tthis.uuid = uuid;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic SparkplugBPayloadBuilder setBody(byte[] body) {\n\t\t\tthis.body = body;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic SparkplugBPayload createPayload() {\n\t\t\treturn new SparkplugBPayload(timestamp, metrics, seq, uuid, body);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/SparkplugBPayloadMap.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.util.ArrayList;\nimport java.util.Arrays;\nimport java.util.Collection;\nimport java.util.Date;\nimport java.util.List;\nimport java.util.Map;\nimport java.util.concurrent.ConcurrentHashMap;\n\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\nimport com.fasterxml.jackson.annotation.JsonIgnore;\n\n/**\n * A class representing a Sparkplug B payload as a {@link Map} to prevent duplication of {@link Metric}s. 
This can be\n * useful for Sparkplug BIRTH payloads\n */\npublic class SparkplugBPayloadMap extends SparkplugBPayload {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(SparkplugBPayloadMap.class.getName());\n\n\tprivate final ConcurrentHashMap<String, Metric> metricMap;\n\n\tprivate final Object mapLock = new Object();\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic SparkplugBPayloadMap() {\n\t\tsuper();\n\t\tmetricMap = new ConcurrentHashMap<>();\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param timestamp the overall {@link Date} timestamp of the {@link SparkplugBPayload}\n\t * @param metrics a {@link List} of {@link Metric}s in the {@link SparkplugBPayload}\n\t * @param seq the Sparkplug sequence number for the {@link SparkplugBPayload}\n\t * @param uuid a UUID for the {@link SparkplugBPayload}\n\t * @param body an array of bytes for the {@link SparkplugBPayload}\n\t */\n\tpublic SparkplugBPayloadMap(Date timestamp, List<Metric> metrics, long seq, String uuid, byte[] body) {\n\t\tsuper(timestamp, null, seq, uuid, body);\n\t\tmetricMap = new ConcurrentHashMap<>();\n\t\tfor (Metric metric : metrics) {\n\t\t\tmetricMap.put(metric.getName(), metric);\n\t\t}\n\t}\n\n\t/**\n\t * Adds a {@link Metric} to the {@link SparkplugBPayload}. If the {@link Metric} is already present with the same\n\t * name, it will be replaced.\n\t *\n\t * @param metric the {@link Metric} to add\n\t */\n\t@Override\n\tpublic void addMetric(Metric metric) {\n\t\tsynchronized (mapLock) {\n\t\t\tmetricMap.put(metric.getName(), metric);\n\t\t}\n\t}\n\n\t/**\n\t * Adds a {@link Metric} to the {@link SparkplugBPayload}. 
If the {@link Metric} is already present with the same\n\t * name, it will be replaced.\n\t *\n\t * @param index ignored by this implementation\n\t * @param metric the {@link Metric} to add\n\t */\n\t@Override\n\tpublic void addMetric(int index, Metric metric) {\n\t\tsynchronized (mapLock) {\n\t\t\tmetricMap.put(metric.getName(), metric);\n\t\t}\n\t}\n\n\t/**\n\t * Adds a {@link List} of {@link Metric}s to the {@link SparkplugBPayloadMap}. If the list of {@link Metric}s has\n\t * metrics with duplicate names, only the last one in the {@link List} will be included in the\n\t * {@link SparkplugBPayloadMap}\n\t *\n\t * @param metrics a {@link List} of {@link Metric}s to add to the {@link SparkplugBPayloadMap}\n\t */\n\t@Override\n\tpublic void addMetrics(List<Metric> metrics) {\n\t\tsynchronized (mapLock) {\n\t\t\tfor (Metric metric : metrics) {\n\t\t\t\taddMetric(metric);\n\t\t\t}\n\t\t}\n\t}\n\n\t/**\n\t * Not used for the {@link SparkplugBPayloadMap}. This will always do nothing and return null.\n\t *\n\t * @param index not used\n\t * @return always null\n\t */\n\t@Override\n\tpublic Metric removeMetric(int index) {\n\t\t// This method isn't valid for the SparkplugBPayloadMap\n\t\tlogger.error(\"removeMetric(int index) isn't supported by the SparkplugBPayloadMap\");\n\t\treturn null;\n\t}\n\n\t/**\n\t * Removes a {@link Metric} by equality to a {@link Metric} in the {@link List} of metrics\n\t *\n\t * @param metric the {@link Metric} to remove\n\t * @return true if the {@link Metric} was removed, otherwise false\n\t */\n\t@Override\n\tpublic boolean removeMetric(Metric metric) {\n\t\tsynchronized (mapLock) {\n\t\t\tif (metric != null) {\n\t\t\t\treturn removeMetric(metric.getName());\n\t\t\t}\n\n\t\t\treturn false;\n\t\t}\n\t}\n\n\t/**\n\t * Removes a {@link Metric} by metric name\n\t *\n\t * @param metricName the {@link String} metricName to remove\n\t * @return true if the {@link Metric} was removed, otherwise false\n\t */\n\tpublic boolean removeMetric(String metricName) {\n\t\tsynchronized (mapLock) {\n\t\t\tif 
(metricName != null) {\n\t\t\t\tMetric removedMetric = metricMap.remove(metricName);\n\t\t\t\tif (removedMetric != null) {\n\t\t\t\t\treturn true;\n\t\t\t\t}\n\t\t\t}\n\n\t\t\treturn false;\n\t\t}\n\t}\n\n\t/**\n\t * Gets a {@link List} of {@link Metric}s in the {@link SparkplugBPayloadMap}\n\t */\n\t@Override\n\tpublic List<Metric> getMetrics() {\n\t\treturn new ArrayList<>(metricMap.values());\n\t}\n\n\t/**\n\t * Gets the number of {@link Metric}s in this {@link SparkplugBPayloadMap}\n\t */\n\t@Override\n\t@JsonIgnore\n\tpublic Integer getMetricCount() {\n\t\treturn metricMap.size();\n\t}\n\n\t/**\n\t * Sets the {@link List} of {@link Metric}s for this {@link SparkplugBPayloadMap}\n\t *\n\t * @param metrics the {@link List} of {@link Metric}s to set for this {@link SparkplugBPayloadMap}\n\t */\n\t@Override\n\tpublic void setMetrics(List<Metric> metrics) {\n\t\tmetricMap.clear();\n\t\tfor (Metric metric : metrics) {\n\t\t\tmetricMap.put(metric.getName(), metric);\n\t\t}\n\t}\n\n\t/**\n\t * Gets a Metric for a given metric name\n\t * \n\t * @param metricName the name of the {@link Metric} to fetch\n\t * \n\t * @return the {@link Metric} with the provided metric name\n\t */\n\tpublic Metric getMetric(String metricName) {\n\t\treturn metricMap.get(metricName);\n\t}\n\n\t/**\n\t * Updates a {@link Metric} value in the {@link SparkplugBPayloadMap}\n\t * \n\t * @param newMetricName the name of the metric. This is required as Aliasing may be enabled and the name may not be set\n\t *            in the {@link Metric}\n\t * @param newMetric the {@link Metric} to update the value of\n\t * @param customProperties a {@link List} of custom {@link Property} entries to copy from the new {@link Metric} if present\n\t */\n\tpublic void updateMetricValue(String newMetricName, Metric newMetric, List<Property<?>> customProperties) {\n\t\tif (newMetric == null) {\n\t\t\tlogger.info(\"Metric '{}' is null during update - removing from cache\", newMetricName);\n\t\t\t// ConcurrentHashMap does not permit null values - remove the entry instead\n\t\t\tmetricMap.remove(newMetricName);\n\t\t\treturn;\n\t\t}\n\n\t\tMetric existingMetric = metricMap.get(newMetricName);\n\n\t\t// Update the 'qualified value' which is the value, quality, and timestamp\n\t\tif (existingMetric != null) {\n\t\t\tif (newMetric.getDataType() == MetricDataType.Template && newMetric.getValue() != null) {\n\t\t\t\tupdateTemplateMetricValues((TemplateMap) (getMetric(newMetricName).getValue()), newMetric,\n\t\t\t\t\t\tcustomProperties);\n\t\t\t} else {\n\t\t\t\texistingMetric.setValue(newMetric.getValue());\n\t\t\t}\n\n\t\t\thandleProps(existingMetric, newMetric, customProperties);\n\t\t\tlogger.trace(\"Updated metric in the map: {}\", existingMetric);\n\t\t} else {\n\t\t\tlogger.trace(\"Adding new metric to cache when updating: {}\", newMetric);\n\t\t\tmetricMap.put(newMetricName, newMetric);\n\t\t}\n\t}\n\n\tprivate void updateTemplateMetricValues(TemplateMap existingTemplateMap, Metric newMetric,\n\t\t\tList<Property<?>> customProperties) {\n\t\tTemplate newTemplate = (Template) newMetric.getValue();\n\t\tList<Metric> newMemberMetrics = newTemplate.getMetrics();\n\t\tif (newMemberMetrics != null && !newMemberMetrics.isEmpty()) {\n\t\t\tfor (Metric newMemberMetric : newMemberMetrics) {\n\t\t\t\tMetric existingMetric = existingTemplateMap.getMetricMap().get(newMemberMetric.getName());\n\t\t\t\tif (newMemberMetric.getDataType() == MetricDataType.Template && newMemberMetric.getValue() != null) {\n\t\t\t\t\tupdateTemplateMetricValues((TemplateMap) existingMetric.getValue(), 
newMemberMetric,\n\t\t\t\t\t\t\tcustomProperties);\n\t\t\t\t} else {\n\t\t\t\t\texistingTemplateMap.getMetricMap().get(newMemberMetric.getName())\n\t\t\t\t\t\t\t.setValue(newMemberMetric.getValue());\n\t\t\t\t}\n\n\t\t\t\thandleProps(existingMetric, newMemberMetric, customProperties);\n\t\t\t}\n\t\t}\n\t}\n\n\tprivate void handleProps(Metric existingMetric, Metric newMetric, List<Property<?>> customProperties) {\n\t\tPropertySet props = existingMetric.getProperties();\n\t\tif (newMetric.getProperties() != null\n\t\t\t\t&& newMetric.getProperties().getPropertyValue(SparkplugMeta.QUALITY_PROP_NAME) != null) {\n\t\t\tif (props == null) {\n\t\t\t\tprops = new PropertySet();\n\t\t\t\texistingMetric.setProperties(props);\n\t\t\t}\n\t\t\tprops.setProperty(SparkplugMeta.QUALITY_PROP_NAME,\n\t\t\t\t\tnewMetric.getProperties().getPropertyValue(SparkplugMeta.QUALITY_PROP_NAME));\n\t\t} else {\n\t\t\tif (props != null) {\n\t\t\t\t// If there is no quality - it is implied good and should be updated as such by simply removing it\n\t\t\t\tprops.remove(SparkplugMeta.QUALITY_PROP_NAME);\n\t\t\t}\n\t\t}\n\t\texistingMetric.setTimestamp(newMetric.getTimestamp());\n\n\t\tif (customProperties != null && !customProperties.isEmpty()) {\n\t\t\tfor (Property<?> customProperty : customProperties) {\n\t\t\t\tif (newMetric.getProperties() != null\n\t\t\t\t\t\t&& newMetric.getProperties().getPropertyValue(customProperty.getName()) != null) {\n\t\t\t\t\tif (props == null) {\n\t\t\t\t\t\tprops = new PropertySet();\n\t\t\t\t\t\texistingMetric.setProperties(props);\n\t\t\t\t\t}\n\t\t\t\t\tprops.setProperty(customProperty.getName(),\n\t\t\t\t\t\t\tnewMetric.getProperties().getPropertyValue(customProperty.getName()));\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\t/**\n\t * Updates all {@link Metric} timestamps to the specified {@link Date} as well as the timestamp for the overall\n\t * {@link SparkplugBPayloadMap}\n\t *\n\t * @param date the {@link Date} timestamp to use for all {@link Metric}s in this 
{@link SparkplugBPayloadMap}\n\t */\n\tpublic void updateMetricTimestamps(Date date) {\n\t\tfor (Metric metric : metricMap.values()) {\n\t\t\tmetric.setTimestamp(date);\n\t\t\tif (metric.getDataType() == MetricDataType.Template && metric.getValue() != null) {\n\t\t\t\tupdateTemplateTimestamps((Template) metric.getValue(), date);\n\t\t\t}\n\t\t}\n\t}\n\n\tprivate void updateTemplateTimestamps(Template template, Date date) {\n\t\tif (template != null && template.getMetrics() != null) {\n\t\t\tfor (Metric metric : template.getMetrics()) {\n\t\t\t\tmetric.setTimestamp(date);\n\t\t\t\tif (metric.getDataType() == MetricDataType.Template && metric.getValue() != null) {\n\t\t\t\t\tupdateTemplateTimestamps((Template) metric.getValue(), date);\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\tpublic void uptickMetricTimestamps(Date birthTimestamp) {\n\t\tfor (Metric metric : metricMap.values()) {\n\t\t\tDate uptickedTimestamp = new Date(metric.getTimestamp().getTime() + 1);\n\t\t\tif (birthTimestamp.before(uptickedTimestamp)) {\n\t\t\t\tmetric.setTimestamp(birthTimestamp);\n\t\t\t} else {\n\t\t\t\tmetric.setTimestamp(uptickedTimestamp);\n\t\t\t}\n\t\t\tif (metric.getDataType() == MetricDataType.Template && metric.getValue() != null) {\n\t\t\t\tuptickTemplateTimestamps((Template) metric.getValue(), birthTimestamp);\n\t\t\t}\n\t\t}\n\t}\n\n\tprivate void uptickTemplateTimestamps(Template template, Date birthTimestamp) {\n\t\tif (template != null && template.getMetrics() != null) {\n\t\t\tfor (Metric metric : template.getMetrics()) {\n\t\t\t\tDate uptickedTimestamp = new Date(metric.getTimestamp().getTime() + 1);\n\t\t\t\tif (birthTimestamp.before(uptickedTimestamp)) {\n\t\t\t\t\tmetric.setTimestamp(birthTimestamp);\n\t\t\t\t} else {\n\t\t\t\t\tmetric.setTimestamp(uptickedTimestamp);\n\t\t\t\t}\n\t\t\t\tif (metric.getDataType() == MetricDataType.Template && metric.getValue() != null) {\n\t\t\t\t\t// Recurse into nested templates with the same uptick logic\n\t\t\t\t\tuptickTemplateTimestamps((Template) metric.getValue(), 
birthTimestamp);\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"SparkplugBPayloadMap [timestamp=\");\n\t\tbuilder.append(super.getTimestamp() != null ? super.getTimestamp().getTime() : \"null\");\n\t\tbuilder.append(\", metrics=\");\n\t\tbuilder.append(getMetrics());\n\t\tbuilder.append(\", seq=\");\n\t\tbuilder.append(super.getSeq());\n\t\tbuilder.append(\", uuid=\");\n\t\tbuilder.append(super.getUuid());\n\t\tbuilder.append(\", body=\");\n\t\tbuilder.append(Arrays.toString(super.getBody()));\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n\n\t/**\n\t * A builder for creating a {@link SparkplugBPayloadMap} instance.\n\t */\n\tpublic static class SparkplugBPayloadMapBuilder {\n\n\t\tprivate Date timestamp;\n\t\tprivate List<Metric> metrics;\n\t\tprivate long seq = -1;\n\t\tprivate String uuid;\n\t\tprivate byte[] body;\n\n\t\tpublic SparkplugBPayloadMapBuilder(long sequenceNumber) {\n\t\t\tthis.seq = sequenceNumber;\n\t\t\tmetrics = new ArrayList<Metric>();\n\t\t}\n\n\t\tpublic SparkplugBPayloadMapBuilder() {\n\t\t\tmetrics = new ArrayList<Metric>();\n\t\t}\n\n\t\tpublic SparkplugBPayloadMapBuilder addMetric(Metric metric) {\n\t\t\tthis.metrics.add(metric);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic SparkplugBPayloadMapBuilder addMetrics(Collection<Metric> metrics) {\n\t\t\tthis.metrics.addAll(metrics);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic SparkplugBPayloadMapBuilder setTimestamp(Date timestamp) {\n\t\t\tthis.timestamp = timestamp;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic SparkplugBPayloadMapBuilder setSeq(long seq) {\n\t\t\tthis.seq = seq;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic SparkplugBPayloadMapBuilder setUuid(String uuid) {\n\t\t\tthis.uuid = uuid;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic SparkplugBPayloadMapBuilder setBody(byte[] body) {\n\t\t\tthis.body = body;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic SparkplugBPayloadMap 
createPayload() {\n\t\t\treturn new SparkplugBPayloadMap(timestamp, metrics, seq, uuid, body);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/SparkplugDescriptor.java",
    "content": "/********************************************************************************\n * Copyright (c) 2020-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\npublic interface SparkplugDescriptor {\n\n\t/**\n\t * Returns the String representation of this {@link SparkplugDescriptor}\n\t *\n\t * @return the String representation of this {@link SparkplugDescriptor}\n\t */\n\tpublic String getDescriptorString();\n\n\t/**\n\t * Returns the Group ID for this {@link SparkplugDescriptor}\n\t *\n\t * @return the String representation of the Group ID for this {@link SparkplugDescriptor}\n\t */\n\tpublic String getGroupId();\n\n\t/**\n\t * Returns the Edge Node ID for this {@link SparkplugDescriptor}\n\t *\n\t * @return the String representation of the Edge Node ID for this {@link SparkplugDescriptor}\n\t */\n\tpublic String getEdgeNodeId();\n\n\t/**\n\t * Returns true if this is a DeviceDescriptor, otherwise false\n\t *\n\t * @return true if this is a DeviceDescriptor, otherwise false\n\t */\n\tpublic boolean isDeviceDescriptor();\n\n\t/**\n\t * Returns the Device ID for this {@link SparkplugDescriptor}\n\t *\n\t * @return the String representation of the Device ID for this {@link SparkplugDescriptor}\n\t */\n\tpublic String getDeviceId();\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/SparkplugMeta.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\npublic class SparkplugMeta {\n\n\t/**\n\t * The root MQTT topic token for all Sparkplug B Messages\n\t */\n\tpublic static final String SPARKPLUG_B_TOPIC_PREFIX = \"spBv1.0\";\n\n\t/**\n\t * The Host Application MQTT topic token\n\t */\n\tpublic static final String SPARKPLUG_TOPIC_HOST_STATE_TOKEN = \"STATE\";\n\n\t/**\n\t * The full Host Application MQTT token prefix\n\t */\n\tpublic static final String SPARKPLUG_TOPIC_HOST_STATE_PREFIX =\n\t\t\tSPARKPLUG_B_TOPIC_PREFIX + \"/\" + SPARKPLUG_TOPIC_HOST_STATE_TOKEN;\n\n\t/**\n\t * The Sparkplug sequence number key for {@link Metric}s\n\t */\n\tpublic static final String SPARKPLUG_SEQUENCE_NUMBER_KEY = \"seq\";\n\n\t/**\n\t * The Sparkplug Birth/Death (BD) sequence number key used in Edge Node NBIRTH and NDEATH messages\n\t */\n\tpublic static final String SPARKPLUG_BD_SEQUENCE_NUMBER_KEY = \"bdSeq\";\n\n\t/**\n\t * The Sparkplug quality key\n\t */\n\tpublic static final String QUALITY_PROP_NAME = \"Quality\";\n\n\t/**\n\t * The Sparkplug 'Node Control' Metric prefix\n\t */\n\tpublic static final String METRIC_NODE_CONTROL = \"Node Control\";\n\n\t/**\n\t * The Sparkplug 'Node Control/Rebirth' Metric name\n\t */\n\tpublic static final String METRIC_NODE_REBIRTH = METRIC_NODE_CONTROL + \"/\" + \"Rebirth\";\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/StatePayload.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport com.fasterxml.jackson.annotation.JsonProperty;\n\n/**\n * A class to represent Sparkplug Host Application STATE payloads\n */\npublic class StatePayload {\n\n\t@JsonProperty(\"online\")\n\tprivate Boolean online;\n\n\t@JsonProperty(\"timestamp\")\n\tprivate Long timestamp;\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic StatePayload() {\n\t\tthis.online = null;\n\t\tthis.timestamp = null;\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param online whether this {@link StatePayload} represents an online Host Application\n\t * @param timestamp the timestamp of this STATE payload\n\t */\n\tpublic StatePayload(Boolean online, Long timestamp) {\n\t\tsuper();\n\t\tthis.online = online;\n\t\tthis.timestamp = timestamp;\n\t}\n\n\t/**\n\t * Gets the online status of this {@link StatePayload}\n\t *\n\t * @return true if this {@link StatePayload} is online, otherwise false\n\t */\n\tpublic Boolean isOnline() {\n\t\treturn online;\n\t}\n\n\t/**\n\t * Sets the online status of this {@link StatePayload}\n\t *\n\t * @param online true if this payload is representing an online Host Application, otherwise false\n\t */\n\tpublic void setOnline(Boolean online) {\n\t\tthis.online = online;\n\t}\n\n\t/**\n\t * Gets the timestamp of this {@link StatePayload} as the number of milliseconds since epoch\n\t *\n\t * @return the timestamp of this {@link StatePayload} as the number of milliseconds since 
epoch\n\t */\n\tpublic Long getTimestamp() {\n\t\treturn timestamp;\n\t}\n\n\t/**\n\t * Sets the timestamp of this {@link StatePayload} as the number of milliseconds since epoch\n\t *\n\t * @param timestamp the timestamp of this {@link StatePayload} to set as the number of milliseconds since epoch\n\t */\n\tpublic void setTimestamp(Long timestamp) {\n\t\tthis.timestamp = timestamp;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"StatePayload [online=\");\n\t\tbuilder.append(online);\n\t\tbuilder.append(\", timestamp=\");\n\t\tbuilder.append(timestamp);\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/Template.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.util.ArrayList;\nimport java.util.Collection;\nimport java.util.List;\n\nimport org.eclipse.tahu.SparkplugException;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\n\nimport com.fasterxml.jackson.annotation.JsonGetter;\nimport com.fasterxml.jackson.annotation.JsonInclude;\nimport com.fasterxml.jackson.annotation.JsonInclude.Include;\nimport com.fasterxml.jackson.annotation.JsonProperty;\nimport com.fasterxml.jackson.annotation.JsonSetter;\n\n/**\n * A class representing a template associated with a metric\n */\n@JsonInclude(Include.NON_NULL)\npublic class Template {\n\n\t/**\n\t * The Template version.\n\t */\n\t@JsonProperty(\"version\")\n\tprivate String version;\n\n\t/**\n\t * The template reference\n\t */\n\t@JsonProperty(\"reference\")\n\tprivate String templateRef;\n\n\t/**\n\t * True if the template is a definition, false otherwise.\n\t */\n\t@JsonProperty(\"isDefinition\")\n\tprivate boolean isDefinition;\n\n\t/**\n\t * List of metrics.\n\t */\n\t@JsonProperty(\"metrics\")\n\tprivate List<Metric> metrics;\n\n\t/**\n\t * List of parameters.\n\t */\n\t@JsonProperty(\"parameters\")\n\t@JsonInclude(Include.NON_EMPTY)\n\tprivate List<Parameter> parameters;\n\n\tpublic Template() {\n\t}\n\n\t/**\n\t * Constructor\n\t * \n\t * @param version the template version\n\t * @param templateRef a template reference\n\t * @param 
isDefinition a flag indicating if this is a template definition\n\t * @param metrics a list of metrics\n\t * @param parameters a list of parameters\n\t */\n\tpublic Template(String version, String templateRef, boolean isDefinition, List<Metric> metrics,\n\t\t\tList<Parameter> parameters) {\n\t\tthis.version = version;\n\t\tthis.templateRef = templateRef;\n\t\tthis.isDefinition = isDefinition;\n\t\tthis.metrics = metrics;\n\t\tthis.parameters = parameters;\n\t}\n\n\t/**\n\t * Gets the version of this {@link Template}\n\t *\n\t * @return the version of this {@link Template}\n\t */\n\tpublic String getVersion() {\n\t\treturn version;\n\t}\n\n\t/**\n\t * Sets the version of this {@link Template}\n\t *\n\t * @param version the version to set for this {@link Template}\n\t */\n\tpublic void setVersion(String version) {\n\t\tthis.version = version;\n\t}\n\n\t/**\n\t * Gets the template reference of this {@link Template}\n\t *\n\t * @return the template reference of this {@link Template}\n\t */\n\tpublic String getTemplateRef() {\n\t\treturn templateRef;\n\t}\n\n\t/**\n\t * Sets the template reference of this {@link Template}\n\t *\n\t * @param templateRef the template reference to set for this {@link Template}\n\t */\n\tpublic void setTemplateRef(String templateRef) {\n\t\tthis.templateRef = templateRef;\n\t}\n\n\t/**\n\t * Gets whether or not this {@link Template} is a definition\n\t *\n\t * @return true if this is a definition, otherwise false (meaning it is an instance)\n\t */\n\t@JsonGetter(\"isDefinition\")\n\tpublic boolean isDefinition() {\n\t\treturn isDefinition;\n\t}\n\n\t/**\n\t * Sets whether or not this {@link Template} is a definition\n\t *\n\t * @param isDefinition a boolean denoting if this is a {@link Template} definition or instance\n\t */\n\t@JsonSetter(\"isDefinition\")\n\tpublic void setDefinition(boolean isDefinition) {\n\t\tthis.isDefinition = isDefinition;\n\t}\n\n\t/**\n\t * Gets the {@link List} of {@link Metric}s associated with the 
{@link Template}\n\t *\n\t * @return the {@link List} of {@link Metric}s associated with the {@link Template}\n\t */\n\tpublic List<Metric> getMetrics() {\n\t\treturn metrics;\n\t}\n\n\t/**\n\t * Sets the {@link List} of {@link Metric}s for this {@link Template}\n\t *\n\t * @param metrics the {@link List} of {@link Metric}s to set for this {@link Template}\n\t */\n\tpublic void setMetrics(List<Metric> metrics) {\n\t\tthis.metrics = metrics;\n\t}\n\n\t/**\n\t * Adds a {@link Metric} to the end of the {@link List} of Sparkplug metrics\n\t *\n\t * @param metric a {@link Metric} to add to the end of the {@link List} of Sparkplug metrics\n\t */\n\tpublic void addMetric(Metric metric) {\n\t\tthis.metrics.add(metric);\n\t}\n\n\t/**\n\t * Gets the {@link List} of {@link Parameter}s associated with the {@link Template}\n\t *\n\t * @return the {@link List} of {@link Parameter}s associated with the {@link Template}\n\t */\n\tpublic List<Parameter> getParameters() {\n\t\treturn parameters;\n\t}\n\n\t/**\n\t * Sets the {@link List} of {@link Parameter}s for this {@link Template}\n\t *\n\t * @param parameters the {@link List} of {@link Parameter}s to set for this {@link Template}\n\t */\n\tpublic void setParameters(List<Parameter> parameters) {\n\t\tthis.parameters = parameters;\n\t}\n\n\t/**\n\t * Adds a {@link Parameter} to this {@link Template}\n\t * @param parameter a {@link Parameter} to add to this {@link Template}\n\t */\n\tpublic void addParameter(Parameter parameter) {\n\t\tthis.parameters.add(parameter);\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"Template [version=\");\n\t\tbuilder.append(version);\n\t\tbuilder.append(\", templateRef=\");\n\t\tbuilder.append(templateRef);\n\t\tbuilder.append(\", isDefinition=\");\n\t\tbuilder.append(isDefinition);\n\t\tbuilder.append(\", metrics=\");\n\t\tbuilder.append(metrics);\n\t\tbuilder.append(\", 
parameters=\");\n\t\tbuilder.append(parameters);\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n\n\t/**\n\t * A builder for creating a {@link Template} instance.\n\t */\n\tpublic static class TemplateBuilder {\n\n\t\tprivate String version;\n\t\tprivate String templateRef;\n\t\tprivate boolean isDefinition;\n\t\tprivate List<Metric> metrics;\n\t\tprivate List<Parameter> parameters;\n\n\t\t/**\n\t\t * @param name\n\t\t * @param version\n\t\t * @param templateRef\n\t\t * @param isDefinition\n\t\t * @param metrics\n\t\t * @param parameters\n\t\t */\n\t\tpublic TemplateBuilder() {\n\t\t\tsuper();\n\t\t\tthis.metrics = new ArrayList<Metric>();\n\t\t\tthis.parameters = new ArrayList<Parameter>();\n\t\t}\n\n\t\tpublic TemplateBuilder(Template template) throws SparkplugException {\n\t\t\tthis.version = template.getVersion();\n\t\t\tthis.templateRef = template.getTemplateRef();\n\t\t\tthis.isDefinition = template.isDefinition();\n\t\t\tthis.metrics = new ArrayList<Metric>(template.getMetrics().size());\n\t\t\tfor (Metric metric : template.getMetrics()) {\n\t\t\t\tthis.metrics.add(new MetricBuilder(metric).createMetric());\n\t\t\t}\n\t\t\tthis.parameters = new ArrayList<Parameter>(template.getParameters().size());\n\t\t\tfor (Parameter parameter : template.getParameters()) {\n\t\t\t\tthis.parameters.add(new Parameter(parameter.getName(), parameter.getType(), parameter.getValue()));\n\t\t\t}\n\t\t}\n\n\t\tpublic TemplateBuilder version(String version) {\n\t\t\tthis.version = version;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic TemplateBuilder templateRef(String templateRef) {\n\t\t\tthis.templateRef = templateRef;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic TemplateBuilder definition(boolean isDefinition) {\n\t\t\tthis.isDefinition = isDefinition;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic TemplateBuilder addMetric(Metric metric) {\n\t\t\tthis.metrics.add(metric);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic TemplateBuilder addMetrics(Collection<Metric> metrics) 
{\n\t\t\tthis.metrics.addAll(metrics);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic TemplateBuilder addParameter(Parameter parameter) {\n\t\t\tthis.parameters.add(parameter);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic TemplateBuilder addParameters(Collection<Parameter> parameters) {\n\t\t\tthis.parameters.addAll(parameters);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic Template createTemplate() {\n\t\t\treturn new Template(version, templateRef, isDefinition, metrics, parameters);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/TemplateMap.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport java.util.ArrayList;\nimport java.util.Collection;\nimport java.util.Collections;\nimport java.util.Date;\nimport java.util.List;\nimport java.util.Map;\nimport java.util.concurrent.ConcurrentHashMap;\n\nimport org.eclipse.tahu.SparkplugException;\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\nimport org.eclipse.tahu.message.model.DataSet.DataSetBuilder;\nimport org.eclipse.tahu.message.model.MetaData.MetaDataBuilder;\nimport org.eclipse.tahu.message.model.PropertySet.PropertySetBuilder;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\n/**\n * A class representing a {@link Map} of {@link Template} {@link Metric}s\n */\npublic class TemplateMap extends Template {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(TemplateMap.class.getName());\n\n\tprivate final Map<String, Metric> metricMap;\n\n\tprivate final Object mapLock = new Object();\n\n\tpublic TemplateMap() {\n\t\tmetricMap = new ConcurrentHashMap<>();\n\t}\n\n\t/**\n\t * Constructor\n\t * \n\t * @param version the template version\n\t * @param templateRef a template reference\n\t * @param isDefinition a flag indicating if this is a template definition\n\t * @param metricMap a map of metrics keyed by metric name\n\t * @param parameters a list of parameters\n\t */\n\tpublic TemplateMap(String version, String templateRef, boolean isDefinition, Map<String, Metric> 
metricMap,\n\t\t\tList<Parameter> parameters) {\n\t\tsuper(version, templateRef, isDefinition, null, parameters);\n\t\tthis.metricMap = metricMap;\n\t}\n\n\t/**\n\t * Gets an unmodifiable {@link Map} of the {@link Metric}s\n\t *\n\t * @return an unmodifiable {@link Map} of the {@link Metric}s\n\t */\n\tpublic Map<String, Metric> getMetricMap() {\n\t\tsynchronized (mapLock) {\n\t\t\treturn Collections.unmodifiableMap(metricMap);\n\t\t}\n\t}\n\n\t/**\n\t * Updates a {@link Metric} in the {@link TemplateMap}\n\t *\n\t * @param metricName the name of the {@link Metric} to update\n\t * @param metric the {@link Metric} to place in the {@link Map}\n\t */\n\tpublic void updateMetric(String metricName, Metric metric) {\n\t\tsynchronized (mapLock) {\n\t\t\tmetricMap.put(metricName, metric);\n\t\t}\n\t}\n\n\t/**\n\t * Gets the {@link List} of {@link Metric}s in this {@link TemplateMap}\n\t */\n\t@Override\n\tpublic List<Metric> getMetrics() {\n\t\tsynchronized (mapLock) {\n\t\t\treturn new ArrayList<>(metricMap.values());\n\t\t}\n\t}\n\n\t/**\n\t * Sets the {@link Map} of {@link Metric}s using a {@link List}. 
If any metric names are duplicated in the list,\n\t * only the last one will be placed in the map.\n\t */\n\t@Override\n\tpublic void setMetrics(List<Metric> metrics) {\n\t\tsynchronized (mapLock) {\n\t\t\tfor (Metric metric : metrics) {\n\t\t\t\taddMetric(metric);\n\t\t\t}\n\t\t}\n\t}\n\n\t/**\n\t * Adds an {@link Metric} to the end of the {@link TemplateMap}\n\t *\n\t * @param metric the {@link Metric} to add to the {@link TemplateMap}\n\t */\n\t@Override\n\tpublic void addMetric(Metric metric) {\n\t\tsynchronized (mapLock) {\n\t\t\tmetricMap.put(metric.getName(), metric);\n\t\t}\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"TemplateMap [version=\");\n\t\tbuilder.append(super.getVersion());\n\t\tbuilder.append(\", templateRef=\");\n\t\tbuilder.append(super.getTemplateRef());\n\t\tbuilder.append(\", isDefinition=\");\n\t\tbuilder.append(super.isDefinition());\n\t\tbuilder.append(\", metrics=\");\n\t\tbuilder.append(metricMap);\n\t\tbuilder.append(\", parameters=\");\n\t\tbuilder.append(super.getParameters());\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n\n\t/**\n\t * A builder for creating a {@link TemplateMap} instance.\n\t */\n\tpublic static class TemplateMapBuilder {\n\n\t\tprivate String version;\n\t\tprivate String templateRef;\n\t\tprivate boolean isDefinition;\n\t\tprivate Map<String, Metric> metricMap;\n\t\tprivate List<Parameter> parameters;\n\n\t\t/**\n\t\t * @param name\n\t\t * @param version\n\t\t * @param templateRef\n\t\t * @param isDefinition\n\t\t * @param metrics\n\t\t * @param parameters\n\t\t */\n\t\tpublic TemplateMapBuilder() {\n\t\t\tsuper();\n\t\t\tthis.metricMap = new ConcurrentHashMap<>();\n\t\t\tthis.parameters = new ArrayList<>();\n\t\t}\n\n\t\tpublic TemplateMapBuilder(Template template) throws SparkplugException {\n\t\t\tthis.version = template.getVersion();\n\t\t\tthis.templateRef = template.getTemplateRef();\n\t\t\tthis.isDefinition = 
template.isDefinition();\n\n\t\t\tthis.metricMap = new ConcurrentHashMap<>(template.getMetrics().size());\n\t\t\tfor (Metric metric : template.getMetrics()) {\n\t\t\t\tlogger.trace(\"Adding metric '{}' when converting Template to TemplateMap\", metric.getName());\n\t\t\t\tthis.metricMap.put(metric.getName(), new CustomMetricBuilder(metric).createMetric());\n\t\t\t}\n\t\t\tlogger.trace(\"MetricMap after conversion: {}\", metricMap);\n\t\t\tthis.parameters = new ArrayList<Parameter>(template.getParameters().size());\n\t\t\tfor (Parameter parameter : template.getParameters()) {\n\t\t\t\tthis.parameters.add(new Parameter(parameter.getName(), parameter.getType(), parameter.getValue()));\n\t\t\t}\n\t\t}\n\n\t\tpublic TemplateMapBuilder(TemplateMap templateMap) throws SparkplugException {\n\t\t\tthis.version = templateMap.getVersion();\n\t\t\tthis.templateRef = templateMap.getTemplateRef();\n\t\t\tthis.isDefinition = templateMap.isDefinition();\n\t\t\tthis.metricMap = new ConcurrentHashMap<>(templateMap.getMetrics().size());\n\t\t\tfor (Metric metric : templateMap.getMetrics()) {\n\t\t\t\tthis.metricMap.put(metric.getName(), new CustomMetricBuilder(metric).createMetric());\n\t\t\t}\n\t\t\tthis.parameters = new ArrayList<Parameter>(templateMap.getParameters().size());\n\t\t\tfor (Parameter parameter : templateMap.getParameters()) {\n\t\t\t\tthis.parameters.add(new Parameter(parameter.getName(), parameter.getType(), parameter.getValue()));\n\t\t\t}\n\t\t}\n\n\t\tpublic TemplateMapBuilder version(String version) {\n\t\t\tthis.version = version;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic TemplateMapBuilder templateRef(String templateRef) {\n\t\t\tthis.templateRef = templateRef;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic TemplateMapBuilder definition(boolean isDefinition) {\n\t\t\tthis.isDefinition = isDefinition;\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic TemplateMapBuilder addParameter(Parameter parameter) {\n\t\t\tthis.parameters.add(parameter);\n\t\t\treturn 
this;\n\t\t}\n\n\t\tpublic TemplateMapBuilder addParameters(Collection<Parameter> parameters) {\n\t\t\tthis.parameters.addAll(parameters);\n\t\t\treturn this;\n\t\t}\n\n\t\tpublic TemplateMap createTemplateMap() {\n\t\t\treturn new TemplateMap(version, templateRef, isDefinition, metricMap, parameters);\n\t\t}\n\t}\n\n\t/**\n\t * A builder for creating a {@link Metric} instance.\n\t */\n\tpublic static class CustomMetricBuilder {\n\n\t\tprivate String name;\n\t\tprivate Long alias;\n\t\tprivate Date timestamp;\n\t\tprivate MetricDataType dataType;\n\t\tprivate Boolean isHistorical;\n\t\tprivate Boolean isTransient;\n\t\tprivate MetaData metaData = null;\n\t\tprivate PropertySet properties = null;\n\t\tprivate Object value;\n\n\t\tpublic CustomMetricBuilder(Metric metric) throws SparkplugException {\n\t\t\tthis.name = metric.getName();\n\t\t\tthis.alias = metric.getAlias();\n\t\t\tthis.timestamp = metric.getTimestamp();\n\t\t\tthis.dataType = metric.getDataType();\n\t\t\tthis.isHistorical = metric.isHistorical();\n\t\t\tthis.isTransient = metric.isTransient();\n\t\t\tthis.metaData =\n\t\t\t\t\tmetric.getMetaData() != null ? new MetaDataBuilder(metric.getMetaData()).createMetaData() : null;\n\t\t\tthis.properties = metric.getProperties() != null\n\t\t\t\t\t? new PropertySetBuilder(metric.getProperties()).createPropertySet()\n\t\t\t\t\t: null;\n\t\t\tswitch (dataType) {\n\t\t\t\tcase DataSet:\n\t\t\t\t\tthis.value = metric.getValue() != null\n\t\t\t\t\t\t\t? new DataSetBuilder((DataSet) metric.getValue()).createDataSet()\n\t\t\t\t\t\t\t: null;\n\t\t\t\t\tbreak;\n\t\t\t\tcase Template:\n\t\t\t\t\tthis.value = metric.getValue() != null\n\t\t\t\t\t\t\t? 
new TemplateMapBuilder((TemplateMap) metric.getValue()).createTemplateMap()\n\t\t\t\t\t\t\t: null;\n\t\t\t\t\tbreak;\n\t\t\t\tdefault:\n\t\t\t\t\tthis.value = metric.getValue();\n\t\t\t}\n\t\t}\n\n\t\tpublic Metric createMetric() throws SparkplugInvalidTypeException {\n\t\t\treturn new Metric(name, alias, timestamp, dataType, isHistorical, isTransient, metaData, properties, value);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/Topic.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\nimport org.eclipse.tahu.exception.TahuErrorCode;\nimport org.eclipse.tahu.exception.TahuException;\n\nimport com.fasterxml.jackson.annotation.JsonIgnore;\nimport com.fasterxml.jackson.annotation.JsonInclude;\nimport com.fasterxml.jackson.annotation.JsonInclude.Include;\n\n/**\n * A Sparkplug MQTT Topic\n */\n@JsonInclude(Include.NON_NULL)\npublic class Topic {\n\n\t/**\n\t * The Sparkplug namespace version.\n\t */\n\tprivate final String namespace;\n\n\t/**\n\t * The {@link SparkplugDesciptor} for this Edge Node or Device\n\t */\n\t@JsonIgnore\n\tprivate final SparkplugDescriptor sparkplugDescriptor;\n\n\t/**\n\t * The {@link EdgeNodeDescriptor} for this Edge Node or Device\n\t */\n\tprivate final EdgeNodeDescriptor edgeNodeDescriptor;\n\n\t/**\n\t * The ID of the logical grouping of Edge of Network (EoN) Nodes and devices.\n\t */\n\tprivate final String groupId;\n\n\t/**\n\t * The ID of the Edge of Network (EoN) Node.\n\t */\n\tprivate final String edgeNodeId;\n\n\t/**\n\t * The ID of the device.\n\t */\n\tprivate final String deviceId;\n\n\t/**\n\t * The ID if this is a Sparkplug Host Application topic\n\t */\n\tprivate final String hostApplicationId;\n\n\t/**\n\t * The message type.\n\t */\n\tprivate final MessageType type;\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic Topic() {\n\t\tthis.namespace = null;\n\t\tthis.sparkplugDescriptor = null;\n\t\tthis.edgeNodeDescriptor = 
null;\n\t\tthis.groupId = null;\n\t\tthis.edgeNodeId = null;\n\t\tthis.deviceId = null;\n\t\tthis.hostApplicationId = null;\n\t\tthis.type = null;\n\t}\n\n\t/**\n\t * A Constructor for Device Topics\n\t * \n\t * @param namespace the namespace\n\t * @param groupId the Group ID\n\t * @param edgeNodeId the Edge Node ID\n\t * @param deviceId the Device ID\n\t * @param type the message type\n\t */\n\tpublic Topic(String namespace, String groupId, String edgeNodeId, String deviceId, MessageType type) {\n\t\tsuper();\n\t\tthis.namespace = namespace;\n\t\tthis.sparkplugDescriptor = deviceId == null\n\t\t\t\t? new EdgeNodeDescriptor(groupId, edgeNodeId)\n\t\t\t\t: new DeviceDescriptor(groupId, edgeNodeId, deviceId);\n\t\tthis.edgeNodeDescriptor = new EdgeNodeDescriptor(groupId, edgeNodeId);\n\t\tthis.groupId = groupId;\n\t\tthis.edgeNodeId = edgeNodeId;\n\t\tthis.deviceId = deviceId;\n\t\tthis.hostApplicationId = null;\n\t\tthis.type = type;\n\t}\n\n\t/**\n\t * A Constructor for Edge Node Topics\n\t * \n\t * @param namespace the namespace\n\t * @param groupId the group ID\n\t * @param edgeNodeId the edge node ID\n\t * @param type the message type\n\t */\n\tpublic Topic(String namespace, String groupId, String edgeNodeId, MessageType type) {\n\t\tsuper();\n\t\tthis.namespace = namespace;\n\t\tthis.sparkplugDescriptor = new EdgeNodeDescriptor(groupId, edgeNodeId);\n\t\tthis.edgeNodeDescriptor = new EdgeNodeDescriptor(groupId, edgeNodeId);\n\t\tthis.groupId = groupId;\n\t\tthis.edgeNodeId = edgeNodeId;\n\t\tthis.deviceId = null;\n\t\tthis.hostApplicationId = null;\n\t\tthis.type = type;\n\t}\n\n\t/**\n\t * A Constructor for Device Topics\n\t * \n\t * @param namespace the namespace\n\t * @param deviceDescriptor the {@link EdgeNodeDescriptor}\n\t * @param type the message type\n\t */\n\tpublic Topic(String namespace, DeviceDescriptor deviceDescriptor, MessageType type) {\n\t\tthis(namespace, deviceDescriptor.getGroupId(), deviceDescriptor.getEdgeNodeId(), 
deviceDescriptor.getDeviceId(),\n\t\t\t\ttype);\n\t}\n\n\t/**\n\t * A Constructor for Edge Node Topics\n\t * \n\t * @param namespace the namespace\n\t * @param edgeNodeDescriptor the {@link EdgeNodeDescriptor}\n\t * @param type the message type\n\t */\n\tpublic Topic(String namespace, EdgeNodeDescriptor edgeNodeDescriptor, MessageType type) {\n\t\tthis(namespace, edgeNodeDescriptor.getGroupId(), edgeNodeDescriptor.getEdgeNodeId(), type);\n\t}\n\n\t/**\n\t * A Constructor for Host Application Topics\n\t *\n\t * @param namespace the namespace\n\t * @param hostApplicationId the Host Application ID\n\t * @param type the message type\n\t */\n\tpublic Topic(String namespace, String hostApplicationId, MessageType type) {\n\t\tsuper();\n\t\tthis.namespace = namespace;\n\t\tthis.hostApplicationId = hostApplicationId;\n\t\tthis.type = type;\n\t\tthis.sparkplugDescriptor = null;\n\t\tthis.edgeNodeDescriptor = null;\n\t\tthis.groupId = null;\n\t\tthis.edgeNodeId = null;\n\t\tthis.deviceId = null;\n\t}\n\n\t/**\n\t * Parses a Sparkplug topic from an MQTT topic string\n\t *\n\t * @param topicString the MQTT topic string to convert to a {@link Topic}\n\t * @return the {@link Topic} that represents the input MQTT topic string\n\t * @throws TahuException if the MQTT topic string can not be parsed\n\t */\n\tpublic static Topic parseTopic(String topicString) throws TahuException {\n\t\ttry {\n\t\t\tif (topicString == null || topicString.isEmpty()\n\t\t\t\t\t|| !topicString.startsWith(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX) || !topicString.contains(\"/\")) {\n\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\t\"Invalid Sparkplug topic String: ''\" + topicString);\n\t\t\t}\n\n\t\t\tString[] splitTopic = topicString.split(\"/\");\n\t\t\tif (splitTopic.length == 3) {\n\t\t\t\tif (SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX.equals(splitTopic[0])\n\t\t\t\t\t\t&& SparkplugMeta.SPARKPLUG_TOPIC_HOST_STATE_TOKEN.equals(splitTopic[1])) {\n\t\t\t\t\treturn new 
Topic(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX, splitTopic[2], MessageType.STATE);\n\t\t\t\t} else {\n\t\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\t\t\"Invalid Sparkplug STATE topic String: ''\" + topicString);\n\t\t\t\t}\n\t\t\t} else if (splitTopic.length == 4) {\n\t\t\t\tMessageType messageType = MessageType.parseMessageType(splitTopic[2]);\n\t\t\t\tif (SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX.equals(splitTopic[0]) && (messageType == MessageType.NBIRTH\n\t\t\t\t\t\t|| messageType == MessageType.NCMD || messageType == MessageType.NDATA\n\t\t\t\t\t\t|| messageType == MessageType.NDEATH || messageType == MessageType.NRECORD)) {\n\t\t\t\t\treturn new Topic(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX, splitTopic[1], splitTopic[3], messageType);\n\t\t\t\t} else {\n\t\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\t\t\"Invalid Sparkplug Edge Node topic String: ''\" + topicString);\n\t\t\t\t}\n\t\t\t} else if (splitTopic.length == 5) {\n\t\t\t\tMessageType messageType = MessageType.parseMessageType(splitTopic[2]);\n\t\t\t\tif (SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX.equals(splitTopic[0]) && (messageType == MessageType.DBIRTH\n\t\t\t\t\t\t|| messageType == MessageType.DCMD || messageType == MessageType.DDATA\n\t\t\t\t\t\t|| messageType == MessageType.DDEATH || messageType == MessageType.DRECORD)) {\n\t\t\t\t\treturn new Topic(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX, splitTopic[1], splitTopic[3], splitTopic[4],\n\t\t\t\t\t\t\tmessageType);\n\t\t\t\t} else {\n\t\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\t\t\"Invalid Sparkplug Device topic String: ''\" + topicString);\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\t\"Invalid topic String length: ''\" + topicString);\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR, e);\n\t\t}\n\t}\n\n\t/**\n\t * Returns the Sparkplug namespace version.\n\t * \n\t * @return the 
namespace\n\t */\n\tpublic String getNamespace() {\n\t\treturn namespace;\n\t}\n\n\t/**\n\t * Returns the {@link SparkplugDescriptor}\n\t * \n\t * @return the SparkplugDescriptor\n\t */\n\tpublic SparkplugDescriptor getSparkplugDescriptor() {\n\t\treturn sparkplugDescriptor;\n\t}\n\n\t/**\n\t * Returns the {@link EdgeNodeDescriptor}\n\t * \n\t * @return the EdgeNodeDescriptor\n\t */\n\tpublic EdgeNodeDescriptor getEdgeNodeDescriptor() {\n\t\treturn edgeNodeDescriptor;\n\t}\n\n\t/**\n\t * Returns the ID of the logical grouping of Edge of Network (EoN) Nodes and devices.\n\t * \n\t * @return the group ID\n\t */\n\tpublic String getGroupId() {\n\t\treturn groupId;\n\t}\n\n\t/**\n\t * Returns the ID of the Edge of Network (EoN) Node.\n\t * \n\t * @return the edge node ID\n\t */\n\tpublic String getEdgeNodeId() {\n\t\treturn edgeNodeId;\n\t}\n\n\t/**\n\t * Returns the ID of the device.\n\t * \n\t * @return the device ID\n\t */\n\tpublic String getDeviceId() {\n\t\treturn deviceId;\n\t}\n\n\t/**\n\t * Returns the Host Application ID if this is a Host topic\n\t *\n\t * @return the Host Application ID\n\t */\n\tpublic String getHostApplicationId() {\n\t\treturn hostApplicationId;\n\t}\n\n\t/**\n\t * Returns the message type.\n\t * \n\t * @return the message type\n\t */\n\tpublic MessageType getType() {\n\t\treturn type;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder sb = new StringBuilder();\n\t\tif (hostApplicationId == null) {\n\t\t\tsb.append(getNamespace()).append(\"/\").append(getGroupId()).append(\"/\").append(getType()).append(\"/\")\n\t\t\t\t\t.append(getEdgeNodeId());\n\t\t\tif (getDeviceId() != null) {\n\t\t\t\tsb.append(\"/\").append(getDeviceId());\n\t\t\t}\n\t\t} else {\n\t\t\tsb.append(getNamespace()).append(\"/\").append(getType()).append(\"/\").append(hostApplicationId);\n\t\t}\n\t\treturn sb.toString();\n\t}\n\n\t/**\n\t * Returns true if this topic's type matches the passes in type, false otherwise.\n\t * \n\t * @param type the type 
to check\n\t * @return true if this topic's type matches the passed-in type, false otherwise\n\t */\n\tpublic boolean isType(MessageType type) {\n\t\treturn this.type != null && this.type.equals(type);\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/message/model/Value.java",
    "content": "/********************************************************************************\n * Copyright (c) 2014-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.message.model;\n\n/**\n * A class representing a Sparkplug metric {@link DataSet} value\n */\npublic class Value<V> {\n\n\tprivate DataSetDataType type;\n\tprivate V value;\n\n\t/**\n\t * Default Constructor\n\t */\n\tpublic Value() {\n\t\tsuper();\n\t}\n\n\t/**\n\t * Constructor\n\t *\n\t * @param type the {@link DataSetDataType} of this {@link Value}\n\t * @param value the value of this {@link DataSet} value\n\t */\n\tpublic Value(DataSetDataType type, V value) {\n\t\tsuper();\n\t\tthis.type = type;\n\t\tthis.value = value;\n\t}\n\n\t/**\n\t * The {@link DataSetDataType} of this {@link Value}\n\t *\n\t * @return the {@link DataSetDataType} of this {@link Value}\n\t */\n\tpublic DataSetDataType getType() {\n\t\treturn type;\n\t}\n\n\t/**\n\t * Sets the {@link DataSetDataType} of this {@link Value}\n\t *\n\t * @param type the {@link DataSetDataType} to set for this {@link Value}\n\t */\n\tpublic void setType(DataSetDataType type) {\n\t\tthis.type = type;\n\t}\n\n\t/**\n\t * The value of this {@link Value}\n\t *\n\t * @return the value of this {@link Value}\n\t */\n\tpublic V getValue() {\n\t\treturn value;\n\t}\n\n\t/**\n\t * Sets the value of this {@link Value}\n\t *\n\t * @param value the value to set for this {@link Value}\n\t */\n\tpublic void setValue(V value) {\n\t\tthis.value = value;\n\t}\n\n\t@Override\n\tpublic int hashCode() {\n\t\tfinal int prime = 31;\n\t\tint result = 1;\n\t\tresult = 
prime * result + ((type == null) ? 0 : type.hashCode());\n\t\tresult = prime * result + ((value == null) ? 0 : value.hashCode());\n\t\treturn result;\n\t}\n\n\t@Override\n\tpublic boolean equals(Object obj) {\n\t\tif (this == obj)\n\t\t\treturn true;\n\t\tif (obj == null)\n\t\t\treturn false;\n\t\tif (getClass() != obj.getClass())\n\t\t\treturn false;\n\t\tValue<?> other = (Value<?>) obj;\n\t\tif (type != other.type)\n\t\t\treturn false;\n\t\tif (value == null) {\n\t\t\tif (other.value != null)\n\t\t\t\treturn false;\n\t\t} else if (!value.equals(other.value))\n\t\t\treturn false;\n\t\treturn true;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"Value [type=\");\n\t\tbuilder.append(type);\n\t\tbuilder.append(\", value=\");\n\t\tbuilder.append(value);\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/model/MetricDataTypeMap.java",
    "content": "/********************************************************************************\n * Copyright (c) 2023 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\npackage org.eclipse.tahu.model;\n\nimport java.util.Map;\nimport java.util.concurrent.ConcurrentHashMap;\n\nimport org.eclipse.tahu.message.model.MetricDataType;\n\npublic class MetricDataTypeMap {\n\n\tprivate final Map<String, MetricDataType> nameDataTypeMap;\n\tprivate final Map<Long, MetricDataType> aliasDataTypeMap;\n\n\tpublic MetricDataTypeMap() {\n\t\tnameDataTypeMap = new ConcurrentHashMap<>();\n\t\taliasDataTypeMap = new ConcurrentHashMap<>();\n\t}\n\n\tpublic void addMetricDataType(String metricName, MetricDataType metricDataType) {\n\t\tnameDataTypeMap.put(metricName, metricDataType);\n\t}\n\n\tpublic void addMetricDataType(Long alias, MetricDataType metricDataType) {\n\t\taliasDataTypeMap.put(alias, metricDataType);\n\t}\n\n\tpublic MetricDataType getMetricDataType(String metricName) {\n\t\treturn nameDataTypeMap.get(metricName);\n\t}\n\n\tpublic MetricDataType getMetricDataType(Long alias) {\n\t\treturn aliasDataTypeMap.get(alias);\n\t}\n\n\tpublic boolean isEmpty() {\n\t\treturn nameDataTypeMap.isEmpty() && aliasDataTypeMap.isEmpty();\n\t}\n\n\tpublic void clear() {\n\t\tnameDataTypeMap.clear();\n\t\taliasDataTypeMap.clear();\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\tStringBuilder builder = new StringBuilder();\n\t\tbuilder.append(\"MetricDataTypeMap [nameDataTypeMap=\");\n\t\tbuilder.append(nameDataTypeMap);\n\t\tbuilder.append(\", 
aliasDataTypeMap=\");\n\t\tbuilder.append(aliasDataTypeMap);\n\t\tbuilder.append(\"]\");\n\t\treturn builder.toString();\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/model/MetricMap.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.model;\n\nimport java.util.Map;\nimport java.util.concurrent.ConcurrentHashMap;\n\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.MetricDataType;\n\n/**\n * Used to track Sparkplug aliases to Metric names and Metric names to aliases\n */\npublic class MetricMap {\n\n\tprivate final Map<String, Long> metricNameToAliasMap;\n\tprivate final Map<Long, String> aliasToMetricNameMap;\n\tprivate final MetricDataTypeMap metricDataTypeMap;\n\n\tprivate long nextAliasIndex;\n\n\tprivate final Object mapLock = new Object();\n\n\t/**\n\t * Constructor\n\t */\n\tpublic MetricMap() {\n\t\tmetricNameToAliasMap = new ConcurrentHashMap<>();\n\t\taliasToMetricNameMap = new ConcurrentHashMap<>();\n\t\tmetricDataTypeMap = new MetricDataTypeMap();\n\t\tnextAliasIndex = 0;\n\t}\n\n\t/**\n\t * Adds a new metric to the map and generates and returns the new alias that will be unique as required for the\n\t * Edge Node\n\t *\n\t * @param metricName the name of the metric to generate the alias for\n\t * @param metricDataType the MetricDataType associated with the {@link Metric}\n\t *\n\t * @return the generated alias for the supplied Metric name\n\t */\n\tpublic long addGeneratedAlias(String metricName, MetricDataType metricDataType) {\n\t\tsynchronized (mapLock) {\n\t\t\tlong newAlias = nextAliasIndex++;\n\t\t\tmetricNameToAliasMap.put(metricName, 
newAlias);\n\t\t\taliasToMetricNameMap.put(newAlias, metricName);\n\t\t\tmetricDataTypeMap.addMetricDataType(metricName, metricDataType);\n\t\t\treturn newAlias;\n\t\t}\n\t}\n\n\t/**\n\t * Adds a Metric name and alias to the map. The alias must be unique and not already tied to another Metric name\n\t * before calling this method\n\t *\n\t * @param metricName the name of the Metric to add to the map\n\t * @param alias the alias to add to the map and be tied to the Metric name\n\t * @param metricDataType the MetricDataType associated with the {@link Metric}\n\t */\n\tpublic void addAlias(String metricName, Long alias, MetricDataType metricDataType) {\n\t\tsynchronized (mapLock) {\n\t\t\tif (alias != null) {\n\t\t\t\tmetricNameToAliasMap.put(metricName, alias);\n\t\t\t\taliasToMetricNameMap.put(alias, metricName);\n\t\t\t\tmetricDataTypeMap.addMetricDataType(alias, metricDataType);\n\t\t\t}\n\t\t\tmetricDataTypeMap.addMetricDataType(metricName, metricDataType);\n\t\t}\n\t}\n\n\t/**\n\t * Clears the map of all Metric names and aliases\n\t */\n\tpublic void clear() {\n\t\tsynchronized (mapLock) {\n\t\t\tmetricNameToAliasMap.clear();\n\t\t\taliasToMetricNameMap.clear();\n\t\t\tmetricDataTypeMap.clear();\n\t\t\tnextAliasIndex = 0;\n\t\t}\n\t}\n\n\t/**\n\t * Gets the alias for a given Metric name\n\t *\n\t * @param metricName the Metric name associated with the alias\n\t * @return the alias that is associated with the Metric name\n\t */\n\tpublic Long getAlias(String metricName) {\n\t\treturn metricNameToAliasMap.get(metricName);\n\t}\n\n\t/**\n\t * Gets the Metric name for a given alias\n\t *\n\t * @param alias the alias associated with the Metric name\n\t * @return the Metric name that is associated with the alias\n\t */\n\tpublic String getMetricName(long alias) {\n\t\treturn aliasToMetricNameMap.get(alias);\n\t}\n\n\t/**\n\t * Gets the {@link MetricDataType} of this metric\n\t *\n\t * @param metricName the name of the {@link Metric} to get the {@link MetricDataType} of\n\t * @return the {@link MetricDataType} for the supplied Metric name\n\t */\n\tpublic MetricDataType getMetricDataType(String metricName) {\n\t\treturn metricDataTypeMap.getMetricDataType(metricName);\n\t}\n\n\t/**\n\t * Gets the {@link MetricDataType} of this metric\n\t *\n\t * @param alias the alias of the {@link Metric} to get the {@link MetricDataType} of\n\t * @return the {@link MetricDataType} for the supplied Metric alias\n\t */\n\tpublic MetricDataType getMetricDataType(Long alias) {\n\t\treturn metricDataTypeMap.getMetricDataType(alias);\n\t}\n\n\t/**\n\t * Gets the MetricDataTypeMap associated with this Edge Node\n\t *\n\t * @return the {@link MetricDataTypeMap} of this Edge Node\n\t */\n\tpublic MetricDataTypeMap getMetricDataTypeMap() {\n\t\treturn metricDataTypeMap;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/model/MqttServerDefinition.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.model;\n\nimport org.eclipse.tahu.message.model.Topic;\nimport org.eclipse.tahu.mqtt.MqttClientId;\nimport org.eclipse.tahu.mqtt.MqttServerName;\nimport org.eclipse.tahu.mqtt.MqttServerUrl;\n\npublic class MqttServerDefinition {\n\n\tprivate final MqttServerName mqttServerName;\n\tprivate final MqttClientId mqttClientId;\n\tprivate final MqttServerUrl mqttServerUrl;\n\tprivate final String username;\n\tprivate final String password;\n\tprivate final int keepAliveTimeout;\n\tprivate final Topic ndeathTopic;\n\n\tpublic MqttServerDefinition(MqttServerName mqttServerName, MqttClientId mqttClientId, MqttServerUrl mqttServerUrl,\n\t\t\tString username, String password, int keepAliveTimeout, Topic ndeathTopic) {\n\t\tthis.mqttServerName = mqttServerName;\n\t\tthis.mqttClientId = mqttClientId;\n\t\tthis.mqttServerUrl = mqttServerUrl;\n\t\tthis.username = username;\n\t\tthis.password = password;\n\t\tthis.keepAliveTimeout = keepAliveTimeout;\n\t\tthis.ndeathTopic = ndeathTopic;\n\t}\n\n\tpublic MqttServerName getMqttServerName() {\n\t\treturn mqttServerName;\n\t}\n\n\tpublic MqttClientId getMqttClientId() {\n\t\treturn mqttClientId;\n\t}\n\n\tpublic MqttServerUrl getMqttServerUrl() {\n\t\treturn mqttServerUrl;\n\t}\n\n\tpublic String getUsername() {\n\t\treturn username;\n\t}\n\n\tpublic String getPassword() {\n\t\treturn password;\n\t}\n\n\tpublic int getKeepAliveTimeout() {\n\t\treturn 
keepAliveTimeout;\n\t}\n\n\tpublic Topic getNdeathTopic() {\n\t\treturn ndeathTopic;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/mqtt/ClientCallback.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.mqtt;\n\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\n\n/**\n * A callback interface for usage with {@link TahuClient} instances.\n */\npublic interface ClientCallback {\n\n\tpublic void shutdown();\n\n\tpublic void messageArrived(MqttServerName mqttServerName, MqttServerUrl mqttServerUrl, MqttClientId clientId,\n\t\t\tString topic, MqttMessage message);\n\n\tpublic void connectionLost(MqttServerName mqttServerName, MqttServerUrl mqttServerUrl, MqttClientId clientId,\n\t\t\tThrowable cause);\n\n\tpublic void connectComplete(boolean reconnect, MqttServerName mqttServerName, MqttServerUrl mqttServerUrl,\n\t\t\tMqttClientId clientId);\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/mqtt/MqttClientId.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.mqtt;\n\nimport java.util.UUID;\n\nimport org.eclipse.tahu.exception.TahuErrorCode;\nimport org.eclipse.tahu.exception.TahuException;\n\n/**\n * Defines MQTT Client ID\n */\npublic class MqttClientId {\n\n\tprivate static final int MAX_CLIENT_ID_LENGTH = 23;\n\n\t/*\n\t * MQTT client ID\n\t */\n\tprivate String mqttClientId;\n\n\t/**\n\t * MqttClientId constructor\n\t * \n\t * @param mqttClientId - MQTT client ID as {@link String}\n\t * @param checkClientIdLength - check length of MQTT Client ID? 
as {@link boolean}\n\t * @throws TahuException\n\t */\n\tpublic MqttClientId(String mqttClientId, boolean checkClientIdLength) throws TahuException {\n\t\tif (mqttClientId == null) {\n\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT, \"MQTT Client ID is not set\");\n\t\t} else if (checkClientIdLength && mqttClientId.length() > MAX_CLIENT_ID_LENGTH) {\n\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\"MQTT Client ID can not exceed \" + MAX_CLIENT_ID_LENGTH + \" characters in length\");\n\t\t}\n\t\tthis.mqttClientId = mqttClientId;\n\t}\n\n\t/**\n\t * Generates an MQTT client ID for the supplied prefix string\n\t * \n\t * @param clientIdPrefix - client ID prefix as {@link String}\n\t * @return MQTT client ID as {@link String}\n\t * @throws TahuException\n\t */\n\tpublic static String generate(String clientIdPrefix) throws TahuException {\n\t\tif (clientIdPrefix != null && clientIdPrefix.length() > MAX_CLIENT_ID_LENGTH - 2) {\n\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\"MQTT Client ID prefix can not exceed \" + (MAX_CLIENT_ID_LENGTH - 2) + \" characters in length\");\n\t\t}\n\t\t// Treat a null prefix as empty so the generated ID never contains the literal \"null\"\n\t\tString prefix = clientIdPrefix != null ? clientIdPrefix : \"\";\n\t\treturn prefix + \"-\" + UUID.randomUUID().toString().substring(0,\n\t\t\t\tMAX_CLIENT_ID_LENGTH - prefix.length() - 1);\n\t}\n\n\t/**\n\t * Reports MQTT Client ID\n\t * \n\t * @return MQTT Client ID as {@link String}\n\t */\n\tpublic String getMqttClientId() {\n\t\treturn mqttClientId;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\treturn mqttClientId;\n\t}\n\n\t@Override\n\tpublic int hashCode() {\n\t\tfinal int prime = 31;\n\t\tint result = 1;\n\t\tresult = prime * result + ((mqttClientId == null) ? 0 : mqttClientId.hashCode());\n\t\treturn result;\n\t}\n\n\t@Override\n\tpublic boolean equals(Object obj) {\n\t\tif (this == obj)\n\t\t\treturn true;\n\t\tif (obj == null)\n\t\t\treturn false;\n\t\tif (getClass() != obj.getClass())\n\t\t\treturn false;\n\t\tMqttClientId other = (MqttClientId) obj;\n\t\tif (mqttClientId == null) {\n\t\t\tif (other.mqttClientId != null) {\n\t\t\t\treturn false;\n\t\t\t}\n\t\t} else if (!mqttClientId.equals(other.mqttClientId)) {\n\t\t\treturn false;\n\t\t}\n\t\treturn true;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/mqtt/MqttOperatorDefs.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.mqtt;\n\npublic class MqttOperatorDefs {\n\n\t// Convenience for use with MQTT\n\tpublic static final int QOS0 = 0;\n\tpublic static final int QOS1 = 1;\n\tpublic static final int QOS2 = 2;\n\tpublic static final boolean RETAINEDMESGTRUE = true;\n\tpublic static final boolean RETAINEDMESGFALSE = false;\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/mqtt/MqttServerName.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.mqtt;\n\npublic class MqttServerName {\n\n\tprivate String mqttServerName;\n\n\tpublic MqttServerName(String mqttServerName) {\n\t\tthis.mqttServerName = mqttServerName;\n\t}\n\n\tpublic String getMqttServerName() {\n\t\treturn mqttServerName;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\treturn mqttServerName;\n\t}\n\n\t@Override\n\tpublic int hashCode() {\n\t\tfinal int prime = 31;\n\t\tint result = 1;\n\t\tresult = prime * result + ((mqttServerName == null) ? 0 : mqttServerName.hashCode());\n\t\treturn result;\n\t}\n\n\t@Override\n\tpublic boolean equals(Object obj) {\n\t\tif (this == obj)\n\t\t\treturn true;\n\t\tif (obj == null)\n\t\t\treturn false;\n\t\tif (getClass() != obj.getClass())\n\t\t\treturn false;\n\t\tMqttServerName other = (MqttServerName) obj;\n\t\tif (mqttServerName == null) {\n\t\t\tif (other.mqttServerName != null)\n\t\t\t\treturn false;\n\t\t} else if (!mqttServerName.equals(other.mqttServerName))\n\t\t\treturn false;\n\t\treturn true;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/mqtt/MqttServerUrl.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.mqtt;\n\nimport org.eclipse.tahu.exception.TahuErrorCode;\nimport org.eclipse.tahu.exception.TahuException;\n\npublic class MqttServerUrl {\n\n\tprivate final String mqttServerUrl;\n\tprivate final String protocol;\n\tprivate final String fqdn;\n\tprivate final Integer port;\n\n\tpublic MqttServerUrl(String mqttServerUrl) throws TahuException {\n\t\tthis.mqttServerUrl = mqttServerUrl;\n\n\t\ttry {\n\t\t\tString[] fqdnParts;\n\t\t\tif (mqttServerUrl.contains(\"://\")) {\n\t\t\t\tString[] protocolParts = mqttServerUrl.split(\"://\");\n\t\t\t\tprotocol = protocolParts[0];\n\t\t\t\tfqdnParts = protocolParts[1].split(\":\");\n\t\t\t} else {\n\t\t\t\tprotocol = \"tcp\";\n\t\t\t\tfqdnParts = mqttServerUrl.split(\":\");\n\t\t\t}\n\n\t\t\tif (fqdnParts.length == 1) {\n\t\t\t\tfqdn = fqdnParts[0];\n\t\t\t\tport = 1883;\n\t\t\t} else if (fqdnParts.length == 2) {\n\t\t\t\tfqdn = fqdnParts[0];\n\t\t\t\tport = Integer.parseInt(fqdnParts[1]);\n\t\t\t} else {\n\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT, \"Invalid MQTT Server URL: \" + mqttServerUrl);\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT, \"Invalid MQTT Server URL: \" + mqttServerUrl, e);\n\t\t}\n\t}\n\n\tpublic MqttServerUrl(String protocol, String fqdn, Integer port) throws TahuException {\n\t\tif (protocol == null || fqdn == null || port == null) {\n\t\t\tthrow new 
TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\"Invalid MQTT Server URL: protocol=\" + protocol + \" FQDN=\" + fqdn + \" port=\" + port);\n\t\t} else {\n\t\t\tmqttServerUrl = protocol + \"://\" + fqdn + \":\" + port;\n\t\t\tthis.protocol = protocol;\n\t\t\tthis.fqdn = fqdn;\n\t\t\tthis.port = port;\n\t\t}\n\t}\n\n\tpublic static MqttServerUrl getMqttServerUrlSafe(String mqttServerUrl) {\n\t\ttry {\n\t\t\treturn new MqttServerUrl(mqttServerUrl);\n\t\t} catch (Exception e) {\n\t\t\treturn null;\n\t\t}\n\t}\n\n\tpublic String getMqttServerUrl() {\n\t\treturn mqttServerUrl;\n\t}\n\n\tpublic String getProtocol() {\n\t\treturn protocol;\n\t}\n\n\tpublic String getFqdn() {\n\t\treturn fqdn;\n\t}\n\n\tpublic Integer getPort() {\n\t\treturn port;\n\t}\n\n\t@Override\n\tpublic String toString() {\n\t\treturn mqttServerUrl;\n\t}\n\n\t@Override\n\tpublic int hashCode() {\n\t\tfinal int prime = 31;\n\t\tint result = 1;\n\t\tresult = prime * result + ((mqttServerUrl == null) ? 0 : mqttServerUrl.hashCode());\n\t\treturn result;\n\t}\n\n\t@Override\n\tpublic boolean equals(Object obj) {\n\t\tif (this == obj)\n\t\t\treturn true;\n\t\tif (obj == null)\n\t\t\treturn false;\n\t\tif (getClass() != obj.getClass())\n\t\t\treturn false;\n\t\tMqttServerUrl other = (MqttServerUrl) obj;\n\t\tif (mqttServerUrl == null) {\n\t\t\tif (other.mqttServerUrl != null)\n\t\t\t\treturn false;\n\t\t} else if (!mqttServerUrl.equals(other.mqttServerUrl))\n\t\t\treturn false;\n\t\treturn true;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/mqtt/RandomStartupDelay.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.mqtt;\n\nimport org.eclipse.tahu.exception.TahuErrorCode;\nimport org.eclipse.tahu.exception.TahuException;\n\npublic class RandomStartupDelay {\n\n\tpublic static final String ERROR_MESSAGE =\n\t\t\t\"Random Startup Delay must be of the form 'min-max' where min is the low end of the range and max is the high end of the range in milliseconds\";\n\n\tprivate final String randomStartupDelayString;\n\tprivate final long low;\n\tprivate final long high;\n\n\tpublic RandomStartupDelay(String randomStartupDelayString) throws TahuException {\n\t\tif (randomStartupDelayString != null && !randomStartupDelayString.trim().isEmpty()) {\n\t\t\tString[] pair = randomStartupDelayString.split(\"-\");\n\t\t\tif (pair.length == 2) {\n\t\t\t\ttry {\n\t\t\t\t\tlow = Long.parseLong(pair[0].trim());\n\t\t\t\t\thigh = Long.parseLong(pair[1].trim());\n\t\t\t\t\tif (low < 0 || high < 0 || high < low) {\n\t\t\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT, ERROR_MESSAGE);\n\t\t\t\t\t} else {\n\t\t\t\t\t\tthis.randomStartupDelayString = randomStartupDelayString;\n\t\t\t\t\t}\n\t\t\t\t} catch (Exception e) {\n\t\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT, ERROR_MESSAGE);\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT, ERROR_MESSAGE);\n\t\t\t}\n\t\t} else {\n\t\t\tthis.randomStartupDelayString = null;\n\t\t\tlow = -1L;\n\t\t\thigh = 
-1L;\n\t\t}\n\t}\n\n\tpublic String getRandomStartupDelayString() {\n\t\treturn randomStartupDelayString;\n\t}\n\n\tpublic long getLow() {\n\t\treturn low;\n\t}\n\n\tpublic long getHigh() {\n\t\treturn high;\n\t}\n\n\tpublic boolean isValid() {\n\t\treturn low >= 0 && high >= low;\n\t}\n\n\tpublic long getRandomDelay() {\n\t\tif (randomStartupDelayString != null) {\n\t\t\treturn low + (long) (Math.random() * (high - low));\n\t\t} else {\n\t\t\treturn 0L;\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/mqtt/TahuClient.java",
"content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.mqtt;\n\nimport java.util.ArrayList;\nimport java.util.Arrays;\nimport java.util.Collections;\nimport java.util.Date;\nimport java.util.Map;\nimport java.util.SortedMap;\nimport java.util.TreeMap;\n\nimport org.eclipse.paho.client.mqttv3.IMqttActionListener;\nimport org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;\nimport org.eclipse.paho.client.mqttv3.IMqttToken;\nimport org.eclipse.paho.client.mqttv3.MqttAsyncClient;\nimport org.eclipse.paho.client.mqttv3.MqttCallback;\nimport org.eclipse.paho.client.mqttv3.MqttCallbackExtended;\nimport org.eclipse.paho.client.mqttv3.MqttConnectOptions;\nimport org.eclipse.paho.client.mqttv3.MqttException;\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.paho.client.mqttv3.MqttSecurityException;\nimport org.eclipse.paho.client.mqttv3.internal.NetworkModuleService;\nimport org.eclipse.tahu.exception.TahuErrorCode;\nimport org.eclipse.tahu.exception.TahuException;\nimport org.eclipse.tahu.message.model.StatePayload;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\nimport com.fasterxml.jackson.databind.ObjectMapper;\n\n/**\n * A custom MQTT client.\n */\npublic class TahuClient implements MqttCallbackExtended {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(TahuClient.class.getName());\n\n\tprivate static final long DEFAULT_CONNECT_RETRY_INTERVAL = 1000;\n\tprivate static final long DEFAULT_CONNECT_MONITOR_INTERVAL = 
10000;\n\tprivate static final long DEFAULT_CONNECT_ATTEMPT_TIMEOUT = 30000;\n\n\tprivate Thread connectRunnableThread;\n\tprivate ConnectRunnable connectRunnable;\n\tprivate long connectRetryInterval;\n\tprivate long connectAttemptTimeout;\n\n\t/*\n\t * Tracks the state of the connection attempts.\n\t */\n\tprivate ConnectingState state = new ConnectingState();\n\n\t/*\n\t * birth/death properties\n\t */\n\tprivate boolean useSparkplugStatePayload;\n\tprivate Long lastStateDeathPayloadTimestamp;\n\tprivate String birthTopic;\n\tprivate byte[] birthPayload;\n\tprivate boolean birthRetain;\n\tprivate String lwtTopic;\n\tprivate byte[] lwtPayload;\n\tprivate int lwtQoS;\n\tprivate boolean lwtRetain;\n\tprivate IMqttDeliveryToken lwtDeliveryToken;\n\tprivate Object lwtDeliveryLock = new Object();\n\n\t/*\n\t * The Asynchronous MQTT Client and MQTTConnectOptions\n\t */\n\tprivate MqttAsyncClient client = null;\n\tMqttConnectOptions connectOptions = null;\n\n\t/*\n\t * Other standard MQTT parameters.\n\t */\n\tprivate MqttServerUrl mqttServerUrl;\n\tprivate MqttServerName mqttServerName;\n\tprivate final MqttClientId clientId;\n\tprivate String username;\n\tprivate String password;\n\tprivate boolean cleanSession;\n\tprivate int keepAlive;\n\n\t/*\n\t * The callback client\n\t */\n\tprivate ClientCallback callback;\n\n\t/**\n\t * A list of topics the client has subscribed on\n\t */\n\tprivate SortedMap<String, Integer> subscriptions = new TreeMap<>();\n\n\t/*\n\t * Odds/ends\n\t */\n\tprivate boolean autoReconnect;\n\tprivate RandomStartupDelay randomStartupDelay;\n\n\t/*\n\t * The maximum number of in-flight (pending) messages for the client to store. 
If this maximum is exceeded, publishes will\n\t * fail with an INTERNAL_ERROR: Caused by: org.eclipse.paho.client.mqttv3.MqttException: Too many publishes in\n\t * progress\n\t */\n\tprivate int maxInFlightMessages = 10;\n\n\t/*\n\t * The maximum number of topics per individual subscribe message.\n\t */\n\tprivate int maxTopicsPerSubscribe = 256;\n\n\tprivate Date connectTime;\n\tprivate Date disconnectTime;\n\tprivate Date onlineDate;\n\tprivate Date offlineDate;\n\tprivate double totalUptime;\n\tprivate double totalDowntime;\n\tprivate int connectionCount = 0; // # of Edge Nodes connected to this MQTT Client's Broker\n\tprivate boolean doLatencyCheck = false;\n\tprivate long numMesgsArrived = 0;\n\tprivate long lastNumMesgsArrived = 0;\n\n\tprivate boolean disconnectInProgress = false;\n\n\tprivate Object clientLock = new Object();\n\tprivate ConnectionMonitorThread connectionMonitorThread;\n\n\tprivate boolean trackFirstConnection = false;\n\tprivate boolean firstConnection = true;\n\tprivate boolean resubscribed = false;\n\n\tpublic TahuClient(final MqttClientId clientId, final MqttServerName mqttServerName,\n\t\t\tfinal MqttServerUrl mqttServerUrl, final String username, final String password, boolean cleanSession,\n\t\t\tint keepAlive, ClientCallback callback, RandomStartupDelay randomStartupDelay) {\n\t\tthis.mqttServerUrl = mqttServerUrl;\n\t\tthis.mqttServerName = mqttServerName;\n\t\tthis.clientId = clientId;\n\t\tthis.username = username;\n\t\tthis.password = password;\n\t\tthis.cleanSession = cleanSession;\n\t\tthis.keepAlive = keepAlive;\n\t\tthis.callback = callback;\n\t\tthis.randomStartupDelay = randomStartupDelay;\n\t\tthis.lwtRetain = false;\n\t\tthis.birthRetain = false;\n\t\tthis.autoReconnect = true;\n\t\tthis.setConnectRetryInterval(DEFAULT_CONNECT_RETRY_INTERVAL);\n\t\tthis.setConnectAttemptTimeout(DEFAULT_CONNECT_ATTEMPT_TIMEOUT);\n\t\tthis.renewDisconnectTime();\n\t\tthis.renewOnlineDate();\n\t\tthis.renewOfflineDate();\n\t}\n\n\tpublic TahuClient(final MqttClientId clientId, final MqttServerName mqttServerName,\n\t\t\tfinal MqttServerUrl mqttServerUrl, String username, String password, boolean cleanSession, int keepAlive,\n\t\t\tClientCallback callback, RandomStartupDelay randomStartupDelay, boolean useSparkplugStatePayload,\n\t\t\tString birthTopic, byte[] birthPayload, String lwtTopic, byte[] lwtPayload, int lwtQoS) {\n\t\tthis(clientId, mqttServerName, mqttServerUrl, username, password, cleanSession, keepAlive, callback,\n\t\t\t\trandomStartupDelay);\n\t\tthis.setLifecycleProps(useSparkplugStatePayload, birthTopic, birthPayload, false, lwtTopic, lwtPayload, lwtQoS,\n\t\t\t\tfalse);\n\t}\n\n\tpublic TahuClient(final MqttClientId clientId, final MqttServerName mqttServerName,\n\t\t\tfinal MqttServerUrl mqttServerUrl, String username, String password, boolean cleanSession, int keepAlive,\n\t\t\tClientCallback callback, RandomStartupDelay randomStartupDelay, boolean useSparkplugStatePayload,\n\t\t\tString birthTopic, byte[] birthPayload, boolean birthRetain, String lwtTopic, byte[] lwtPayload, int lwtQoS,\n\t\t\tboolean lwtRetain) {\n\t\tthis(clientId, mqttServerName, mqttServerUrl, username, password, cleanSession, keepAlive, callback,\n\t\t\t\trandomStartupDelay);\n\t\tthis.setLifecycleProps(useSparkplugStatePayload, birthTopic, birthPayload, birthRetain, lwtTopic, lwtPayload,\n\t\t\t\tlwtQoS, lwtRetain);\n\t}\n\n\t/**\n\t * Sets the properties relating to client life cycle events such as LWT and Birth topics and payloads.\n\t * \n\t * @param useSparkplugStatePayload whether the birth and death certificates use Sparkplug {@link StatePayload} objects\n\t * @param birthTopic the topic to publish birth certificates on\n\t * @param birthPayload the payload of a birth certificate\n\t * @param birthRetain whether to retain birth certificate messages\n\t * @param lwtTopic the topic to publish LWT on\n\t * @param lwtPayload the payload of an LWT\n\t * @param lwtQoS the quality of service to publish the LWT with\n\t * @param lwtRetain whether to retain LWT messages\n\t */\n\tprivate void setLifecycleProps(boolean useSparkplugStatePayload, String birthTopic, byte[] 
birthPayload,\n\t\t\tboolean birthRetain, String lwtTopic, byte[] lwtPayload, int lwtQoS, boolean lwtRetain) {\n\t\tthis.useSparkplugStatePayload = useSparkplugStatePayload;\n\t\tthis.birthTopic = birthTopic;\n\t\tthis.birthPayload = birthPayload;\n\t\tthis.birthRetain = birthRetain;\n\t\tthis.lwtTopic = lwtTopic;\n\t\tthis.lwtPayload = lwtPayload;\n\t\tthis.lwtQoS = lwtQoS;\n\t\tthis.lwtRetain = lwtRetain;\n\n\t}\n\n\tprotected MqttConnectOptions getMqttConnectOptions() {\n\t\treturn connectOptions;\n\t}\n\n\tprotected void setMqttConnectOptions(MqttConnectOptions connectOptions) {\n\t\tthis.connectOptions = connectOptions;\n\t}\n\n\tpublic long getNumMesgsArrived() {\n\t\treturn numMesgsArrived;\n\t}\n\n\tpublic long getMesgsArrivedDelta() {\n\t\t// Returns the number of messages arrived since last called.\n\t\tlong delta = numMesgsArrived - lastNumMesgsArrived;\n\t\tlastNumMesgsArrived = numMesgsArrived;\n\t\treturn delta;\n\t}\n\n\tpublic void clearMesgArrivedCount() {\n\t\tnumMesgsArrived = 0;\n\t\tlastNumMesgsArrived = 0;\n\t}\n\n\tpublic void setMaxInflightMessages(int max) {\n\t\tthis.maxInFlightMessages = max;\n\t}\n\n\tpublic int getMaxInflightMessages() {\n\t\treturn this.maxInFlightMessages;\n\t}\n\n\tpublic void setDoLatencyCheck(boolean state) {\n\t\tdoLatencyCheck = state;\n\t}\n\n\tpublic boolean getDoLatencyCheck() {\n\t\treturn doLatencyCheck;\n\t}\n\n\tpublic void clearConnectionCount() {\n\t\tconnectionCount = 0;\n\t}\n\n\tpublic void incrementConnectionCount() {\n\t\tconnectionCount++;\n\t}\n\n\tpublic int getConnectionCount() {\n\t\treturn connectionCount;\n\t}\n\n\tpublic MqttServerUrl getMqttServerUrl() {\n\t\treturn mqttServerUrl;\n\t}\n\n\tpublic MqttServerName getMqttServerName() {\n\t\treturn mqttServerName;\n\t}\n\n\tpublic MqttClientId getClientId() {\n\t\treturn clientId;\n\t}\n\n\tpublic String getUsername() {\n\t\treturn username;\n\t}\n\n\tpublic void setUsername(String username) {\n\t\tthis.username = username;\n\t}\n\n\tpublic 
String getPassword() {\n\t\treturn password;\n\t}\n\n\tpublic void setPassword(String password) {\n\t\tthis.password = password;\n\t}\n\n\tpublic int getKeepAlive() {\n\t\treturn keepAlive;\n\t}\n\n\tpublic boolean isCleanSession() {\n\t\treturn cleanSession;\n\t}\n\n\tpublic Map<String, Integer> getSubscriptions() {\n\t\treturn Collections.unmodifiableMap(subscriptions);\n\t}\n\n\tpublic int getMaxTopicsPerSubscribe() {\n\t\treturn maxTopicsPerSubscribe;\n\t}\n\n\tpublic void setMaxTopicsPerSubscribe(int maxTopicsPerSubscribe) {\n\t\tthis.maxTopicsPerSubscribe = maxTopicsPerSubscribe;\n\t}\n\n\tpublic ClientCallback getCallback() {\n\t\t// If callback is null, return a no-op implementation\n\t\treturn this.callback != null ? this.callback : new ClientCallback() {\n\t\t\t@Override\n\t\t\tpublic void shutdown() {\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\t@Override\n\t\t\tpublic void messageArrived(MqttServerName mqttServerName, MqttServerUrl mqttServerUrl,\n\t\t\t\t\tMqttClientId clientId, String topic, MqttMessage message) {\n\t\t\t}\n\n\t\t\t@Override\n\t\t\tpublic void connectionLost(MqttServerName mqttServerName, MqttServerUrl mqttServerUrl,\n\t\t\t\t\tMqttClientId clientId, Throwable cause) {\n\t\t\t}\n\n\t\t\t@Override\n\t\t\tpublic void connectComplete(boolean reconnect, MqttServerName mqttServerName, MqttServerUrl mqttServerUrl,\n\t\t\t\t\tMqttClientId clientId) {\n\t\t\t}\n\t\t};\n\t}\n\n\tpublic void setAutoReconnect(boolean autoReconnect) {\n\t\tthis.autoReconnect = autoReconnect;\n\t}\n\n\tpublic boolean getAutoReconnect() {\n\t\treturn autoReconnect;\n\t}\n\n\tpublic String getLwtTopic() {\n\t\treturn lwtTopic;\n\t}\n\n\tpublic void setLwtRetain(boolean retain) {\n\t\tthis.lwtRetain = retain;\n\t}\n\n\tpublic boolean getLwtRetain() {\n\t\treturn lwtRetain;\n\t}\n\n\tpublic Long getLastStateDeathPayloadTimestamp() {\n\t\treturn lastStateDeathPayloadTimestamp;\n\t}\n\n\tpublic boolean isConnected() {\n\t\tif (client != null) {\n\t\t\treturn client.isConnected();\n\t\t} else {\n\t\t\treturn false;\n\t\t}\n\t}\n\n\tpublic boolean isConnectedAndResubscribed() {\n\t\tif (client != null) {\n\t\t\treturn client.isConnected() && resubscribed;\n\t\t} else {\n\t\t\treturn false;\n\t\t}\n\t}\n\n\tpublic long getConnectDuration() throws TahuException {\n\t\tif (getConnectTime() != null) {\n\t\t\tDate now = new Date();\n\t\t\treturn now.getTime() - getConnectTime().getTime();\n\t\t} else if (getDisconnectTime() != null) {\n\t\t\tDate now = new Date();\n\t\t\treturn -(now.getTime() - getDisconnectTime().getTime());\n\t\t} else {\n\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR, \"Connect time is unknown\");\n\t\t}\n\t}\n\n\t/**\n\t * Returns the availability as a percentage, calculated as uptime / (uptime + downtime) * 100.\n\t * \n\t * @return a double representing the percentage of availability\n\t * @throws TahuException\n\t */\n\tpublic double getAvailability() throws TahuException {\n\t\tif (getConnectTime() != null) {\n\t\t\tDate now = new Date();\n\t\t\ttotalUptime = totalUptime + (now.getTime() - getConnectTime().getTime());\n\t\t}\n\t\tif (getDisconnectTime() != null) {\n\t\t\tDate now = new Date();\n\t\t\ttotalDowntime = totalDowntime + (now.getTime() - getDisconnectTime().getTime());\n\t\t}\n\n\t\tif (totalUptime + totalDowntime == 0) {\n\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR, \"Connect time is unknown\");\n\t\t}\n\n\t\treturn (totalUptime / (totalUptime + totalDowntime)) * 100.0;\n\t}\n\n\tpublic void resetAvailability() {\n\t\ttotalUptime = 0;\n\t\ttotalDowntime = 0;\n\t}\n\n\t/**\n\t * Returns a {@link Date} instance representing the online date.\n\t * \n\t * @return the online date.\n\t */\n\tpublic Date getOnlineDateTime() {\n\t\treturn this.onlineDate;\n\t}\n\n\t/**\n\t * Renews the online date.\n\t */\n\tpublic void renewOnlineDate() {\n\t\tthis.onlineDate = new Date();\n\t}\n\n\t/**\n\t * Returns a {@link Date} instance representing the offline date.\n\t * \n\t * @return 
the offline date.\n\t */\n\tpublic Date getOfflineDateTime() {\n\t\treturn this.offlineDate;\n\t}\n\n\t/**\n\t * Renews the offline date.\n\t */\n\tpublic void renewOfflineDate() {\n\t\tthis.offlineDate = new Date();\n\t}\n\n\tpublic IMqttDeliveryToken publish(String topic, byte[] payload, int qos, boolean retained) throws TahuException {\n\t\ttry {\n\t\t\tif (client == null) {\n\t\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR,\n\t\t\t\t\t\t\"MQTT client: \" + clientId.getMqttClientId() + \" is null\");\n\t\t\t} else if (client.isConnected()) {\n\t\t\t\tlogger.debug(\"{}: Publishing on Topic {}, Payload Size = {}\", getClientId(), topic, payload.length);\n\t\t\t\treturn client.publish(topic, payload, qos, retained);\n\t\t\t} else {\n\t\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR,\n\t\t\t\t\t\t\"MQTT client: \" + clientId.getMqttClientId() + \" is not connected\");\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR, e);\n\t\t}\n\t}\n\n\tpublic void asyncPublish(String topic, byte[] payload, int qos, boolean retained) throws TahuException {\n\t\tThread t = new Thread(new AsyncPublisher(topic, payload, qos, retained, false, 0, 0));\n\t\tt.start();\n\t}\n\n\tpublic void asyncPublish(String topic, byte[] payload, int qos, boolean retained, boolean retry, long retryDelay,\n\t\t\tint numAttempts) throws TahuException {\n\t\tThread t = new Thread(new AsyncPublisher(topic, payload, qos, retained, retry, retryDelay, numAttempts));\n\t\tt.start();\n\t}\n\n\t/**\n\t * Subscribes to a topic.\n\t * \n\t * @param topic the topic.\n\t * @param qos the quality of service (0, 1, or 2)\n\t * \n\t * @return the granted QoS for the subscription\n\t * @throws TahuException\n\t */\n\tpublic int subscribe(String topic, int qos) throws TahuException {\n\t\tsynchronized (clientLock) {\n\t\t\tif (client != null) {\n\t\t\t\tif (client.isConnected()) {\n\t\t\t\t\ttry {\n\t\t\t\t\t\tlogger.debug(\"{}: server {} - 
Attempting to subscribe on topic {} with QoS={}\", getClientId(),\n\t\t\t\t\t\t\t\tgetMqttServerName(), topic, qos);\n\t\t\t\t\t\tIMqttToken token = client.subscribe(topic, qos);\n\t\t\t\t\t\tlogger.trace(\"{}: Waiting for subscription on {}\", getClientId(), topic);\n\t\t\t\t\t\ttoken.waitForCompletion();\n\t\t\t\t\t\tlogger.trace(\"{}: Done waiting for subscription on {}\", getClientId(), topic);\n\t\t\t\t\t\tsubscriptions.put(topic, qos);\n\t\t\t\t\t\tint[] grantedQos = token.getGrantedQos();\n\t\t\t\t\t\tif (grantedQos != null && grantedQos.length == 1) {\n\t\t\t\t\t\t\tlogger.debug(\"{}: Granted QoS for subscription on {}: {}\", getClientId(), topic, grantedQos[0]);\n\t\t\t\t\t\t\treturn grantedQos[0];\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tString errorMessage = getClientId() + \": server \" + getMqttServerName()\n\t\t\t\t\t\t\t\t\t+ \" - Failed to subscribe to \" + topic;\n\t\t\t\t\t\t\tlogger.error(errorMessage);\n\t\t\t\t\t\t\tthrow new TahuException(TahuErrorCode.NOT_AUTHORIZED, errorMessage);\n\t\t\t\t\t\t}\n\t\t\t\t\t} catch (MqttException e) {\n\t\t\t\t\t\tlogger.error(getClientId() + \": server \" + getMqttServerName() + \" - Failed to subscribe to \"\n\t\t\t\t\t\t\t\t+ topic);\n\t\t\t\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR, e);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t\tlogger.debug(\"{}: Not connected and not subscribing to {} - just storing the subscription for now\",\n\t\t\t\t\tgetClientId(), topic);\n\t\t\tsubscriptions.put(topic, qos);\n\t\t\treturn qos;\n\t\t}\n\t}\n\n\t/**\n\t * Subscribes to a set of topics.\n\t * \n\t * @param topics the topics.\n\t * @param qos the quality of service (0, 1, or 2)\n\t * \n\t * @return the granted QoS levels for the subscriptions\n\t * @throws TahuException\n\t */\n\tpublic int[] subscribe(String[] topics, int[] qos) throws TahuException {\n\t\tsynchronized (clientLock) {\n\t\t\ttry {\n\t\t\t\tif (client != null) {\n\t\t\t\t\tif (client.isConnected()) {\n\t\t\t\t\t\tlogger.debug(\"{}: Attempting to subscribe on topics {} with QoS={}\", getClientId(), topics,\n\t\t\t\t\t\t\t\tqos);\n\t\t\t\t\t\tIMqttToken token = client.subscribe(topics, qos);\n\t\t\t\t\t\tlogger.trace(\"{}: Waiting for subscription on {}\", getClientId(), Arrays.toString(topics));\n\t\t\t\t\t\ttoken.waitForCompletion();\n\t\t\t\t\t\tlogger.trace(\"{}: Done waiting for subscription on {}\", getClientId(), Arrays.toString(topics));\n\t\t\t\t\t\tint[] grantedQos = token.getGrantedQos();\n\t\t\t\t\t\tif (grantedQos != null && grantedQos.length == topics.length) {\n\t\t\t\t\t\t\tfor (int i = 0; i < topics.length; i++) {\n\t\t\t\t\t\t\t\tif (grantedQos[i] == qos[i]) {\n\t\t\t\t\t\t\t\t\tsubscriptions.put(topics[i], qos[i]);\n\t\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\t\tthrow new TahuException(TahuErrorCode.NOT_AUTHORIZED,\n\t\t\t\t\t\t\t\t\t\t\t\"Failed to subscribe to \" + topics[i]);\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\treturn grantedQos;\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tthrow new TahuException(TahuErrorCode.NOT_AUTHORIZED,\n\t\t\t\t\t\t\t\t\t\"Failed to subscribe to \" + Arrays.toString(topics));\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t}\n\n\t\t\t\tfor (int i = 0; i < topics.length; i++) {\n\t\t\t\t\tsubscriptions.put(topics[i], qos[i]);\n\t\t\t\t}\n\t\t\t\tlogger.debug(\"{}: Not connected and not subscribing to {} - just storing the subscription for now\",\n\t\t\t\t\t\tgetClientId(), Arrays.asList(topics));\n\t\t\t\treturn qos;\n\t\t\t} catch (Exception e) {\n\t\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR, e);\n\t\t\t}\n\t\t}\n\t}\n\n\t/**\n\t * Unsubscribes from a topic.\n\t * \n\t * @param topic the topic.\n\t * @throws TahuException\n\t */\n\tpublic void unsubscribe(String topic) throws TahuException {\n\t\tsynchronized (clientLock) {\n\t\t\tif (client != null) {\n\t\t\t\tif (client.isConnected()) {\n\t\t\t\t\ttry {\n\t\t\t\t\t\tlogger.debug(\"{}: {} attempting to unsubscribe on topic {}\", getClientId(), mqttServerName,\n\t\t\t\t\t\t\t\ttopic);\n\t\t\t\t\t\tclient.unsubscribe(topic);\n\t\t\t\t\t} catch 
(MqttException e) {\n\t\t\t\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR, e);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t\tsubscriptions.remove(topic);\n\t\t}\n\t}\n\n\t@Override\n\tpublic void connectionLost(Throwable cause) {\n\t\tlogger.debug(\"{}: MQTT connectionLost() to {} :: {}\", getClientId(), getMqttServerName(), getMqttServerUrl());\n\t\tif (logger.isTraceEnabled()) {\n\t\t\tif (client != null) {\n\t\t\t\tclient.getDebug().dumpClientDebug();\n\t\t\t}\n\t\t}\n\n\t\t// reset the timers if needed\n\t\tif (getDisconnectTime() == null) {\n\t\t\tthis.clearConnectTime();\n\t\t\tthis.renewDisconnectTime();\n\t\t\tthis.renewOfflineDate();\n\t\t}\n\n\t\t// Reset re-subscribed flag\n\t\tresubscribed = false;\n\n\t\tif (cause != null) {\n\t\t\t// We don't need to see all of the connection lost callbacks for clients\n\t\t\tlogger.debug(\"{}: Connection lost due to {}\", getClientId(), cause.getMessage(), cause);\n\t\t}\n\n\t\t// Trigger the connection lost event on the callback client\n\t\tgetCallback().connectionLost(getMqttServerName(), getMqttServerUrl(), getClientId(), cause);\n\t}\n\n\t@Override\n\tpublic void deliveryComplete(IMqttDeliveryToken token) {\n\t\tsynchronized (lwtDeliveryLock) {\n\t\t\tif (lwtDeliveryToken != null && lwtDeliveryToken.getMessageId() == token.getMessageId()) {\n\t\t\t\tlogger.info(\"{}: LWT Delivery complete for {}\", getClientId(), token.getMessageId());\n\t\t\t\tlwtDeliveryToken = null;\n\t\t\t} else {\n\t\t\t\tlogger.debug(\"{}: Delivery complete for {}\", getClientId(), token.getMessageId());\n\t\t\t}\n\t\t}\n\t}\n\n\t@Override\n\tpublic void messageArrived(String topic, MqttMessage mqttMessage) throws Exception {\n\t\tlogger.debug(\"{}: MQTT message arrived on topic {}\", getClientId(), topic);\n\t\tnumMesgsArrived++;\n\t\tgetCallback().messageArrived(getMqttServerName(), getMqttServerUrl(), getClientId(), topic, mqttMessage);\n\t}\n\n\t/**\n\t * Attempt to connect the TahuClient\n\t */\n\tpublic void connect() {\n\t\ttry 
{\n\t\t\tNetworkModuleService.validateURI(mqttServerUrl.getMqttServerUrl());\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"{}: Invalid MQTT Server URL: {}\", getClientId(), mqttServerUrl.getMqttServerUrl());\n\t\t\treturn;\n\t\t}\n\n\t\tlogger.debug(\"{}: Starting new connect, autoReconnect: {}\", getClientId(), autoReconnect);\n\t\tsynchronized (clientLock) {\n\t\t\tlogger.debug(\"{}: Got lock for new connect\", getClientId());\n\t\t\ttry {\n\t\t\t\t// reset the timers if needed\n\t\t\t\tif (getDisconnectTime() == null) {\n\t\t\t\t\tthis.clearConnectTime();\n\t\t\t\t\tthis.renewDisconnectTime();\n\t\t\t\t}\n\n\t\t\t\tif (getAutoReconnect() && state.inProgress()) {\n\t\t\t\t\tlogger.debug(\"{}: Connect attempt already in progress\", getClientId());\n\t\t\t\t\treturn;\n\t\t\t\t} else {\n\t\t\t\t\tdisconnect(0, 0, false, true);\n\t\t\t\t\tstate.setInProgress(true);\n\t\t\t\t\tlogger.debug(\"{}: Starting ConnectThread\", getClientId());\n\t\t\t\t\tconnectRunnable = new ConnectRunnable(this);\n\t\t\t\t\tconnectRunnableThread = new Thread(connectRunnable);\n\t\t\t\t\tconnectRunnableThread.start();\n\t\t\t\t}\n\t\t\t} catch (Throwable t) {\n\t\t\t\tlogger.error(\"{}: Error connecting\", getClientId(), t);\n\t\t\t}\n\t\t}\n\t}\n\n\tpublic boolean isDisconnectInProgress() {\n\t\treturn disconnectInProgress;\n\t}\n\n\t/**\n\t * Attempt to disconnect the TahuClient.\n\t * \n\t * @param disconnectQuieseTime the quiesce time in milliseconds to allow in-flight work to complete\n\t * @param disconnectTimeout the timeout in milliseconds for the forcible disconnect\n\t * @param sendDisconnect true if an MQTT DISCONNECT packet should be sent to the server\n\t * @param waitForLwt true if this call should wait for delivery of the LWT message to complete\n\t */\n\tpublic void disconnect(long disconnectQuieseTime, long disconnectTimeout, boolean sendDisconnect,\n\t\t\tboolean waitForLwt) throws TahuException {\n\t\tthis.disconnect(disconnectQuieseTime, disconnectTimeout, sendDisconnect, true, waitForLwt);\n\t}\n\n\t/**\n\t * Attempt to disconnect the TahuClient.\n\t * \n\t * @param disconnectQuieseTime the quiesce time in milliseconds to allow in-flight work to complete\n\t * @param disconnectTimeout the timeout in milliseconds for the forcible disconnect\n\t * @param sendDisconnect true if an MQTT DISCONNECT packet should be sent to the server\n\t * @param publishLwt true if the LWT message should be published before disconnecting\n\t * @param waitForLwt true if this call should wait for delivery of the LWT message to complete\n\t */\n\tpublic void disconnect(long disconnectQuieseTime, long disconnectTimeout, boolean sendDisconnect,\n\t\t\tboolean publishLwt, 
boolean waitForLwt) throws TahuException {\n\t\tsynchronized (clientLock) {\n\t\t\tdisconnectInProgress = true;\n\n\t\t\ttry {\n\t\t\t\tshutdownConnectionMonitorThread();\n\t\t\t} catch (Exception e) {\n\t\t\t\tlogger.error(\"{}: Failed to shutdown connection monitor thread\", getClientId());\n\t\t\t}\n\n\t\t\ttry {\n\t\t\t\tif (connectRunnable != null && connectRunnableThread != null) {\n\t\t\t\t\tconnectRunnable.stopConnectAttempts();\n\t\t\t\t\tconnectRunnableThread.interrupt();\n\t\t\t\t}\n\t\t\t} catch (Exception e) {\n\t\t\t\tlogger.error(\"{}: Failed to shut down the connect runnable\", getClientId());\n\t\t\t}\n\n\t\t\tif (client != null) {\n\t\t\t\ttry {\n\t\t\t\t\tboolean clientConnected = client.isConnected();\n\t\t\t\t\tboolean lwtDeliveryComplete = false;\n\t\t\t\t\tif (publishLwt && lwtTopic != null && clientConnected) {\n\t\t\t\t\t\tlogger.info(\"{}: Publishing LWT on {} with qos={} and retain={}\", getClientId(), lwtTopic,\n\t\t\t\t\t\t\t\tlwtQoS, lwtRetain);\n\t\t\t\t\t\tsynchronized (lwtDeliveryLock) {\n\t\t\t\t\t\t\t/* \n\t\t\t\t\t\t\t * Synchronization with the deliveryComplete() callback is needed to ensure that\n\t\t\t\t\t\t\t * the publish() call is fully completed and the lwtDeliveryToken is set before\n\t\t\t\t\t\t\t * it is being nullified in the Paho callback.\n\t\t\t\t\t\t\t*/\n\t\t\t\t\t\t\tif (useSparkplugStatePayload) {\n\t\t\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\t\t\tObjectMapper mapper = new ObjectMapper();\n\t\t\t\t\t\t\t\t\tStatePayload statePayload = new StatePayload(false, new Date().getTime());\n\t\t\t\t\t\t\t\t\tbyte[] payload = mapper.writeValueAsString(statePayload).getBytes();\n\t\t\t\t\t\t\t\t\tlwtDeliveryToken = publish(lwtTopic, payload, lwtQoS, lwtRetain);\n\t\t\t\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\t\t\t\tlogger.error(\"{}: Failed to publish the LWT message on {}\", getClientId(), lwtTopic,\n\t\t\t\t\t\t\t\t\t\t\te);\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\tlwtDeliveryToken = publish(lwtTopic, 
lwtPayload, lwtQoS, lwtRetain);\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\tlogger.debug(\"{}: published on LWT Topic={}, messageId={}\", getClientId(), lwtTopic,\n\t\t\t\t\t\t\t\t\tlwtDeliveryToken.getMessageId());\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tif (waitForLwt) {\n\t\t\t\t\t\t\tlwtDeliveryComplete = isLwtDeliveryComplete();\n\t\t\t\t\t\t\tlogger.trace(\"{}: Completed LWT Delivery? {}\", getClientId(), lwtDeliveryComplete);\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tlogger.trace(\"{}: Not waiting for LWT\", getClientId());\n\t\t\t\t\t\t}\n\t\t\t\t\t} else {\n\t\t\t\t\t\tlogger.debug(\"{}: Not publishing LWT, client connected state: {}\", getClientId(),\n\t\t\t\t\t\t\t\tclientConnected);\n\t\t\t\t\t}\n\n\t\t\t\t\tlogger.debug(\"{}: Disconnecting...\", getClientId());\n\t\t\t\t\tclient.disconnectForcibly(disconnectQuieseTime, disconnectTimeout, sendDisconnect);\n\t\t\t\t\tlogger.debug(\"{}: Done disconnecting\", getClientId());\n\t\t\t\t\tclient.close();\n\t\t\t\t\tlogger.debug(\"{}: Client closed\", getClientId());\n\t\t\t\t} catch (MqttException e) {\n\t\t\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR, e);\n\t\t\t\t} finally {\n\t\t\t\t\tclient = null;\n\t\t\t\t\tstate.setInProgress(false);\n\t\t\t\t\tdisconnectInProgress = false;\n\t\t\t\t\tlwtDeliveryToken = null;\n\t\t\t\t\t// Reset re-subscribed flag\n\t\t\t\t\tresubscribed = false;\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tlogger.debug(\"{}: Disconnect: Client is already null\", getClientId());\n\t\t\t}\n\n\t\t\t// reset the timers if needed\n\t\t\tif (getDisconnectTime() == null) {\n\t\t\t\tthis.clearConnectTime();\n\t\t\t\tthis.renewDisconnectTime();\n\t\t\t\tthis.renewOfflineDate();\n\t\t\t}\n\n\t\t\tdisconnectInProgress = false;\n\t\t}\n\t}\n\n\t/*\n\t * Attempt to connect.\n\t */\n\tprivate IMqttToken attemptConnect(MqttAsyncClient client, MqttConnectOptions options, String ctx)\n\t\t\tthrows MqttSecurityException, MqttException {\n\t\tsynchronized (clientLock) {\n\t\t\tif (isConnected()) {\n\t\t\t\tlogger.trace(\"{} 
is already connected - not trying again\", getClientId());\n\t\t\t\treturn null;\n\t\t\t}\n\t\t\tif (randomStartupDelay != null && randomStartupDelay.isValid()) {\n\t\t\t\tlong randomDelay = randomStartupDelay.getRandomDelay();\n\t\t\t\tlogger.debug(\"{}: Waiting random delay of {} ms before reconnect attempt\", getClientId(), randomDelay);\n\t\t\t\ttry {\n\t\t\t\t\tThread.sleep(randomDelay);\n\t\t\t\t} catch (InterruptedException e) {\n\t\t\t\t\tlogger.warn(\"{}: Sleep interrupted\", getClientId(), e);\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tlogger.debug(\"{}: Attempting {} to {}\", getClientId(), ctx, getMqttServerUrl());\n\t\t\tlogger.trace(\"{}: Thread {} :: {}\", getClientId(), Thread.currentThread().getName(),\n\t\t\t\t\tThread.currentThread().getId());\n\n\t\t\t// Make the call to connect (this is asynchronous)\n\t\t\treturn client.connect(options, ctx, new IMqttActionListener() {\n\n\t\t\t\t@Override\n\t\t\t\tpublic void onSuccess(IMqttToken token) {\n\t\t\t\t\tlogger.info(\"{}: {} succeeded\", getClientId(), token.getUserContext());\n\t\t\t\t\tstate.setInProgress(false);\n\t\t\t\t}\n\n\t\t\t\t@Override\n\t\t\t\tpublic void onFailure(IMqttToken token, Throwable throwable) {\n\t\t\t\t\tlogger.warn(\"{}: {} failed due to {}\", getClientId(), token.getUserContext(),\n\t\t\t\t\t\t\tthrowable != null ? throwable.getMessage() : \"?\", throwable);\n\t\t\t\t\tlogger.warn(\"{}: MQTT Client details: {}\", getClientId(), getTahuClientDetails());\n\t\t\t\t\tstate.setInProgress(false);\n\t\t\t\t}\n\n\t\t\t\tprivate String getTahuClientDetails() {\n\t\t\t\t\tStringBuilder sb = new StringBuilder();\n\t\t\t\t\tsb.append(\"MQTT Server Name = \").append(mqttServerName).append(\" :: \");\n\t\t\t\t\tsb.append(\"MQTT Server URL = \").append(mqttServerUrl).append(\" :: \");\n\t\t\t\t\tsb.append(\"MQTT Client ID = \").append(clientId).append(\" :: \");\n\t\t\t\t\tsb.append(\"Using Birth = \").append(birthTopic == null || birthTopic.isEmpty() ? 
\"false\" : \"true\")\n\t\t\t\t\t\t\t.append(\" :: \");\n\t\t\t\t\tsb.append(\"Using LWT = \").append(lwtTopic == null || lwtTopic.isEmpty() ? \"false\" : \"true\");\n\t\t\t\t\treturn sb.toString();\n\t\t\t\t}\n\t\t\t});\n\t\t}\n\t}\n\n\t/**\n\t * A class for tracking the connect in-progress state.\n\t */\n\tprivate class ConnectingState {\n\n\t\tprivate boolean inProgress = false;\n\n\t\tprotected void setInProgress(boolean inProgress) {\n\t\t\tthis.inProgress = inProgress;\n\t\t}\n\n\t\tprotected boolean inProgress() {\n\t\t\treturn this.inProgress;\n\t\t}\n\t}\n\n\t/**\n\t * A Runnable implementation for connecting the client to a broker. Will continue to attempt to connect on failure\n\t * until the client is disconnected (setting the keepConnected flag to false).\n\t */\n\tprotected class ConnectRunnable implements Runnable {\n\n\t\tprivate MqttCallback callback;\n\n\t\tprivate boolean attemptConnects = true;\n\n\t\tpublic ConnectRunnable(final MqttCallback callback) {\n\t\t\tthis.callback = callback;\n\t\t}\n\n\t\tpublic void stopConnectAttempts() {\n\t\t\tattemptConnects = false;\n\t\t}\n\n\t\t@Override\n\t\tpublic void run() {\n\t\t\t// ensure we are disconnected and null\n\t\t\tif (client != null) {\n\t\t\t\ttry {\n\t\t\t\t\tif (client.isConnected()) {\n\t\t\t\t\t\tclient.disconnectForcibly(0, 1, false);\n\t\t\t\t\t\tshutdownConnectionMonitorThread();\n\t\t\t\t\t}\n\t\t\t\t\t// client.setCallback(null);\n\t\t\t\t\tclient.close();\n\t\t\t\t} catch (MqttException e) {\n\t\t\t\t\tlogger.error(\"{}: Error while disconnecting client\", getClientId(), e);\n\t\t\t\t} finally {\n\t\t\t\t\tclient = null;\n\t\t\t\t}\n\t\t\t}\n\n\t\t\ttry {\n\t\t\t\t// Reset re-subscribed flag\n\t\t\t\tresubscribed = false;\n\n\t\t\t\tif (connectOptions == null) {\n\t\t\t\t\tconnectOptions = new 
MqttConnectOptions();\n\t\t\t\t}\n\t\t\t\tconnectOptions.setMqttVersion(MqttConnectOptions.MQTT_VERSION_3_1_1);\n\t\t\t\tconnectOptions.setCleanSession(cleanSession);\n\t\t\t\tconnectOptions.setConnectionTimeout(30);\n\t\t\t\tif (getUsername() != null && !getUsername().trim().isEmpty()) {\n\t\t\t\t\tlogger.debug(\"{}: Setting username to {}\", getClientId(), getUsername());\n\t\t\t\t\tconnectOptions.setUserName(getUsername());\n\t\t\t\t}\n\t\t\t\tif (getPassword() != null && !getPassword().trim().isEmpty()) {\n\t\t\t\t\tlogger.debug(\"{}: Setting password to ****\", getClientId());\n\t\t\t\t\tconnectOptions.setPassword(getPassword().toCharArray());\n\t\t\t\t}\n\t\t\t\tconnectOptions.setKeepAliveInterval(keepAlive);\n\t\t\t\tif (lwtTopic != null) {\n\t\t\t\t\tlogger.debug(\"{}: Setting WILL on {} with retain {}\", getClientId(), lwtTopic, lwtRetain);\n\t\t\t\t\tif (useSparkplugStatePayload) {\n\t\t\t\t\t\tObjectMapper mapper = new ObjectMapper();\n\t\t\t\t\t\tlastStateDeathPayloadTimestamp = new Date().getTime();\n\t\t\t\t\t\tStatePayload statePayload = new StatePayload(false, lastStateDeathPayloadTimestamp);\n\t\t\t\t\t\tbyte[] payload = mapper.writeValueAsString(statePayload).getBytes();\n\t\t\t\t\t\tconnectOptions.setWill(lwtTopic, payload, MqttOperatorDefs.QOS1, lwtRetain);\n\t\t\t\t\t} else {\n\t\t\t\t\t\tconnectOptions.setWill(lwtTopic, lwtPayload, MqttOperatorDefs.QOS1, lwtRetain);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\tconnectOptions.setMaxInflight(getMaxInflightMessages());\n\n\t\t\t\t// Create the client instance\n\t\t\t\tlogger.info(\"{}: Creating the MQTT Client to {} on thread {}\", getClientId(), getMqttServerUrl(),\n\t\t\t\t\t\tThread.currentThread().getName());\n\t\t\t\tclient = new MqttAsyncClient(getMqttServerUrl().toString(), getClientId().toString(), null);\n\n\t\t\t\t// Set the callback handler\n\t\t\t\tclient.setCallback(callback);\n\t\t\t\tIMqttToken connectToken = null;\n\n\t\t\t\t// A time stamp to track the current attempt in case the underlying 
client is stuck attempting forever\n\t\t\t\tlong attemptTimestamp = System.currentTimeMillis();\n\n\t\t\t\tif (autoReconnect) {\n\t\t\t\t\ttry {\n\t\t\t\t\t\twhile (!isConnected() && attemptConnects) {\n\t\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\t\tsynchronized (clientLock) {\n\t\t\t\t\t\t\t\t\tif (!attemptConnects) {\n\t\t\t\t\t\t\t\t\t\tlogger.info(\"{}: No longer attempting to connect\", getClientId());\n\t\t\t\t\t\t\t\t\t\tstate.setInProgress(false);\n\t\t\t\t\t\t\t\t\t\treturn;\n\t\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t\tconnectToken = attemptConnect(client, connectOptions, \"connect with retry\");\n\n\t\t\t\t\t\t\t\t\t// Update time stamp for current attempt\n\t\t\t\t\t\t\t\t\tattemptTimestamp = System.currentTimeMillis();\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t// Sleep for the connect retry interval\n\t\t\t\t\t\t\t\tThread.sleep(getConnectRetryInterval());\n\t\t\t\t\t\t\t} catch (InterruptedException ie) {\n\t\t\t\t\t\t\t\tlogger.info(\"{}: Connect thread {} interrupted - giving up\",\n\t\t\t\t\t\t\t\t\t\tgetClientId(), Thread.currentThread().getName());\n\t\t\t\t\t\t\t\treturn;\n\t\t\t\t\t\t\t} catch (MqttException e) {\n\t\t\t\t\t\t\t\tif (e.getReasonCode() == MqttException.REASON_CODE_CONNECT_IN_PROGRESS) {\n\t\t\t\t\t\t\t\t\tif (connectToken != null) {\n\t\t\t\t\t\t\t\t\t\tlogger.debug(\"{}: Still trying to connect - isComplete? {}, sessionPresent? 
{}\",\n\t\t\t\t\t\t\t\t\t\t\t\tgetClientId(), connectToken.isComplete(),\n\t\t\t\t\t\t\t\t\t\t\t\tconnectToken.getSessionPresent());\n\t\t\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\t\t\tlogger.debug(\"{}: Still trying to connect\", getClientId());\n\t\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t\t// Check if the connect attempt has timed out\n\t\t\t\t\t\t\t\t\tif (System.currentTimeMillis() - attemptTimestamp > connectAttemptTimeout) {\n\t\t\t\t\t\t\t\t\t\tsynchronized (clientLock) {\n\t\t\t\t\t\t\t\t\t\t\t// Forcibly close the client\n\t\t\t\t\t\t\t\t\t\t\tlogger.warn(\"{}: Connect attempt has timed out - forcing close\",\n\t\t\t\t\t\t\t\t\t\t\t\t\tgetClientId());\n\t\t\t\t\t\t\t\t\t\t\tclient.close(true);\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\t\t\tThread.sleep(500);\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\t\tlogger.debug(\"{}: Unable to connect due to {}, next connect attempt in {} ms\",\n\t\t\t\t\t\t\t\t\t\t\tgetClientId(), e.getMessage(), getConnectRetryInterval());\n\t\t\t\t\t\t\t\t\tThread.sleep(getConnectRetryInterval());\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tlogger.info(\"{}: MQTT Client connected to {} on thread {}\", getClientId(), getMqttServerUrl(),\n\t\t\t\t\t\t\t\tThread.currentThread().getName());\n\t\t\t\t\t\tstate.setInProgress(false);\n\t\t\t\t\t} catch (InterruptedException ie) {\n\t\t\t\t\t\tlogger.info(\"{}: Connect thread 2 interrupted - giving up\", getClientId());\n\t\t\t\t\t\tstate.setInProgress(false);\n\t\t\t\t\t\treturn;\n\t\t\t\t\t} catch (Throwable throwable) {\n\t\t\t\t\t\tlogException(\n\t\t\t\t\t\t\t\t\"Error while attempting connect (with autoReconnect=true) to \" + getMqttServerUrl(),\n\t\t\t\t\t\t\t\tthrowable);\n\t\t\t\t\t\tstate.setInProgress(false);\n\t\t\t\t\t\tif (autoReconnect && !isConnected() && attemptConnects) {\n\t\t\t\t\t\t\tattemptRecovery();\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t} else {\n\t\t\t\t\ttry {\n\t\t\t\t\t\tsynchronized (clientLock) 
{\n\t\t\t\t\t\t\tif (!attemptConnects) {\n\t\t\t\t\t\t\t\tlogger.info(\"{}: No longer attempting to connect\", getClientId());\n\t\t\t\t\t\t\t\tstate.setInProgress(false);\n\t\t\t\t\t\t\t\treturn;\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t// Attempt to connect\n\t\t\t\t\t\t\tattemptConnect(client, connectOptions, \"connect\");\n\t\t\t\t\t\t}\n\t\t\t\t\t} catch (Throwable throwable) {\n\t\t\t\t\t\tlogException(\n\t\t\t\t\t\t\t\t\"Error while attempting connect (with autoReconnect=false) to \" + getMqttServerUrl(),\n\t\t\t\t\t\t\t\tthrowable);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t} catch (Exception e) {\n\t\t\t\tlogger.error(\"{}: Error while connecting client\", getClientId(), e);\n\t\t\t\tstate.setInProgress(false);\n\t\t\t\tif (autoReconnect && !isConnected() && attemptConnects) {\n\t\t\t\t\tattemptRecovery();\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\tprivate void attemptRecovery() {\n\t\tlogger.warn(\"{}: Connect failed - retrying\", getClientId());\n\t\ttry {\n\t\t\tif (randomStartupDelay != null && randomStartupDelay.isValid()) {\n\t\t\t\tlong randomDelay = randomStartupDelay.getRandomDelay();\n\t\t\t\tlogger.info(\"{}: Sleeping {} before reconnect attempt\", getClientId(), randomDelay);\n\t\t\t\tThread.sleep(randomDelay);\n\t\t\t} else {\n\t\t\t\tThread.sleep(getConnectRetryInterval());\n\t\t\t}\n\t\t} catch (InterruptedException ie) {\n\t\t\tlogger.warn(\"{}: InterruptedException while preparing to reconnect\", getClientId(), ie);\n\t\t\treturn;\n\t\t}\n\t\tif (autoReconnect) {\n\t\t\tconnect();\n\t\t} else {\n\t\t\tlogger.warn(\"{}: AutoReconnect canceled - No longer going to retry\", getClientId());\n\t\t\treturn;\n\t\t}\n\t}\n\n\tprivate class AsyncPublisher implements Runnable {\n\n\t\tprivate String topic;\n\t\tprivate byte[] payload;\n\t\tprivate int qos;\n\t\tprivate boolean retained;\n\n\t\t// Retry params\n\t\tprivate boolean retry = false;\n\t\tprivate long retryDelay;\n\t\tprivate int numAttempts;\n\n\t\tpublic AsyncPublisher(String topic, byte[] payload, int qos, 
boolean retained, boolean retry, long retryDelay,\n\t\t\t\tint numAttempts) {\n\t\t\tthis.topic = topic;\n\t\t\tthis.payload = payload;\n\t\t\tthis.qos = qos;\n\t\t\tthis.retained = retained;\n\t\t\tthis.retry = retry;\n\t\t\tthis.retryDelay = retryDelay;\n\t\t\tthis.numAttempts = numAttempts;\n\t\t}\n\n\t\t@Override\n\t\tpublic void run() {\n\t\t\ttry {\n\t\t\t\tif (retry) {\n\t\t\t\t\tfor (int i = 0; i < numAttempts; i++) {\n\t\t\t\t\t\tif (client == null || !client.isConnected()) {\n\t\t\t\t\t\t\tThread.sleep(retryDelay);\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tlogger.debug(\"{}: Publishing on {}, Payload size = {}\", getClientId(), topic,\n\t\t\t\t\t\t\t\t\tpayload.length);\n\t\t\t\t\t\t\tclient.publish(topic, payload, qos, retained);\n\t\t\t\t\t\t\t// Publish succeeded - don't fall through to the failure path below\n\t\t\t\t\t\t\treturn;\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\n\t\t\t\t\tlogger.error(\"{}: Failed to publish message on {} after {} attempts\", getClientId(), topic,\n\t\t\t\t\t\t\tnumAttempts);\n\t\t\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR,\n\t\t\t\t\t\t\t\"Failed to publish message on \" + topic + \" after \" + numAttempts + \" attempts\");\n\t\t\t\t} else {\n\t\t\t\t\tif (client == null) {\n\t\t\t\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR, \"MQTT client is null\");\n\t\t\t\t\t} else if (client.isConnected()) {\n\t\t\t\t\t\tlogger.debug(\"{}: Publishing on {}, Payload size = {}\", getClientId(), topic, payload.length);\n\t\t\t\t\t\tclient.publish(topic, payload, qos, retained);\n\t\t\t\t\t} else {\n\t\t\t\t\t\tthrow new TahuException(TahuErrorCode.INTERNAL_ERROR, \"MQTT client not connected\");\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t} catch (Exception e) {\n\t\t\t\tlogger.error(\"{}: Failed to publish\", getClientId(), e);\n\t\t\t}\n\t\t}\n\t}\n\n\tprivate void shutdownConnectionMonitorThread() {\n\t\tif (connectionMonitorThread == null) {\n\t\t\tlogger.debug(\"{}: Not shutting down ConnectionMonitorThread - it's null\", getClientId());\n\t\t\treturn;\n\t\t}\n\t\tif (connectionMonitorThread.isAlive()) {\n\t\t\tlogger.debug(\"{}: Shutting 
down ConnectionMonitorThread\", getClientId());\n\t\t\tconnectionMonitorThread.shutdown();\n\t\t\tconnectionMonitorThread = null;\n\t\t} else {\n\t\t\tlogger.debug(\"{}: Not shutting down ConnectionMonitorThread - it's not alive\", getClientId());\n\t\t}\n\t}\n\n\tprivate class ConnectionMonitorThread extends Thread {\n\t\tprivate ConnectionMonitor connectionMonitor;\n\n\t\tpublic ConnectionMonitorThread(ConnectionMonitor connectionMonitor) {\n\t\t\tsuper(connectionMonitor);\n\t\t\tthis.connectionMonitor = connectionMonitor;\n\t\t}\n\n\t\tpublic void shutdown() {\n\t\t\tconnectionMonitor.setKeepRunning(false);\n\t\t\tthis.interrupt();\n\t\t}\n\t}\n\n\tprivate class ConnectionMonitor implements Runnable {\n\n\t\tprivate final MqttAsyncClient monitoredClient;\n\t\tprivate final MqttClientId monitoredClientId;\n\t\tprivate boolean keepRunning = true;\n\n\t\tpublic ConnectionMonitor(MqttAsyncClient client, MqttClientId clientId) {\n\t\t\tthis.monitoredClient = client;\n\t\t\tthis.monitoredClientId = clientId;\n\t\t}\n\n\t\tpublic void setKeepRunning(boolean keepRunning) {\n\t\t\tthis.keepRunning = keepRunning;\n\t\t}\n\n\t\tpublic void run() {\n\t\t\ttry {\n\t\t\t\tint connectionLostCounter = 0;\n\t\t\t\twhile (keepRunning) {\n\t\t\t\t\tsynchronized (clientLock) {\n\t\t\t\t\t\tif (monitoredClient != null) {\n\t\t\t\t\t\t\tif (!monitoredClient.isConnected()) {\n\t\t\t\t\t\t\t\tif (state.inProgress()) {\n\t\t\t\t\t\t\t\t\tlogger.debug(\"{}: ConnectionMonitor - Attempting to connect\", monitoredClientId);\n\t\t\t\t\t\t\t\t\tconnectionLostCounter = 0;\n\t\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\t\tlogger.debug(\"{}: ConnectionMonitor - Not connected, incrementing counter\",\n\t\t\t\t\t\t\t\t\t\t\tmonitoredClientId);\n\t\t\t\t\t\t\t\t\tconnectionLostCounter++;\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\tlogger.trace(\"{}: ConnectionMonitor - Already connected\", monitoredClientId);\n\t\t\t\t\t\t\t\tconnectionLostCounter = 0;\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t} else 
{\n\t\t\t\t\t\t\tlogger.debug(\"{}: ConnectionMonitor - Client is null - Uncaught connectionLost\",\n\t\t\t\t\t\t\t\t\tgetClientId());\n\t\t\t\t\t\t\tconnectionLostCounter = 5;\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\n\t\t\t\t\tif (connectionLostCounter == 5 && callback != null) {\n\t\t\t\t\t\tcallback.connectionLost(mqttServerName, mqttServerUrl, monitoredClientId,\n\t\t\t\t\t\t\t\tnew Throwable(monitoredClientId + \": Uncaught paho disconnect\"));\n\t\t\t\t\t}\n\n\t\t\t\t\ttry {\n\t\t\t\t\t\tThread.sleep(DEFAULT_CONNECT_MONITOR_INTERVAL);\n\t\t\t\t\t} catch (InterruptedException ie) {\n\t\t\t\t\t\tlogger.debug(\"{}: ConnectionMonitor interrupted\", monitoredClientId);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t} catch (Exception e) {\n\t\t\t\tlogger.error(\"{}: ConnectionMonitor failed to keep running\", monitoredClientId, e);\n\t\t\t}\n\t\t}\n\t}\n\n\t@Override\n\tpublic void connectComplete(boolean reconnect, String serverURI) {\n\n\t\t// Check if we are in the process of disconnecting\n\t\tif (disconnectInProgress) {\n\t\t\tlogger.warn(\"{}: Ignoring connect complete to {}, disconnect in progress\", getClientId(), serverURI);\n\t\t\t// This potentially prevents a deadlock situation upon synchronizing on the clientLock below if a disconnect\n\t\t\t// is in progress and waiting on the client.disconnect() call\n\t\t\treturn;\n\t\t}\n\n\t\tsynchronized (clientLock) {\n\t\t\tif (reconnect) {\n\t\t\t\tlogger.debug(\"{}: SUCCESSFULLY RECONNECTED to {}\", getClientId(), getMqttServerUrl());\n\t\t\t}\n\n\t\t\tif (autoReconnect) {\n\t\t\t\tif (connectionMonitorThread == null || !connectionMonitorThread.isAlive()) {\n\t\t\t\t\tconnectionMonitorThread = new ConnectionMonitorThread(new ConnectionMonitor(client, getClientId()));\n\t\t\t\t\tconnectionMonitorThread.start();\n\t\t\t\t}\n\t\t\t}\n\n\t\t\t// The client is connected - renew online date, renew the connect time, clear disconnect 
time\n\t\t\tthis.renewOnlineDate();\n\t\t\tthis.renewConnectTime();\n\t\t\tthis.clearDisconnectTime();\n\n\t\t\tlogger.info(\"{}: Connected to {}\", getClientId(), getMqttServerUrl());\n\n\t\t\t// Call connectComplete() with the callback\n\t\t\tgetCallback().connectComplete(reconnect, getMqttServerName(), getMqttServerUrl(), getClientId());\n\n\t\t\t// Subscribe (or re-subscribe)\n\t\t\tif (!subscriptions.isEmpty()) {\n\t\t\t\t// Build up the arrays of topics and QoS levels\n\t\t\t\tint totalCount = subscriptions.size();\n\t\t\t\tint subscribedCount = 0;\n\t\t\t\tArrayList<String> topicsList = new ArrayList<String>(subscriptions.keySet());\n\n\t\t\t\twhile (subscribedCount < totalCount) {\n\t\t\t\t\tint topicsRemaining = totalCount - subscribedCount;\n\t\t\t\t\t// Don't attempt to subscribe to more than the max topics per subscribe\n\t\t\t\t\tint size = topicsRemaining > maxTopicsPerSubscribe ? maxTopicsPerSubscribe : topicsRemaining;\n\n\t\t\t\t\tString[] topics = new String[size];\n\t\t\t\t\tint[] qosLevels = new int[size];\n\n\t\t\t\t\tfor (int i = 0; i < size; i++) {\n\t\t\t\t\t\tString topic = topicsList.get(i + subscribedCount);\n\t\t\t\t\t\ttopics[i] = topic;\n\t\t\t\t\t\tqosLevels[i] = subscriptions.get(topic);\n\t\t\t\t\t}\n\n\t\t\t\t\tString topicStr = Arrays.toString(topics);\n\t\t\t\t\tString qosStr = Arrays.toString(qosLevels);\n\t\t\t\t\tlogger.debug(\"{}: server {} - Attempting to subscribe on topic {} with QoS={}\", getClientId(),\n\t\t\t\t\t\t\tgetMqttServerName(), topicStr, qosStr);\n\t\t\t\t\ttry {\n\t\t\t\t\t\tclient.subscribe(topics, qosLevels, null, new IMqttActionListener() {\n\t\t\t\t\t\t\t@Override\n\t\t\t\t\t\t\tpublic void onSuccess(IMqttToken asyncActionToken) {\n\t\t\t\t\t\t\t\tint[] grantedQos = asyncActionToken.getGrantedQos();\n\t\t\t\t\t\t\t\tif (Arrays.equals(qosLevels, grantedQos)) {\n\t\t\t\t\t\t\t\t\tlogger.debug(\"{}: server {} - Successfully subscribed on {} on QoS={}\",\n\t\t\t\t\t\t\t\t\t\t\tgetClientId(), getMqttServerName(), 
topicStr, qosStr);\n\t\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\t\t\t\tString grantedQosStr = Arrays.toString(grantedQos);\n\t\t\t\t\t\t\t\t\t\tlogger.error(\"{}: server {} - Failed subscribe on {} granted QoS {} != {}\",\n\t\t\t\t\t\t\t\t\t\t\t\tgetClientId(), getMqttServerName(), topicStr, qosStr, grantedQosStr);\n\n\t\t\t\t\t\t\t\t\t\t// FIXME - remove. This sleep is necessary due to:\n\t\t\t\t\t\t\t\t\t\t// https://github.com/eclipse/paho.mqtt.java/issues/850\n\t\t\t\t\t\t\t\t\t\tThread.sleep(1000);\n\n\t\t\t\t\t\t\t\t\t\tsynchronized (clientLock) {\n\t\t\t\t\t\t\t\t\t\t\t// Force the disconnect and return\n\t\t\t\t\t\t\t\t\t\t\tclient.disconnectForcibly(0, 1, false);\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t\treturn;\n\t\t\t\t\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\t\t\t\t\tlogger.error(\n\t\t\t\t\t\t\t\t\t\t\t\t\"{}: server {} - Failed disconnect on failed subscribe granted QoS\",\n\t\t\t\t\t\t\t\t\t\t\t\tgetClientId(), getMqttServerName(), e);\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t@Override\n\t\t\t\t\t\t\tpublic void onFailure(IMqttToken asyncActionToken, Throwable exception) {\n\t\t\t\t\t\t\t\tsynchronized (clientLock) {\n\t\t\t\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\t\t\t\tlogger.error(\"{}: server {} - Failed to subscribe on {}\",\n\t\t\t\t\t\t\t\t\t\t\t\tgetClientId(), getMqttServerName(), topicStr);\n\t\t\t\t\t\t\t\t\t\tclient.disconnectForcibly(0, 1, false);\n\t\t\t\t\t\t\t\t\t} catch (MqttException e) {\n\t\t\t\t\t\t\t\t\t\tlogger.error(\"{}: server {} - Failed disconnect on failed subscribe\",\n\t\t\t\t\t\t\t\t\t\t\t\tgetClientId(), getMqttServerName(), e);\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t});\n\t\t\t\t\t} catch (MqttException e) {\n\t\t\t\t\t\tlogger.error(\"{}: server {} - Failed to subscribe on {} with QoS={}\", getClientId(),\n\t\t\t\t\t\t\t\tgetMqttServerName(), topicStr, qosStr, e);\n\t\t\t\t\t\tbreak;\n\t\t\t\t\t}\n\n\t\t\t\t\tsubscribedCount += 
size;\n\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tif (trackFirstConnection && !firstConnection) {\n\t\t\t\t\tlogger.warn(\"{}: No subscriptions for {}\", getClientId(), getClientId());\n\t\t\t\t}\n\t\t\t}\n\n\t\t\t// Mark that the client has finished re-subscribing\n\t\t\tresubscribed = true;\n\n\t\t\t// Publish a standard Birth/Death Certificate if a baseTopic has been defined.\n\t\t\tpublishBirthMessage();\n\n\t\t\tfirstConnection = false;\n\t\t}\n\t}\n\n\t/**\n\t * Sets the 'track first connection' flag\n\t * \n\t * @param trackFirstConnection - the 'track first connection' flag as {@link boolean}\n\t */\n\tpublic void setTrackFirstConnection(boolean trackFirstConnection) {\n\t\tsynchronized (clientLock) {\n\t\t\tthis.trackFirstConnection = trackFirstConnection;\n\t\t}\n\t}\n\n\tpublic void publishBirthMessage() {\n\t\tif (birthTopic != null) {\n\t\t\ttry {\n\t\t\t\tlogger.debug(\"{}: Publishing BIRTH on {} with retain {}\", getClientId(), birthTopic, birthRetain);\n\t\t\t\tif (useSparkplugStatePayload) {\n\t\t\t\t\ttry {\n\t\t\t\t\t\tObjectMapper mapper = new ObjectMapper();\n\t\t\t\t\t\tStatePayload statePayload = new StatePayload(true, lastStateDeathPayloadTimestamp);\n\t\t\t\t\t\tbyte[] payload = mapper.writeValueAsString(statePayload).getBytes();\n\t\t\t\t\t\tpublish(birthTopic, payload, MqttOperatorDefs.QOS1, birthRetain);\n\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\tlogger.error(\"{}: Failed to publish the BIRTH message on {}\", getClientId(), birthTopic, e);\n\t\t\t\t\t}\n\t\t\t\t} else {\n\t\t\t\t\tpublish(birthTopic, birthPayload, MqttOperatorDefs.QOS1, birthRetain);\n\t\t\t\t}\n\t\t\t} catch (TahuException ce) {\n\t\t\t\tlogger.error(\"{}: Error in birth topic publish on connect\", getClientId(), ce);\n\t\t\t\ttry {\n\t\t\t\t\tclient.disconnectForcibly(0, 1, false);\n\t\t\t\t} catch (Exception e) {\n\t\t\t\t\tlogger.error(\"{}: Failed to disconnect after failed BIRTH publish\", getClientId(), e);\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\tprivate Date 
getConnectTime() {\n\t\treturn this.connectTime;\n\t}\n\n\tprivate Date getDisconnectTime() {\n\t\treturn this.disconnectTime;\n\t}\n\n\tprivate void clearConnectTime() {\n\t\tthis.connectTime = null;\n\t}\n\n\tprivate void clearDisconnectTime() {\n\t\tthis.disconnectTime = null;\n\t}\n\n\tprivate void renewConnectTime() {\n\t\tthis.connectTime = new Date();\n\t}\n\n\tprivate void renewDisconnectTime() {\n\t\tthis.disconnectTime = new Date();\n\t}\n\n\tprivate long getConnectRetryInterval() {\n\t\treturn connectRetryInterval;\n\t}\n\n\tpublic void setConnectRetryInterval(long connectRetryInterval) {\n\t\tthis.connectRetryInterval = connectRetryInterval;\n\t}\n\n\tprivate long getConnectAttemptTimeout() {\n\t\treturn connectAttemptTimeout;\n\t}\n\n\tpublic void setConnectAttemptTimeout(long connectAttemptTimeout) {\n\t\tthis.connectAttemptTimeout = connectAttemptTimeout;\n\t}\n\n\tpublic boolean isAttemptingConnect() {\n\t\treturn state.inProgress();\n\t}\n\n\tprivate String getErrorMessage(String prefix, Throwable throwable) {\n\t\treturn new StringBuilder(prefix).append(\": \").append(getErrorMessage(throwable)).toString();\n\t}\n\n\tprivate String getErrorMessage(Throwable throwable) {\n\t\tStringBuilder sb = new StringBuilder(throwable.getMessage());\n\t\tThrowable cause = throwable.getCause();\n\t\tif (cause != null) {\n\t\t\tsb.append(\": \").append(getErrorMessage(cause));\n\t\t}\n\t\treturn sb.toString();\n\t}\n\n\tprivate void logException(String message, Throwable throwable) {\n\t\tString errorMessage = getErrorMessage(message, throwable);\n\t\tif (logger.isTraceEnabled()) {\n\t\t\t// Only log the stack trace if trace is enabled\n\t\t\tlogger.error(\"{}: {}\", getClientId(), errorMessage, throwable);\n\t\t} else {\n\t\t\tlogger.error(\"{}: {}\", getClientId(), errorMessage);\n\t\t}\n\t}\n\n\t/*\n\t * This method waits to ensure that the LWT gets published before graceful disconnect.\n\t * It uses the 'keepAlive' to timeout if the lwtDeliveryToken is not 
cleared by the deliveryComplete() \n\t * Paho callback.\n\t */\n\tprivate boolean isLwtDeliveryComplete() {\n\t\tint counter = keepAlive * 4;\n\t\tfor (int i = 0; i < counter; i++) {\n\t\t\ttry {\n\t\t\t\tif (lwtDeliveryToken == null) {\n\t\t\t\t\tlogger.info(\"{}: LWT delivery confirmation - done waiting\", getClientId());\n\t\t\t\t\treturn true;\n\t\t\t\t} else {\n\t\t\t\t\tThread.sleep(250);\n\t\t\t\t}\n\t\t\t} catch (InterruptedException e) {\n\t\t\t\tlogger.warn(\"{}: Interrupted while waiting for LWT\", getClientId());\n\t\t\t}\n\t\t}\n\t\tlwtDeliveryToken = null;\n\t\tlogger.warn(\"{}: LWT delivery confirmation - timeout\", getClientId());\n\t\treturn false;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/protobuf/SparkplugBProto.java",
    "content": "// Generated by the protocol buffer compiler.  DO NOT EDIT!\n// source: sparkplug_b/sparkplug_b.proto\n\npackage org.eclipse.tahu.protobuf;\n\npublic final class SparkplugBProto {\n  private SparkplugBProto() {}\n  public static void registerAllExtensions(\n      com.google.protobuf.ExtensionRegistryLite registry) {\n  }\n\n  public static void registerAllExtensions(\n      com.google.protobuf.ExtensionRegistry registry) {\n    registerAllExtensions(\n        (com.google.protobuf.ExtensionRegistryLite) registry);\n  }\n  /**\n   * <pre>\n   * Indexes of Data Types\n   * </pre>\n   *\n   * Protobuf enum {@code org.eclipse.tahu.protobuf.DataType}\n   */\n  public enum DataType\n      implements com.google.protobuf.ProtocolMessageEnum {\n    /**\n     * <pre>\n     * Unknown placeholder for future expansion.\n     * </pre>\n     *\n     * <code>Unknown = 0;</code>\n     */\n    Unknown(0),\n    /**\n     * <pre>\n     * Basic Types\n     * </pre>\n     *\n     * <code>Int8 = 1;</code>\n     */\n    Int8(1),\n    /**\n     * <code>Int16 = 2;</code>\n     */\n    Int16(2),\n    /**\n     * <code>Int32 = 3;</code>\n     */\n    Int32(3),\n    /**\n     * <code>Int64 = 4;</code>\n     */\n    Int64(4),\n    /**\n     * <code>UInt8 = 5;</code>\n     */\n    UInt8(5),\n    /**\n     * <code>UInt16 = 6;</code>\n     */\n    UInt16(6),\n    /**\n     * <code>UInt32 = 7;</code>\n     */\n    UInt32(7),\n    /**\n     * <code>UInt64 = 8;</code>\n     */\n    UInt64(8),\n    /**\n     * <code>Float = 9;</code>\n     */\n    Float(9),\n    /**\n     * <code>Double = 10;</code>\n     */\n    Double(10),\n    /**\n     * <code>Boolean = 11;</code>\n     */\n    Boolean(11),\n    /**\n     * <code>String = 12;</code>\n     */\n    String(12),\n    /**\n     * <code>DateTime = 13;</code>\n     */\n    DateTime(13),\n    /**\n     * <code>Text = 14;</code>\n     */\n    Text(14),\n    /**\n     * <pre>\n     * Additional Metric Types\n     * </pre>\n     *\n     * 
<code>UUID = 15;</code>\n     */\n    UUID(15),\n    /**\n     * <code>DataSet = 16;</code>\n     */\n    DataSet(16),\n    /**\n     * <code>Bytes = 17;</code>\n     */\n    Bytes(17),\n    /**\n     * <code>File = 18;</code>\n     */\n    File(18),\n    /**\n     * <code>Template = 19;</code>\n     */\n    Template(19),\n    /**\n     * <pre>\n     * Additional PropertyValue Types\n     * </pre>\n     *\n     * <code>PropertySet = 20;</code>\n     */\n    PropertySet(20),\n    /**\n     * <code>PropertySetList = 21;</code>\n     */\n    PropertySetList(21),\n    /**\n     * <pre>\n     * Array Types\n     * </pre>\n     *\n     * <code>Int8Array = 22;</code>\n     */\n    Int8Array(22),\n    /**\n     * <code>Int16Array = 23;</code>\n     */\n    Int16Array(23),\n    /**\n     * <code>Int32Array = 24;</code>\n     */\n    Int32Array(24),\n    /**\n     * <code>Int64Array = 25;</code>\n     */\n    Int64Array(25),\n    /**\n     * <code>UInt8Array = 26;</code>\n     */\n    UInt8Array(26),\n    /**\n     * <code>UInt16Array = 27;</code>\n     */\n    UInt16Array(27),\n    /**\n     * <code>UInt32Array = 28;</code>\n     */\n    UInt32Array(28),\n    /**\n     * <code>UInt64Array = 29;</code>\n     */\n    UInt64Array(29),\n    /**\n     * <code>FloatArray = 30;</code>\n     */\n    FloatArray(30),\n    /**\n     * <code>DoubleArray = 31;</code>\n     */\n    DoubleArray(31),\n    /**\n     * <code>BooleanArray = 32;</code>\n     */\n    BooleanArray(32),\n    /**\n     * <code>StringArray = 33;</code>\n     */\n    StringArray(33),\n    /**\n     * <code>DateTimeArray = 34;</code>\n     */\n    DateTimeArray(34),\n    ;\n\n    /**\n     * <pre>\n     * Unknown placeholder for future expansion.\n     * </pre>\n     *\n     * <code>Unknown = 0;</code>\n     */\n    public static final int Unknown_VALUE = 0;\n    /**\n     * <pre>\n     * Basic Types\n     * </pre>\n     *\n     * <code>Int8 = 1;</code>\n     */\n    public static final int Int8_VALUE = 1;\n    /**\n 
    * <code>Int16 = 2;</code>\n     */\n    public static final int Int16_VALUE = 2;\n    /**\n     * <code>Int32 = 3;</code>\n     */\n    public static final int Int32_VALUE = 3;\n    /**\n     * <code>Int64 = 4;</code>\n     */\n    public static final int Int64_VALUE = 4;\n    /**\n     * <code>UInt8 = 5;</code>\n     */\n    public static final int UInt8_VALUE = 5;\n    /**\n     * <code>UInt16 = 6;</code>\n     */\n    public static final int UInt16_VALUE = 6;\n    /**\n     * <code>UInt32 = 7;</code>\n     */\n    public static final int UInt32_VALUE = 7;\n    /**\n     * <code>UInt64 = 8;</code>\n     */\n    public static final int UInt64_VALUE = 8;\n    /**\n     * <code>Float = 9;</code>\n     */\n    public static final int Float_VALUE = 9;\n    /**\n     * <code>Double = 10;</code>\n     */\n    public static final int Double_VALUE = 10;\n    /**\n     * <code>Boolean = 11;</code>\n     */\n    public static final int Boolean_VALUE = 11;\n    /**\n     * <code>String = 12;</code>\n     */\n    public static final int String_VALUE = 12;\n    /**\n     * <code>DateTime = 13;</code>\n     */\n    public static final int DateTime_VALUE = 13;\n    /**\n     * <code>Text = 14;</code>\n     */\n    public static final int Text_VALUE = 14;\n    /**\n     * <pre>\n     * Additional Metric Types\n     * </pre>\n     *\n     * <code>UUID = 15;</code>\n     */\n    public static final int UUID_VALUE = 15;\n    /**\n     * <code>DataSet = 16;</code>\n     */\n    public static final int DataSet_VALUE = 16;\n    /**\n     * <code>Bytes = 17;</code>\n     */\n    public static final int Bytes_VALUE = 17;\n    /**\n     * <code>File = 18;</code>\n     */\n    public static final int File_VALUE = 18;\n    /**\n     * <code>Template = 19;</code>\n     */\n    public static final int Template_VALUE = 19;\n    /**\n     * <pre>\n     * Additional PropertyValue Types\n     * </pre>\n     *\n     * <code>PropertySet = 20;</code>\n     */\n    public static final int 
PropertySet_VALUE = 20;\n    /**\n     * <code>PropertySetList = 21;</code>\n     */\n    public static final int PropertySetList_VALUE = 21;\n    /**\n     * <pre>\n     * Array Types\n     * </pre>\n     *\n     * <code>Int8Array = 22;</code>\n     */\n    public static final int Int8Array_VALUE = 22;\n    /**\n     * <code>Int16Array = 23;</code>\n     */\n    public static final int Int16Array_VALUE = 23;\n    /**\n     * <code>Int32Array = 24;</code>\n     */\n    public static final int Int32Array_VALUE = 24;\n    /**\n     * <code>Int64Array = 25;</code>\n     */\n    public static final int Int64Array_VALUE = 25;\n    /**\n     * <code>UInt8Array = 26;</code>\n     */\n    public static final int UInt8Array_VALUE = 26;\n    /**\n     * <code>UInt16Array = 27;</code>\n     */\n    public static final int UInt16Array_VALUE = 27;\n    /**\n     * <code>UInt32Array = 28;</code>\n     */\n    public static final int UInt32Array_VALUE = 28;\n    /**\n     * <code>UInt64Array = 29;</code>\n     */\n    public static final int UInt64Array_VALUE = 29;\n    /**\n     * <code>FloatArray = 30;</code>\n     */\n    public static final int FloatArray_VALUE = 30;\n    /**\n     * <code>DoubleArray = 31;</code>\n     */\n    public static final int DoubleArray_VALUE = 31;\n    /**\n     * <code>BooleanArray = 32;</code>\n     */\n    public static final int BooleanArray_VALUE = 32;\n    /**\n     * <code>StringArray = 33;</code>\n     */\n    public static final int StringArray_VALUE = 33;\n    /**\n     * <code>DateTimeArray = 34;</code>\n     */\n    public static final int DateTimeArray_VALUE = 34;\n\n\n    public final int getNumber() {\n      return value;\n    }\n\n    /**\n     * @deprecated Use {@link #forNumber(int)} instead.\n     */\n    @java.lang.Deprecated\n    public static DataType valueOf(int value) {\n      return forNumber(value);\n    }\n\n    public static DataType forNumber(int value) {\n      switch (value) {\n        case 0: return Unknown;\n        
case 1: return Int8;\n        case 2: return Int16;\n        case 3: return Int32;\n        case 4: return Int64;\n        case 5: return UInt8;\n        case 6: return UInt16;\n        case 7: return UInt32;\n        case 8: return UInt64;\n        case 9: return Float;\n        case 10: return Double;\n        case 11: return Boolean;\n        case 12: return String;\n        case 13: return DateTime;\n        case 14: return Text;\n        case 15: return UUID;\n        case 16: return DataSet;\n        case 17: return Bytes;\n        case 18: return File;\n        case 19: return Template;\n        case 20: return PropertySet;\n        case 21: return PropertySetList;\n        case 22: return Int8Array;\n        case 23: return Int16Array;\n        case 24: return Int32Array;\n        case 25: return Int64Array;\n        case 26: return UInt8Array;\n        case 27: return UInt16Array;\n        case 28: return UInt32Array;\n        case 29: return UInt64Array;\n        case 30: return FloatArray;\n        case 31: return DoubleArray;\n        case 32: return BooleanArray;\n        case 33: return StringArray;\n        case 34: return DateTimeArray;\n        default: return null;\n      }\n    }\n\n    public static com.google.protobuf.Internal.EnumLiteMap<DataType>\n        internalGetValueMap() {\n      return internalValueMap;\n    }\n    private static final com.google.protobuf.Internal.EnumLiteMap<\n        DataType> internalValueMap =\n          new com.google.protobuf.Internal.EnumLiteMap<DataType>() {\n            public DataType findValueByNumber(int number) {\n              return DataType.forNumber(number);\n            }\n          };\n\n    public final com.google.protobuf.Descriptors.EnumValueDescriptor\n        getValueDescriptor() {\n      return getDescriptor().getValues().get(ordinal());\n    }\n    public final com.google.protobuf.Descriptors.EnumDescriptor\n        getDescriptorForType() {\n      return getDescriptor();\n    }\n    public 
static final com.google.protobuf.Descriptors.EnumDescriptor\n        getDescriptor() {\n      return org.eclipse.tahu.protobuf.SparkplugBProto.getDescriptor().getEnumTypes().get(0);\n    }\n\n    private static final DataType[] VALUES = values();\n\n    public static DataType valueOf(\n        com.google.protobuf.Descriptors.EnumValueDescriptor desc) {\n      if (desc.getType() != getDescriptor()) {\n        throw new java.lang.IllegalArgumentException(\n          \"EnumValueDescriptor is not for this type.\");\n      }\n      return VALUES[desc.getIndex()];\n    }\n\n    private final int value;\n\n    private DataType(int value) {\n      this.value = value;\n    }\n\n    // @@protoc_insertion_point(enum_scope:org.eclipse.tahu.protobuf.DataType)\n  }\n\n  public interface PayloadOrBuilder extends\n      // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload)\n      com.google.protobuf.GeneratedMessageV3.\n          ExtendableMessageOrBuilder<Payload> {\n\n    /**\n     * <pre>\n     * Timestamp at message sending time\n     * </pre>\n     *\n     * <code>optional uint64 timestamp = 1;</code>\n     */\n    boolean hasTimestamp();\n    /**\n     * <pre>\n     * Timestamp at message sending time\n     * </pre>\n     *\n     * <code>optional uint64 timestamp = 1;</code>\n     */\n    long getTimestamp();\n\n    /**\n     * <pre>\n     * Repeated forever - no limit in Google Protobufs\n     * </pre>\n     *\n     * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n     */\n    java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric> \n        getMetricsList();\n    /**\n     * <pre>\n     * Repeated forever - no limit in Google Protobufs\n     * </pre>\n     *\n     * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n     */\n    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric getMetrics(int index);\n    /**\n     * <pre>\n     * Repeated forever - no limit in 
Google Protobufs\n     * </pre>\n     *\n     * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n     */\n    int getMetricsCount();\n    /**\n     * <pre>\n     * Repeated forever - no limit in Google Protobufs\n     * </pre>\n     *\n     * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n     */\n    java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder> \n        getMetricsOrBuilderList();\n    /**\n     * <pre>\n     * Repeated forever - no limit in Google Protobufs\n     * </pre>\n     *\n     * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n     */\n    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder getMetricsOrBuilder(\n        int index);\n\n    /**\n     * <pre>\n     * Sequence number\n     * </pre>\n     *\n     * <code>optional uint64 seq = 3;</code>\n     */\n    boolean hasSeq();\n    /**\n     * <pre>\n     * Sequence number\n     * </pre>\n     *\n     * <code>optional uint64 seq = 3;</code>\n     */\n    long getSeq();\n\n    /**\n     * <pre>\n     * UUID to track message type in terms of schema definitions\n     * </pre>\n     *\n     * <code>optional string uuid = 4;</code>\n     */\n    boolean hasUuid();\n    /**\n     * <pre>\n     * UUID to track message type in terms of schema definitions\n     * </pre>\n     *\n     * <code>optional string uuid = 4;</code>\n     */\n    java.lang.String getUuid();\n    /**\n     * <pre>\n     * UUID to track message type in terms of schema definitions\n     * </pre>\n     *\n     * <code>optional string uuid = 4;</code>\n     */\n    com.google.protobuf.ByteString\n        getUuidBytes();\n\n    /**\n     * <pre>\n     * To optionally bypass the whole definition above\n     * </pre>\n     *\n     * <code>optional bytes body = 5;</code>\n     */\n    boolean hasBody();\n    /**\n     * <pre>\n     * To optionally bypass the whole definition above\n     * </pre>\n  
   *\n     * <code>optional bytes body = 5;</code>\n     */\n    com.google.protobuf.ByteString getBody();\n  }\n  /**\n   * Protobuf type {@code org.eclipse.tahu.protobuf.Payload}\n   */\n  public  static final class Payload extends\n      com.google.protobuf.GeneratedMessageV3.ExtendableMessage<\n        Payload> implements\n      // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload)\n      PayloadOrBuilder {\n    // Use Payload.newBuilder() to construct.\n    private Payload(com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<org.eclipse.tahu.protobuf.SparkplugBProto.Payload, ?> builder) {\n      super(builder);\n    }\n    private Payload() {\n      timestamp_ = 0L;\n      metrics_ = java.util.Collections.emptyList();\n      seq_ = 0L;\n      uuid_ = \"\";\n      body_ = com.google.protobuf.ByteString.EMPTY;\n    }\n\n    @java.lang.Override\n    public final com.google.protobuf.UnknownFieldSet\n    getUnknownFields() {\n      return this.unknownFields;\n    }\n    private Payload(\n        com.google.protobuf.CodedInputStream input,\n        com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n        throws com.google.protobuf.InvalidProtocolBufferException {\n      this();\n      int mutable_bitField0_ = 0;\n      com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n          com.google.protobuf.UnknownFieldSet.newBuilder();\n      try {\n        boolean done = false;\n        while (!done) {\n          int tag = input.readTag();\n          switch (tag) {\n            case 0:\n              done = true;\n              break;\n            default: {\n              if (!parseUnknownField(input, unknownFields,\n                                     extensionRegistry, tag)) {\n                done = true;\n              }\n              break;\n            }\n            case 8: {\n              bitField0_ |= 0x00000001;\n              timestamp_ = input.readUInt64();\n              break;\n            }\n         
   case 18: {\n              if (!((mutable_bitField0_ & 0x00000002) == 0x00000002)) {\n                metrics_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric>();\n                mutable_bitField0_ |= 0x00000002;\n              }\n              metrics_.add(\n                  input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.PARSER, extensionRegistry));\n              break;\n            }\n            case 24: {\n              bitField0_ |= 0x00000002;\n              seq_ = input.readUInt64();\n              break;\n            }\n            case 34: {\n              com.google.protobuf.ByteString bs = input.readBytes();\n              bitField0_ |= 0x00000004;\n              uuid_ = bs;\n              break;\n            }\n            case 42: {\n              bitField0_ |= 0x00000008;\n              body_ = input.readBytes();\n              break;\n            }\n          }\n        }\n      } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n        throw e.setUnfinishedMessage(this);\n      } catch (java.io.IOException e) {\n        throw new com.google.protobuf.InvalidProtocolBufferException(\n            e).setUnfinishedMessage(this);\n      } finally {\n        if (((mutable_bitField0_ & 0x00000002) == 0x00000002)) {\n          metrics_ = java.util.Collections.unmodifiableList(metrics_);\n        }\n        this.unknownFields = unknownFields.build();\n        makeExtensionsImmutable();\n      }\n    }\n    public static final com.google.protobuf.Descriptors.Descriptor\n        getDescriptor() {\n      return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_descriptor;\n    }\n\n    protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n        internalGetFieldAccessorTable() {\n      return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_fieldAccessorTable\n          
.ensureFieldAccessorsInitialized(\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Builder.class);\n    }\n\n    public interface TemplateOrBuilder extends\n        // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.Template)\n        com.google.protobuf.GeneratedMessageV3.\n            ExtendableMessageOrBuilder<Template> {\n\n      /**\n       * <pre>\n       * The version of the Template to prevent mismatches\n       * </pre>\n       *\n       * <code>optional string version = 1;</code>\n       */\n      boolean hasVersion();\n      /**\n       * <pre>\n       * The version of the Template to prevent mismatches\n       * </pre>\n       *\n       * <code>optional string version = 1;</code>\n       */\n      java.lang.String getVersion();\n      /**\n       * <pre>\n       * The version of the Template to prevent mismatches\n       * </pre>\n       *\n       * <code>optional string version = 1;</code>\n       */\n      com.google.protobuf.ByteString\n          getVersionBytes();\n\n      /**\n       * <pre>\n       * Each metric includes a name, datatype, and optionally a value\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric> \n          getMetricsList();\n      /**\n       * <pre>\n       * Each metric includes a name, datatype, and optionally a value\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric getMetrics(int index);\n      /**\n       * <pre>\n       * Each metric includes a name, datatype, and optionally a value\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      int getMetricsCount();\n 
     /**\n       * <pre>\n       * Each metric includes a name, datatype, and optionally a value\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder> \n          getMetricsOrBuilderList();\n      /**\n       * <pre>\n       * Each metric includes a name, datatype, and optionally a value\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder getMetricsOrBuilder(\n          int index);\n\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n       */\n      java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter> \n          getParametersList();\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter getParameters(int index);\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n       */\n      int getParametersCount();\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n       */\n      java.util.List<? 
extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.ParameterOrBuilder> \n          getParametersOrBuilderList();\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.ParameterOrBuilder getParametersOrBuilder(\n          int index);\n\n      /**\n       * <pre>\n       * Reference to a template if this is extending a Template or an instance - must exist if an instance\n       * </pre>\n       *\n       * <code>optional string template_ref = 4;</code>\n       */\n      boolean hasTemplateRef();\n      /**\n       * <pre>\n       * Reference to a template if this is extending a Template or an instance - must exist if an instance\n       * </pre>\n       *\n       * <code>optional string template_ref = 4;</code>\n       */\n      java.lang.String getTemplateRef();\n      /**\n       * <pre>\n       * Reference to a template if this is extending a Template or an instance - must exist if an instance\n       * </pre>\n       *\n       * <code>optional string template_ref = 4;</code>\n       */\n      com.google.protobuf.ByteString\n          getTemplateRefBytes();\n\n      /**\n       * <code>optional bool is_definition = 5;</code>\n       */\n      boolean hasIsDefinition();\n      /**\n       * <code>optional bool is_definition = 5;</code>\n       */\n      boolean getIsDefinition();\n    }\n    /**\n     * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.Template}\n     */\n    public  static final class Template extends\n        com.google.protobuf.GeneratedMessageV3.ExtendableMessage<\n          Template> implements\n        // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.Template)\n        TemplateOrBuilder {\n      // Use Template.newBuilder() to construct.\n      private 
Template(com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template, ?> builder) {\n        super(builder);\n      }\n      private Template() {\n        version_ = \"\";\n        metrics_ = java.util.Collections.emptyList();\n        parameters_ = java.util.Collections.emptyList();\n        templateRef_ = \"\";\n        isDefinition_ = false;\n      }\n\n      @java.lang.Override\n      public final com.google.protobuf.UnknownFieldSet\n      getUnknownFields() {\n        return this.unknownFields;\n      }\n      private Template(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        this();\n        int mutable_bitField0_ = 0;\n        com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n            com.google.protobuf.UnknownFieldSet.newBuilder();\n        try {\n          boolean done = false;\n          while (!done) {\n            int tag = input.readTag();\n            switch (tag) {\n              case 0:\n                done = true;\n                break;\n              default: {\n                if (!parseUnknownField(input, unknownFields,\n                                       extensionRegistry, tag)) {\n                  done = true;\n                }\n                break;\n              }\n              case 10: {\n                com.google.protobuf.ByteString bs = input.readBytes();\n                bitField0_ |= 0x00000001;\n                version_ = bs;\n                break;\n              }\n              case 18: {\n                if (!((mutable_bitField0_ & 0x00000002) == 0x00000002)) {\n                  metrics_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric>();\n                  mutable_bitField0_ |= 0x00000002;\n                }\n                metrics_.add(\n                    
input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.PARSER, extensionRegistry));\n                break;\n              }\n              case 26: {\n                if (!((mutable_bitField0_ & 0x00000004) == 0x00000004)) {\n                  parameters_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter>();\n                  mutable_bitField0_ |= 0x00000004;\n                }\n                parameters_.add(\n                    input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.PARSER, extensionRegistry));\n                break;\n              }\n              case 34: {\n                com.google.protobuf.ByteString bs = input.readBytes();\n                bitField0_ |= 0x00000002;\n                templateRef_ = bs;\n                break;\n              }\n              case 40: {\n                bitField0_ |= 0x00000004;\n                isDefinition_ = input.readBool();\n                break;\n              }\n            }\n          }\n        } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n          throw e.setUnfinishedMessage(this);\n        } catch (java.io.IOException e) {\n          throw new com.google.protobuf.InvalidProtocolBufferException(\n              e).setUnfinishedMessage(this);\n        } finally {\n          if (((mutable_bitField0_ & 0x00000002) == 0x00000002)) {\n            metrics_ = java.util.Collections.unmodifiableList(metrics_);\n          }\n          if (((mutable_bitField0_ & 0x00000004) == 0x00000004)) {\n            parameters_ = java.util.Collections.unmodifiableList(parameters_);\n          }\n          this.unknownFields = unknownFields.build();\n          makeExtensionsImmutable();\n        }\n      }\n      public static final com.google.protobuf.Descriptors.Descriptor\n          getDescriptor() {\n        return 
org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_descriptor;\n      }\n\n      protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n          internalGetFieldAccessorTable() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_fieldAccessorTable\n            .ensureFieldAccessorsInitialized(\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Builder.class);\n      }\n\n      public interface ParameterOrBuilder extends\n          // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.Template.Parameter)\n          com.google.protobuf.MessageOrBuilder {\n\n        /**\n         * <code>optional string name = 1;</code>\n         */\n        boolean hasName();\n        /**\n         * <code>optional string name = 1;</code>\n         */\n        java.lang.String getName();\n        /**\n         * <code>optional string name = 1;</code>\n         */\n        com.google.protobuf.ByteString\n            getNameBytes();\n\n        /**\n         * <code>optional uint32 type = 2;</code>\n         */\n        boolean hasType();\n        /**\n         * <code>optional uint32 type = 2;</code>\n         */\n        int getType();\n\n        /**\n         * <code>optional uint32 int_value = 3;</code>\n         */\n        boolean hasIntValue();\n        /**\n         * <code>optional uint32 int_value = 3;</code>\n         */\n        int getIntValue();\n\n        /**\n         * <code>optional uint64 long_value = 4;</code>\n         */\n        boolean hasLongValue();\n        /**\n         * <code>optional uint64 long_value = 4;</code>\n         */\n        long getLongValue();\n\n        /**\n         * <code>optional float float_value = 5;</code>\n         */\n        boolean hasFloatValue();\n        /**\n         * <code>optional 
float float_value = 5;</code>\n         */\n        float getFloatValue();\n\n        /**\n         * <code>optional double double_value = 6;</code>\n         */\n        boolean hasDoubleValue();\n        /**\n         * <code>optional double double_value = 6;</code>\n         */\n        double getDoubleValue();\n\n        /**\n         * <code>optional bool boolean_value = 7;</code>\n         */\n        boolean hasBooleanValue();\n        /**\n         * <code>optional bool boolean_value = 7;</code>\n         */\n        boolean getBooleanValue();\n\n        /**\n         * <code>optional string string_value = 8;</code>\n         */\n        boolean hasStringValue();\n        /**\n         * <code>optional string string_value = 8;</code>\n         */\n        java.lang.String getStringValue();\n        /**\n         * <code>optional string string_value = 8;</code>\n         */\n        com.google.protobuf.ByteString\n            getStringValueBytes();\n\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n         */\n        boolean hasExtensionValue();\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n         */\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension getExtensionValue();\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n         */\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtensionOrBuilder getExtensionValueOrBuilder();\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ValueCase getValueCase();\n      }\n      /**\n       * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.Template.Parameter}\n       */\n      public  
static final class Parameter extends\n          com.google.protobuf.GeneratedMessageV3 implements\n          // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.Template.Parameter)\n          ParameterOrBuilder {\n        // Use Parameter.newBuilder() to construct.\n        private Parameter(com.google.protobuf.GeneratedMessageV3.Builder<?> builder) {\n          super(builder);\n        }\n        private Parameter() {\n          name_ = \"\";\n          type_ = 0;\n        }\n\n        @java.lang.Override\n        public final com.google.protobuf.UnknownFieldSet\n        getUnknownFields() {\n          return this.unknownFields;\n        }\n        private Parameter(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          this();\n          int mutable_bitField0_ = 0;\n          com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n              com.google.protobuf.UnknownFieldSet.newBuilder();\n          try {\n            boolean done = false;\n            while (!done) {\n              int tag = input.readTag();\n              switch (tag) {\n                case 0:\n                  done = true;\n                  break;\n                default: {\n                  if (!parseUnknownField(input, unknownFields,\n                                         extensionRegistry, tag)) {\n                    done = true;\n                  }\n                  break;\n                }\n                case 10: {\n                  com.google.protobuf.ByteString bs = input.readBytes();\n                  bitField0_ |= 0x00000001;\n                  name_ = bs;\n                  break;\n                }\n                case 16: {\n                  bitField0_ |= 0x00000002;\n                  type_ = input.readUInt32();\n                  break;\n                }\n          
      case 24: {\n                  valueCase_ = 3;\n                  value_ = input.readUInt32();\n                  break;\n                }\n                case 32: {\n                  valueCase_ = 4;\n                  value_ = input.readUInt64();\n                  break;\n                }\n                case 45: {\n                  valueCase_ = 5;\n                  value_ = input.readFloat();\n                  break;\n                }\n                case 49: {\n                  valueCase_ = 6;\n                  value_ = input.readDouble();\n                  break;\n                }\n                case 56: {\n                  valueCase_ = 7;\n                  value_ = input.readBool();\n                  break;\n                }\n                case 66: {\n                  com.google.protobuf.ByteString bs = input.readBytes();\n                  valueCase_ = 8;\n                  value_ = bs;\n                  break;\n                }\n                case 74: {\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.Builder subBuilder = null;\n                  if (valueCase_ == 9) {\n                    subBuilder = ((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) value_).toBuilder();\n                  }\n                  value_ =\n                      input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.PARSER, extensionRegistry);\n                  if (subBuilder != null) {\n                    subBuilder.mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) value_);\n                    value_ = subBuilder.buildPartial();\n                  }\n                  valueCase_ = 9;\n                  break;\n                }\n              }\n            }\n          } catch 
(com.google.protobuf.InvalidProtocolBufferException e) {\n            throw e.setUnfinishedMessage(this);\n          } catch (java.io.IOException e) {\n            throw new com.google.protobuf.InvalidProtocolBufferException(\n                e).setUnfinishedMessage(this);\n          } finally {\n            this.unknownFields = unknownFields.build();\n            makeExtensionsImmutable();\n          }\n        }\n        public static final com.google.protobuf.Descriptors.Descriptor\n            getDescriptor() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_descriptor;\n        }\n\n        protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n            internalGetFieldAccessorTable() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_fieldAccessorTable\n              .ensureFieldAccessorsInitialized(\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.Builder.class);\n        }\n\n        public interface ParameterValueExtensionOrBuilder extends\n            // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension)\n            com.google.protobuf.GeneratedMessageV3.\n                ExtendableMessageOrBuilder<ParameterValueExtension> {\n        }\n        /**\n         * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension}\n         */\n        public  static final class ParameterValueExtension extends\n            com.google.protobuf.GeneratedMessageV3.ExtendableMessage<\n              ParameterValueExtension> implements\n            // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension)\n      
      ParameterValueExtensionOrBuilder {\n          // Use ParameterValueExtension.newBuilder() to construct.\n          private ParameterValueExtension(com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension, ?> builder) {\n            super(builder);\n          }\n          private ParameterValueExtension() {\n          }\n\n          @java.lang.Override\n          public final com.google.protobuf.UnknownFieldSet\n          getUnknownFields() {\n            return this.unknownFields;\n          }\n          private ParameterValueExtension(\n              com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n            this();\n            com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n                com.google.protobuf.UnknownFieldSet.newBuilder();\n            try {\n              boolean done = false;\n              while (!done) {\n                int tag = input.readTag();\n                switch (tag) {\n                  case 0:\n                    done = true;\n                    break;\n                  default: {\n                    if (!parseUnknownField(input, unknownFields,\n                                           extensionRegistry, tag)) {\n                      done = true;\n                    }\n                    break;\n                  }\n                }\n              }\n            } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n              throw e.setUnfinishedMessage(this);\n            } catch (java.io.IOException e) {\n              throw new com.google.protobuf.InvalidProtocolBufferException(\n                  e).setUnfinishedMessage(this);\n            } finally {\n              this.unknownFields = unknownFields.build();\n              
makeExtensionsImmutable();\n            }\n          }\n          public static final com.google.protobuf.Descriptors.Descriptor\n              getDescriptor() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_descriptor;\n          }\n\n          protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n              internalGetFieldAccessorTable() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_fieldAccessorTable\n                .ensureFieldAccessorsInitialized(\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.Builder.class);\n          }\n\n          private byte memoizedIsInitialized = -1;\n          public final boolean isInitialized() {\n            byte isInitialized = memoizedIsInitialized;\n            if (isInitialized == 1) return true;\n            if (isInitialized == 0) return false;\n\n            if (!extensionsAreInitialized()) {\n              memoizedIsInitialized = 0;\n              return false;\n            }\n            memoizedIsInitialized = 1;\n            return true;\n          }\n\n          public void writeTo(com.google.protobuf.CodedOutputStream output)\n                              throws java.io.IOException {\n            com.google.protobuf.GeneratedMessageV3\n              .ExtendableMessage<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension>.ExtensionWriter\n                extensionWriter = newExtensionWriter();\n            extensionWriter.writeUntil(536870912, output);\n            unknownFields.writeTo(output);\n          }\n\n          public int getSerializedSize() {\n            int size = 
memoizedSize;\n            if (size != -1) return size;\n\n            size = 0;\n            size += extensionsSerializedSize();\n            size += unknownFields.getSerializedSize();\n            memoizedSize = size;\n            return size;\n          }\n\n          private static final long serialVersionUID = 0L;\n          @java.lang.Override\n          public boolean equals(final java.lang.Object obj) {\n            if (obj == this) {\n             return true;\n            }\n            if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension)) {\n              return super.equals(obj);\n            }\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) obj;\n\n            boolean result = true;\n            result = result && unknownFields.equals(other.unknownFields);\n            result = result &&\n                getExtensionFields().equals(other.getExtensionFields());\n            return result;\n          }\n\n          @java.lang.Override\n          public int hashCode() {\n            if (memoizedHashCode != 0) {\n              return memoizedHashCode;\n            }\n            int hash = 41;\n            hash = (19 * hash) + getDescriptorForType().hashCode();\n            hash = hashFields(hash, getExtensionFields());\n            hash = (29 * hash) + unknownFields.hashCode();\n            memoizedHashCode = hash;\n            return hash;\n          }\n\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension parseFrom(\n              com.google.protobuf.ByteString data)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n            return PARSER.parseFrom(data);\n          }\n          public static 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension parseFrom(\n              com.google.protobuf.ByteString data,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n            return PARSER.parseFrom(data, extensionRegistry);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension parseFrom(byte[] data)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n            return PARSER.parseFrom(data);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension parseFrom(\n              byte[] data,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n            return PARSER.parseFrom(data, extensionRegistry);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension parseFrom(java.io.InputStream input)\n              throws java.io.IOException {\n            return com.google.protobuf.GeneratedMessageV3\n                .parseWithIOException(PARSER, input);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension parseFrom(\n              java.io.InputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws java.io.IOException {\n            return com.google.protobuf.GeneratedMessageV3\n                .parseWithIOException(PARSER, input, extensionRegistry);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension parseDelimitedFrom(java.io.InputStream input)\n              throws 
java.io.IOException {\n            return com.google.protobuf.GeneratedMessageV3\n                .parseDelimitedWithIOException(PARSER, input);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension parseDelimitedFrom(\n              java.io.InputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws java.io.IOException {\n            return com.google.protobuf.GeneratedMessageV3\n                .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension parseFrom(\n              com.google.protobuf.CodedInputStream input)\n              throws java.io.IOException {\n            return com.google.protobuf.GeneratedMessageV3\n                .parseWithIOException(PARSER, input);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension parseFrom(\n              com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws java.io.IOException {\n            return com.google.protobuf.GeneratedMessageV3\n                .parseWithIOException(PARSER, input, extensionRegistry);\n          }\n\n          public Builder newBuilderForType() { return newBuilder(); }\n          public static Builder newBuilder() {\n            return DEFAULT_INSTANCE.toBuilder();\n          }\n          public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension prototype) {\n            return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n          }\n          public Builder toBuilder() {\n            return this == DEFAULT_INSTANCE\n                ? 
new Builder() : new Builder().mergeFrom(this);\n          }\n\n          @java.lang.Override\n          protected Builder newBuilderForType(\n              com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n            Builder builder = new Builder(parent);\n            return builder;\n          }\n          /**\n           * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension}\n           */\n          public static final class Builder extends\n              com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension, Builder> implements\n              // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension)\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtensionOrBuilder {\n            public static final com.google.protobuf.Descriptors.Descriptor\n                getDescriptor() {\n              return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_descriptor;\n            }\n\n            protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n                internalGetFieldAccessorTable() {\n              return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_fieldAccessorTable\n                  .ensureFieldAccessorsInitialized(\n                      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.Builder.class);\n            }\n\n            // Construct using 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.newBuilder()\n            private Builder() {\n              maybeForceBuilderInitialization();\n            }\n\n            private Builder(\n                com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n              super(parent);\n              maybeForceBuilderInitialization();\n            }\n            private void maybeForceBuilderInitialization() {\n              if (com.google.protobuf.GeneratedMessageV3\n                      .alwaysUseFieldBuilders) {\n              }\n            }\n            public Builder clear() {\n              super.clear();\n              return this;\n            }\n\n            public com.google.protobuf.Descriptors.Descriptor\n                getDescriptorForType() {\n              return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_descriptor;\n            }\n\n            public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension getDefaultInstanceForType() {\n              return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.getDefaultInstance();\n            }\n\n            public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension build() {\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension result = buildPartial();\n              if (!result.isInitialized()) {\n                throw newUninitializedMessageException(result);\n              }\n              return result;\n            }\n\n            public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension buildPartial() {\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension result = new 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension(this);\n              onBuilt();\n              return result;\n            }\n\n            public Builder clone() {\n              return (Builder) super.clone();\n            }\n            public Builder setField(\n                com.google.protobuf.Descriptors.FieldDescriptor field,\n                Object value) {\n              return (Builder) super.setField(field, value);\n            }\n            public Builder clearField(\n                com.google.protobuf.Descriptors.FieldDescriptor field) {\n              return (Builder) super.clearField(field);\n            }\n            public Builder clearOneof(\n                com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n              return (Builder) super.clearOneof(oneof);\n            }\n            public Builder setRepeatedField(\n                com.google.protobuf.Descriptors.FieldDescriptor field,\n                int index, Object value) {\n              return (Builder) super.setRepeatedField(field, index, value);\n            }\n            public Builder addRepeatedField(\n                com.google.protobuf.Descriptors.FieldDescriptor field,\n                Object value) {\n              return (Builder) super.addRepeatedField(field, value);\n            }\n            public <Type> Builder setExtension(\n                com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension, Type> extension,\n                Type value) {\n              return (Builder) super.setExtension(extension, value);\n            }\n            public <Type> Builder setExtension(\n                com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension, java.util.List<Type>> extension,\n  
              int index, Type value) {\n              return (Builder) super.setExtension(extension, index, value);\n            }\n            public <Type> Builder addExtension(\n                com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension, java.util.List<Type>> extension,\n                Type value) {\n              return (Builder) super.addExtension(extension, value);\n            }\n            public <Type> Builder clearExtension(\n                com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension, ?> extension) {\n              return (Builder) super.clearExtension(extension);\n            }\n            public Builder mergeFrom(com.google.protobuf.Message other) {\n              if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) {\n                return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension)other);\n              } else {\n                super.mergeFrom(other);\n                return this;\n              }\n            }\n\n            public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension other) {\n              if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.getDefaultInstance()) return this;\n              this.mergeExtensionFields(other);\n              this.mergeUnknownFields(other.unknownFields);\n              onChanged();\n              return this;\n            }\n\n            public final boolean isInitialized() {\n              if (!extensionsAreInitialized()) {\n                return false;\n              }\n              return true;\n            
}\n\n            public Builder mergeFrom(\n                com.google.protobuf.CodedInputStream input,\n                com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n                throws java.io.IOException {\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension parsedMessage = null;\n              try {\n                parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n              } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n                parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) e.getUnfinishedMessage();\n                throw e.unwrapIOException();\n              } finally {\n                if (parsedMessage != null) {\n                  mergeFrom(parsedMessage);\n                }\n              }\n              return this;\n            }\n            public final Builder setUnknownFields(\n                final com.google.protobuf.UnknownFieldSet unknownFields) {\n              return super.setUnknownFields(unknownFields);\n            }\n\n            public final Builder mergeUnknownFields(\n                final com.google.protobuf.UnknownFieldSet unknownFields) {\n              return super.mergeUnknownFields(unknownFields);\n            }\n\n\n            // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension)\n          }\n\n          // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension)\n          private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension DEFAULT_INSTANCE;\n          static {\n            DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension();\n          }\n\n          public static 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension getDefaultInstance() {\n            return DEFAULT_INSTANCE;\n          }\n\n          @java.lang.Deprecated public static final com.google.protobuf.Parser<ParameterValueExtension>\n              PARSER = new com.google.protobuf.AbstractParser<ParameterValueExtension>() {\n            public ParameterValueExtension parsePartialFrom(\n                com.google.protobuf.CodedInputStream input,\n                com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n                throws com.google.protobuf.InvalidProtocolBufferException {\n                return new ParameterValueExtension(input, extensionRegistry);\n            }\n          };\n\n          public static com.google.protobuf.Parser<ParameterValueExtension> parser() {\n            return PARSER;\n          }\n\n          @java.lang.Override\n          public com.google.protobuf.Parser<ParameterValueExtension> getParserForType() {\n            return PARSER;\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension getDefaultInstanceForType() {\n            return DEFAULT_INSTANCE;\n          }\n\n        }\n\n        private int bitField0_;\n        private int valueCase_ = 0;\n        private java.lang.Object value_;\n        public enum ValueCase\n            implements com.google.protobuf.Internal.EnumLite {\n          INT_VALUE(3),\n          LONG_VALUE(4),\n          FLOAT_VALUE(5),\n          DOUBLE_VALUE(6),\n          BOOLEAN_VALUE(7),\n          STRING_VALUE(8),\n          EXTENSION_VALUE(9),\n          VALUE_NOT_SET(0);\n          private final int value;\n          private ValueCase(int value) {\n            this.value = value;\n          }\n          /**\n           * @deprecated Use {@link #forNumber(int)} instead.\n           */\n          @java.lang.Deprecated\n          public static ValueCase valueOf(int value) {\n      
      return forNumber(value);\n          }\n\n          public static ValueCase forNumber(int value) {\n            switch (value) {\n              case 3: return INT_VALUE;\n              case 4: return LONG_VALUE;\n              case 5: return FLOAT_VALUE;\n              case 6: return DOUBLE_VALUE;\n              case 7: return BOOLEAN_VALUE;\n              case 8: return STRING_VALUE;\n              case 9: return EXTENSION_VALUE;\n              case 0: return VALUE_NOT_SET;\n              default: return null;\n            }\n          }\n          public int getNumber() {\n            return this.value;\n          }\n        };\n\n        public ValueCase\n        getValueCase() {\n          return ValueCase.forNumber(\n              valueCase_);\n        }\n\n        public static final int NAME_FIELD_NUMBER = 1;\n        private volatile java.lang.Object name_;\n        /**\n         * <code>optional string name = 1;</code>\n         */\n        public boolean hasName() {\n          return ((bitField0_ & 0x00000001) == 0x00000001);\n        }\n        /**\n         * <code>optional string name = 1;</code>\n         */\n        public java.lang.String getName() {\n          java.lang.Object ref = name_;\n          if (ref instanceof java.lang.String) {\n            return (java.lang.String) ref;\n          } else {\n            com.google.protobuf.ByteString bs = \n                (com.google.protobuf.ByteString) ref;\n            java.lang.String s = bs.toStringUtf8();\n            if (bs.isValidUtf8()) {\n              name_ = s;\n            }\n            return s;\n          }\n        }\n        /**\n         * <code>optional string name = 1;</code>\n         */\n        public com.google.protobuf.ByteString\n            getNameBytes() {\n          java.lang.Object ref = name_;\n          if (ref instanceof java.lang.String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n               
     (java.lang.String) ref);\n            name_ = b;\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n\n        public static final int TYPE_FIELD_NUMBER = 2;\n        private int type_;\n        /**\n         * <code>optional uint32 type = 2;</code>\n         */\n        public boolean hasType() {\n          return ((bitField0_ & 0x00000002) == 0x00000002);\n        }\n        /**\n         * <code>optional uint32 type = 2;</code>\n         */\n        public int getType() {\n          return type_;\n        }\n\n        public static final int INT_VALUE_FIELD_NUMBER = 3;\n        /**\n         * <code>optional uint32 int_value = 3;</code>\n         */\n        public boolean hasIntValue() {\n          return valueCase_ == 3;\n        }\n        /**\n         * <code>optional uint32 int_value = 3;</code>\n         */\n        public int getIntValue() {\n          if (valueCase_ == 3) {\n            return (java.lang.Integer) value_;\n          }\n          return 0;\n        }\n\n        public static final int LONG_VALUE_FIELD_NUMBER = 4;\n        /**\n         * <code>optional uint64 long_value = 4;</code>\n         */\n        public boolean hasLongValue() {\n          return valueCase_ == 4;\n        }\n        /**\n         * <code>optional uint64 long_value = 4;</code>\n         */\n        public long getLongValue() {\n          if (valueCase_ == 4) {\n            return (java.lang.Long) value_;\n          }\n          return 0L;\n        }\n\n        public static final int FLOAT_VALUE_FIELD_NUMBER = 5;\n        /**\n         * <code>optional float float_value = 5;</code>\n         */\n        public boolean hasFloatValue() {\n          return valueCase_ == 5;\n        }\n        /**\n         * <code>optional float float_value = 5;</code>\n         */\n        public float getFloatValue() {\n          if (valueCase_ == 5) {\n            return (java.lang.Float) value_;\n         
 }\n          return 0F;\n        }\n\n        public static final int DOUBLE_VALUE_FIELD_NUMBER = 6;\n        /**\n         * <code>optional double double_value = 6;</code>\n         */\n        public boolean hasDoubleValue() {\n          return valueCase_ == 6;\n        }\n        /**\n         * <code>optional double double_value = 6;</code>\n         */\n        public double getDoubleValue() {\n          if (valueCase_ == 6) {\n            return (java.lang.Double) value_;\n          }\n          return 0D;\n        }\n\n        public static final int BOOLEAN_VALUE_FIELD_NUMBER = 7;\n        /**\n         * <code>optional bool boolean_value = 7;</code>\n         */\n        public boolean hasBooleanValue() {\n          return valueCase_ == 7;\n        }\n        /**\n         * <code>optional bool boolean_value = 7;</code>\n         */\n        public boolean getBooleanValue() {\n          if (valueCase_ == 7) {\n            return (java.lang.Boolean) value_;\n          }\n          return false;\n        }\n\n        public static final int STRING_VALUE_FIELD_NUMBER = 8;\n        /**\n         * <code>optional string string_value = 8;</code>\n         */\n        public boolean hasStringValue() {\n          return valueCase_ == 8;\n        }\n        /**\n         * <code>optional string string_value = 8;</code>\n         */\n        public java.lang.String getStringValue() {\n          java.lang.Object ref = \"\";\n          if (valueCase_ == 8) {\n            ref = value_;\n          }\n          if (ref instanceof java.lang.String) {\n            return (java.lang.String) ref;\n          } else {\n            com.google.protobuf.ByteString bs = \n                (com.google.protobuf.ByteString) ref;\n            java.lang.String s = bs.toStringUtf8();\n            if (bs.isValidUtf8() && (valueCase_ == 8)) {\n              value_ = s;\n            }\n            return s;\n          }\n        }\n        /**\n         * <code>optional string string_value 
= 8;</code>\n         */\n        public com.google.protobuf.ByteString\n            getStringValueBytes() {\n          java.lang.Object ref = \"\";\n          if (valueCase_ == 8) {\n            ref = value_;\n          }\n          if (ref instanceof java.lang.String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n                    (java.lang.String) ref);\n            if (valueCase_ == 8) {\n              value_ = b;\n            }\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n\n        public static final int EXTENSION_VALUE_FIELD_NUMBER = 9;\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n         */\n        public boolean hasExtensionValue() {\n          return valueCase_ == 9;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension getExtensionValue() {\n          if (valueCase_ == 9) {\n             return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) value_;\n          }\n          return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.getDefaultInstance();\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtensionOrBuilder getExtensionValueOrBuilder() {\n          if (valueCase_ == 9) {\n             return 
(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) value_;\n          }\n          return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.getDefaultInstance();\n        }\n\n        private byte memoizedIsInitialized = -1;\n        public final boolean isInitialized() {\n          byte isInitialized = memoizedIsInitialized;\n          if (isInitialized == 1) return true;\n          if (isInitialized == 0) return false;\n\n          if (hasExtensionValue()) {\n            if (!getExtensionValue().isInitialized()) {\n              memoizedIsInitialized = 0;\n              return false;\n            }\n          }\n          memoizedIsInitialized = 1;\n          return true;\n        }\n\n        public void writeTo(com.google.protobuf.CodedOutputStream output)\n                            throws java.io.IOException {\n          if (((bitField0_ & 0x00000001) == 0x00000001)) {\n            com.google.protobuf.GeneratedMessageV3.writeString(output, 1, name_);\n          }\n          if (((bitField0_ & 0x00000002) == 0x00000002)) {\n            output.writeUInt32(2, type_);\n          }\n          if (valueCase_ == 3) {\n            output.writeUInt32(\n                3, (int)((java.lang.Integer) value_));\n          }\n          if (valueCase_ == 4) {\n            output.writeUInt64(\n                4, (long)((java.lang.Long) value_));\n          }\n          if (valueCase_ == 5) {\n            output.writeFloat(\n                5, (float)((java.lang.Float) value_));\n          }\n          if (valueCase_ == 6) {\n            output.writeDouble(\n                6, (double)((java.lang.Double) value_));\n          }\n          if (valueCase_ == 7) {\n            output.writeBool(\n                7, (boolean)((java.lang.Boolean) value_));\n          }\n          if (valueCase_ == 8) {\n            com.google.protobuf.GeneratedMessageV3.writeString(output, 8, value_);\n          
}\n          if (valueCase_ == 9) {\n            output.writeMessage(9, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) value_);\n          }\n          unknownFields.writeTo(output);\n        }\n\n        public int getSerializedSize() {\n          int size = memoizedSize;\n          if (size != -1) return size;\n\n          size = 0;\n          if (((bitField0_ & 0x00000001) == 0x00000001)) {\n            size += com.google.protobuf.GeneratedMessageV3.computeStringSize(1, name_);\n          }\n          if (((bitField0_ & 0x00000002) == 0x00000002)) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeUInt32Size(2, type_);\n          }\n          if (valueCase_ == 3) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeUInt32Size(\n                  3, (int)((java.lang.Integer) value_));\n          }\n          if (valueCase_ == 4) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeUInt64Size(\n                  4, (long)((java.lang.Long) value_));\n          }\n          if (valueCase_ == 5) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeFloatSize(\n                  5, (float)((java.lang.Float) value_));\n          }\n          if (valueCase_ == 6) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeDoubleSize(\n                  6, (double)((java.lang.Double) value_));\n          }\n          if (valueCase_ == 7) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeBoolSize(\n                  7, (boolean)((java.lang.Boolean) value_));\n          }\n          if (valueCase_ == 8) {\n            size += com.google.protobuf.GeneratedMessageV3.computeStringSize(8, value_);\n          }\n          if (valueCase_ == 9) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeMessageSize(9, 
(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) value_);\n          }\n          size += unknownFields.getSerializedSize();\n          memoizedSize = size;\n          return size;\n        }\n\n        private static final long serialVersionUID = 0L;\n        @java.lang.Override\n        public boolean equals(final java.lang.Object obj) {\n          if (obj == this) {\n           return true;\n          }\n          if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter)) {\n            return super.equals(obj);\n          }\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter) obj;\n\n          boolean result = true;\n          result = result && (hasName() == other.hasName());\n          if (hasName()) {\n            result = result && getName()\n                .equals(other.getName());\n          }\n          result = result && (hasType() == other.hasType());\n          if (hasType()) {\n            result = result && (getType()\n                == other.getType());\n          }\n          result = result && getValueCase().equals(\n              other.getValueCase());\n          if (!result) return false;\n          switch (valueCase_) {\n            case 3:\n              result = result && (getIntValue()\n                  == other.getIntValue());\n              break;\n            case 4:\n              result = result && (getLongValue()\n                  == other.getLongValue());\n              break;\n            case 5:\n              result = result && (\n                  java.lang.Float.floatToIntBits(getFloatValue())\n                  == java.lang.Float.floatToIntBits(\n                      other.getFloatValue()));\n              break;\n            case 6:\n              result = result && (\n                  java.lang.Double.doubleToLongBits(getDoubleValue())\n  
                == java.lang.Double.doubleToLongBits(\n                      other.getDoubleValue()));\n              break;\n            case 7:\n              result = result && (getBooleanValue()\n                  == other.getBooleanValue());\n              break;\n            case 8:\n              result = result && getStringValue()\n                  .equals(other.getStringValue());\n              break;\n            case 9:\n              result = result && getExtensionValue()\n                  .equals(other.getExtensionValue());\n              break;\n            case 0:\n            default:\n          }\n          result = result && unknownFields.equals(other.unknownFields);\n          return result;\n        }\n\n        @java.lang.Override\n        public int hashCode() {\n          if (memoizedHashCode != 0) {\n            return memoizedHashCode;\n          }\n          int hash = 41;\n          hash = (19 * hash) + getDescriptorForType().hashCode();\n          if (hasName()) {\n            hash = (37 * hash) + NAME_FIELD_NUMBER;\n            hash = (53 * hash) + getName().hashCode();\n          }\n          if (hasType()) {\n            hash = (37 * hash) + TYPE_FIELD_NUMBER;\n            hash = (53 * hash) + getType();\n          }\n          switch (valueCase_) {\n            case 3:\n              hash = (37 * hash) + INT_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + getIntValue();\n              break;\n            case 4:\n              hash = (37 * hash) + LONG_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n                  getLongValue());\n              break;\n            case 5:\n              hash = (37 * hash) + FLOAT_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + java.lang.Float.floatToIntBits(\n                  getFloatValue());\n              break;\n            case 6:\n              hash = (37 * hash) + DOUBLE_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) 
+ com.google.protobuf.Internal.hashLong(\n                  java.lang.Double.doubleToLongBits(getDoubleValue()));\n              break;\n            case 7:\n              hash = (37 * hash) + BOOLEAN_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + com.google.protobuf.Internal.hashBoolean(\n                  getBooleanValue());\n              break;\n            case 8:\n              hash = (37 * hash) + STRING_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + getStringValue().hashCode();\n              break;\n            case 9:\n              hash = (37 * hash) + EXTENSION_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + getExtensionValue().hashCode();\n              break;\n            case 0:\n            default:\n          }\n          hash = (29 * hash) + unknownFields.hashCode();\n          memoizedHashCode = hash;\n          return hash;\n        }\n\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter parseFrom(\n            com.google.protobuf.ByteString data)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter parseFrom(\n            com.google.protobuf.ByteString data,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter parseFrom(byte[] data)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter parseFrom(\n            byte[] data,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n         
   throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter parseFrom(java.io.InputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter parseFrom(\n            java.io.InputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter parseDelimitedFrom(java.io.InputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseDelimitedWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter parseDelimitedFrom(\n            java.io.InputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter parseFrom(\n            com.google.protobuf.CodedInputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter parseFrom(\n            
com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input, extensionRegistry);\n        }\n\n        public Builder newBuilderForType() { return newBuilder(); }\n        public static Builder newBuilder() {\n          return DEFAULT_INSTANCE.toBuilder();\n        }\n        public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter prototype) {\n          return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n        }\n        public Builder toBuilder() {\n          return this == DEFAULT_INSTANCE\n              ? new Builder() : new Builder().mergeFrom(this);\n        }\n\n        @java.lang.Override\n        protected Builder newBuilderForType(\n            com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n          Builder builder = new Builder(parent);\n          return builder;\n        }\n        /**\n         * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.Template.Parameter}\n         */\n        public static final class Builder extends\n            com.google.protobuf.GeneratedMessageV3.Builder<Builder> implements\n            // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.Template.Parameter)\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.ParameterOrBuilder {\n          public static final com.google.protobuf.Descriptors.Descriptor\n              getDescriptor() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_descriptor;\n          }\n\n          protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n              internalGetFieldAccessorTable() {\n            return 
org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_fieldAccessorTable\n                .ensureFieldAccessorsInitialized(\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.Builder.class);\n          }\n\n          // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.newBuilder()\n          private Builder() {\n            maybeForceBuilderInitialization();\n          }\n\n          private Builder(\n              com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n            super(parent);\n            maybeForceBuilderInitialization();\n          }\n          private void maybeForceBuilderInitialization() {\n            if (com.google.protobuf.GeneratedMessageV3\n                    .alwaysUseFieldBuilders) {\n            }\n          }\n          public Builder clear() {\n            super.clear();\n            name_ = \"\";\n            bitField0_ = (bitField0_ & ~0x00000001);\n            type_ = 0;\n            bitField0_ = (bitField0_ & ~0x00000002);\n            valueCase_ = 0;\n            value_ = null;\n            return this;\n          }\n\n          public com.google.protobuf.Descriptors.Descriptor\n              getDescriptorForType() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_descriptor;\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter getDefaultInstanceForType() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.getDefaultInstance();\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter build() {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter result = buildPartial();\n 
           if (!result.isInitialized()) {\n              throw newUninitializedMessageException(result);\n            }\n            return result;\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter buildPartial() {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter(this);\n            int from_bitField0_ = bitField0_;\n            int to_bitField0_ = 0;\n            if (((from_bitField0_ & 0x00000001) == 0x00000001)) {\n              to_bitField0_ |= 0x00000001;\n            }\n            result.name_ = name_;\n            if (((from_bitField0_ & 0x00000002) == 0x00000002)) {\n              to_bitField0_ |= 0x00000002;\n            }\n            result.type_ = type_;\n            if (valueCase_ == 3) {\n              result.value_ = value_;\n            }\n            if (valueCase_ == 4) {\n              result.value_ = value_;\n            }\n            if (valueCase_ == 5) {\n              result.value_ = value_;\n            }\n            if (valueCase_ == 6) {\n              result.value_ = value_;\n            }\n            if (valueCase_ == 7) {\n              result.value_ = value_;\n            }\n            if (valueCase_ == 8) {\n              result.value_ = value_;\n            }\n            if (valueCase_ == 9) {\n              if (extensionValueBuilder_ == null) {\n                result.value_ = value_;\n              } else {\n                result.value_ = extensionValueBuilder_.build();\n              }\n            }\n            result.bitField0_ = to_bitField0_;\n            result.valueCase_ = valueCase_;\n            onBuilt();\n            return result;\n          }\n\n          public Builder clone() {\n            return (Builder) super.clone();\n          }\n          public Builder setField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n     
         Object value) {\n            return (Builder) super.setField(field, value);\n          }\n          public Builder clearField(\n              com.google.protobuf.Descriptors.FieldDescriptor field) {\n            return (Builder) super.clearField(field);\n          }\n          public Builder clearOneof(\n              com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n            return (Builder) super.clearOneof(oneof);\n          }\n          public Builder setRepeatedField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              int index, Object value) {\n            return (Builder) super.setRepeatedField(field, index, value);\n          }\n          public Builder addRepeatedField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              Object value) {\n            return (Builder) super.addRepeatedField(field, value);\n          }\n          public Builder mergeFrom(com.google.protobuf.Message other) {\n            if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter) {\n              return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter)other);\n            } else {\n              super.mergeFrom(other);\n              return this;\n            }\n          }\n\n          public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter other) {\n            if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.getDefaultInstance()) return this;\n            if (other.hasName()) {\n              bitField0_ |= 0x00000001;\n              name_ = other.name_;\n              onChanged();\n            }\n            if (other.hasType()) {\n              setType(other.getType());\n            }\n            switch (other.getValueCase()) {\n              case INT_VALUE: {\n                setIntValue(other.getIntValue());\n                break;\n              }\n     
         case LONG_VALUE: {\n                setLongValue(other.getLongValue());\n                break;\n              }\n              case FLOAT_VALUE: {\n                setFloatValue(other.getFloatValue());\n                break;\n              }\n              case DOUBLE_VALUE: {\n                setDoubleValue(other.getDoubleValue());\n                break;\n              }\n              case BOOLEAN_VALUE: {\n                setBooleanValue(other.getBooleanValue());\n                break;\n              }\n              case STRING_VALUE: {\n                valueCase_ = 8;\n                value_ = other.value_;\n                onChanged();\n                break;\n              }\n              case EXTENSION_VALUE: {\n                mergeExtensionValue(other.getExtensionValue());\n                break;\n              }\n              case VALUE_NOT_SET: {\n                break;\n              }\n            }\n            this.mergeUnknownFields(other.unknownFields);\n            onChanged();\n            return this;\n          }\n\n          public final boolean isInitialized() {\n            if (hasExtensionValue()) {\n              if (!getExtensionValue().isInitialized()) {\n                return false;\n              }\n            }\n            return true;\n          }\n\n          public Builder mergeFrom(\n              com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws java.io.IOException {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter parsedMessage = null;\n            try {\n              parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n            } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n              parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter) e.getUnfinishedMessage();\n              throw e.unwrapIOException();\n       
     } finally {\n              if (parsedMessage != null) {\n                mergeFrom(parsedMessage);\n              }\n            }\n            return this;\n          }\n          private int valueCase_ = 0;\n          private java.lang.Object value_;\n          public ValueCase\n              getValueCase() {\n            return ValueCase.forNumber(\n                valueCase_);\n          }\n\n          public Builder clearValue() {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n            return this;\n          }\n\n          private int bitField0_;\n\n          private java.lang.Object name_ = \"\";\n          /**\n           * <code>optional string name = 1;</code>\n           */\n          public boolean hasName() {\n            return ((bitField0_ & 0x00000001) == 0x00000001);\n          }\n          /**\n           * <code>optional string name = 1;</code>\n           */\n          public java.lang.String getName() {\n            java.lang.Object ref = name_;\n            if (!(ref instanceof java.lang.String)) {\n              com.google.protobuf.ByteString bs =\n                  (com.google.protobuf.ByteString) ref;\n              java.lang.String s = bs.toStringUtf8();\n              if (bs.isValidUtf8()) {\n                name_ = s;\n              }\n              return s;\n            } else {\n              return (java.lang.String) ref;\n            }\n          }\n          /**\n           * <code>optional string name = 1;</code>\n           */\n          public com.google.protobuf.ByteString\n              getNameBytes() {\n            java.lang.Object ref = name_;\n            if (ref instanceof String) {\n              com.google.protobuf.ByteString b = \n                  com.google.protobuf.ByteString.copyFromUtf8(\n                      (java.lang.String) ref);\n              name_ = b;\n              return b;\n            } else {\n              return (com.google.protobuf.ByteString) ref;\n     
       }\n          }\n          /**\n           * <code>optional string name = 1;</code>\n           */\n          public Builder setName(\n              java.lang.String value) {\n            if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000001;\n            name_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional string name = 1;</code>\n           */\n          public Builder clearName() {\n            bitField0_ = (bitField0_ & ~0x00000001);\n            name_ = getDefaultInstance().getName();\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional string name = 1;</code>\n           */\n          public Builder setNameBytes(\n              com.google.protobuf.ByteString value) {\n            if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000001;\n            name_ = value;\n            onChanged();\n            return this;\n          }\n\n          private int type_ ;\n          /**\n           * <code>optional uint32 type = 2;</code>\n           */\n          public boolean hasType() {\n            return ((bitField0_ & 0x00000002) == 0x00000002);\n          }\n          /**\n           * <code>optional uint32 type = 2;</code>\n           */\n          public int getType() {\n            return type_;\n          }\n          /**\n           * <code>optional uint32 type = 2;</code>\n           */\n          public Builder setType(int value) {\n            bitField0_ |= 0x00000002;\n            type_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional uint32 type = 2;</code>\n           */\n          public Builder clearType() {\n            bitField0_ = (bitField0_ & ~0x00000002);\n            type_ = 0;\n            onChanged();\n            return this;\n          }\n\n          /**\n           * 
<code>optional uint32 int_value = 3;</code>\n           */\n          public boolean hasIntValue() {\n            return valueCase_ == 3;\n          }\n          /**\n           * <code>optional uint32 int_value = 3;</code>\n           */\n          public int getIntValue() {\n            if (valueCase_ == 3) {\n              return (java.lang.Integer) value_;\n            }\n            return 0;\n          }\n          /**\n           * <code>optional uint32 int_value = 3;</code>\n           */\n          public Builder setIntValue(int value) {\n            valueCase_ = 3;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional uint32 int_value = 3;</code>\n           */\n          public Builder clearIntValue() {\n            if (valueCase_ == 3) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n            return this;\n          }\n\n          /**\n           * <code>optional uint64 long_value = 4;</code>\n           */\n          public boolean hasLongValue() {\n            return valueCase_ == 4;\n          }\n          /**\n           * <code>optional uint64 long_value = 4;</code>\n           */\n          public long getLongValue() {\n            if (valueCase_ == 4) {\n              return (java.lang.Long) value_;\n            }\n            return 0L;\n          }\n          /**\n           * <code>optional uint64 long_value = 4;</code>\n           */\n          public Builder setLongValue(long value) {\n            valueCase_ = 4;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional uint64 long_value = 4;</code>\n           */\n          public Builder clearLongValue() {\n            if (valueCase_ == 4) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n            return 
this;\n          }\n\n          /**\n           * <code>optional float float_value = 5;</code>\n           */\n          public boolean hasFloatValue() {\n            return valueCase_ == 5;\n          }\n          /**\n           * <code>optional float float_value = 5;</code>\n           */\n          public float getFloatValue() {\n            if (valueCase_ == 5) {\n              return (java.lang.Float) value_;\n            }\n            return 0F;\n          }\n          /**\n           * <code>optional float float_value = 5;</code>\n           */\n          public Builder setFloatValue(float value) {\n            valueCase_ = 5;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional float float_value = 5;</code>\n           */\n          public Builder clearFloatValue() {\n            if (valueCase_ == 5) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n            return this;\n          }\n\n          /**\n           * <code>optional double double_value = 6;</code>\n           */\n          public boolean hasDoubleValue() {\n            return valueCase_ == 6;\n          }\n          /**\n           * <code>optional double double_value = 6;</code>\n           */\n          public double getDoubleValue() {\n            if (valueCase_ == 6) {\n              return (java.lang.Double) value_;\n            }\n            return 0D;\n          }\n          /**\n           * <code>optional double double_value = 6;</code>\n           */\n          public Builder setDoubleValue(double value) {\n            valueCase_ = 6;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional double double_value = 6;</code>\n           */\n          public Builder clearDoubleValue() {\n            if (valueCase_ == 6) {\n              valueCase_ = 0;\n           
   value_ = null;\n              onChanged();\n            }\n            return this;\n          }\n\n          /**\n           * <code>optional bool boolean_value = 7;</code>\n           */\n          public boolean hasBooleanValue() {\n            return valueCase_ == 7;\n          }\n          /**\n           * <code>optional bool boolean_value = 7;</code>\n           */\n          public boolean getBooleanValue() {\n            if (valueCase_ == 7) {\n              return (java.lang.Boolean) value_;\n            }\n            return false;\n          }\n          /**\n           * <code>optional bool boolean_value = 7;</code>\n           */\n          public Builder setBooleanValue(boolean value) {\n            valueCase_ = 7;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional bool boolean_value = 7;</code>\n           */\n          public Builder clearBooleanValue() {\n            if (valueCase_ == 7) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n            return this;\n          }\n\n          /**\n           * <code>optional string string_value = 8;</code>\n           */\n          public boolean hasStringValue() {\n            return valueCase_ == 8;\n          }\n          /**\n           * <code>optional string string_value = 8;</code>\n           */\n          public java.lang.String getStringValue() {\n            java.lang.Object ref = \"\";\n            if (valueCase_ == 8) {\n              ref = value_;\n            }\n            if (!(ref instanceof java.lang.String)) {\n              com.google.protobuf.ByteString bs =\n                  (com.google.protobuf.ByteString) ref;\n              java.lang.String s = bs.toStringUtf8();\n              if (valueCase_ == 8) {\n                if (bs.isValidUtf8()) {\n                  value_ = s;\n                }\n              }\n              return s;\n 
           } else {\n              return (java.lang.String) ref;\n            }\n          }\n          /**\n           * <code>optional string string_value = 8;</code>\n           */\n          public com.google.protobuf.ByteString\n              getStringValueBytes() {\n            java.lang.Object ref = \"\";\n            if (valueCase_ == 8) {\n              ref = value_;\n            }\n            if (ref instanceof String) {\n              com.google.protobuf.ByteString b = \n                  com.google.protobuf.ByteString.copyFromUtf8(\n                      (java.lang.String) ref);\n              if (valueCase_ == 8) {\n                value_ = b;\n              }\n              return b;\n            } else {\n              return (com.google.protobuf.ByteString) ref;\n            }\n          }\n          /**\n           * <code>optional string string_value = 8;</code>\n           */\n          public Builder setStringValue(\n              java.lang.String value) {\n            if (value == null) {\n    throw new NullPointerException();\n  }\n  valueCase_ = 8;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional string string_value = 8;</code>\n           */\n          public Builder clearStringValue() {\n            if (valueCase_ == 8) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n            return this;\n          }\n          /**\n           * <code>optional string string_value = 8;</code>\n           */\n          public Builder setStringValueBytes(\n              com.google.protobuf.ByteString value) {\n            if (value == null) {\n    throw new NullPointerException();\n  }\n  valueCase_ = 8;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n\n          private com.google.protobuf.SingleFieldBuilderV3<\n              
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtensionOrBuilder> extensionValueBuilder_;\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n           */\n          public boolean hasExtensionValue() {\n            return valueCase_ == 9;\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n           */\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension getExtensionValue() {\n            if (extensionValueBuilder_ == null) {\n              if (valueCase_ == 9) {\n                return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) value_;\n              }\n              return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.getDefaultInstance();\n            } else {\n              if (valueCase_ == 9) {\n                return extensionValueBuilder_.getMessage();\n              }\n              return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.getDefaultInstance();\n            }\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n           */\n          public Builder setExtensionValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension value) {\n            if (extensionValueBuilder_ == null) {\n              if (value == null) {\n                throw new 
NullPointerException();\n              }\n              value_ = value;\n              onChanged();\n            } else {\n              extensionValueBuilder_.setMessage(value);\n            }\n            valueCase_ = 9;\n            return this;\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n           */\n          public Builder setExtensionValue(\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.Builder builderForValue) {\n            if (extensionValueBuilder_ == null) {\n              value_ = builderForValue.build();\n              onChanged();\n            } else {\n              extensionValueBuilder_.setMessage(builderForValue.build());\n            }\n            valueCase_ = 9;\n            return this;\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n           */\n          public Builder mergeExtensionValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension value) {\n            if (extensionValueBuilder_ == null) {\n              if (valueCase_ == 9 &&\n                  value_ != org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.getDefaultInstance()) {\n                value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.newBuilder((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) value_)\n                    .mergeFrom(value).buildPartial();\n              } else {\n                value_ = value;\n              }\n              onChanged();\n            } else {\n              if (valueCase_ == 9) {\n                extensionValueBuilder_.mergeFrom(value);\n              }\n           
   extensionValueBuilder_.setMessage(value);\n            }\n            valueCase_ = 9;\n            return this;\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n           */\n          public Builder clearExtensionValue() {\n            if (extensionValueBuilder_ == null) {\n              if (valueCase_ == 9) {\n                valueCase_ = 0;\n                value_ = null;\n                onChanged();\n              }\n            } else {\n              if (valueCase_ == 9) {\n                valueCase_ = 0;\n                value_ = null;\n              }\n              extensionValueBuilder_.clear();\n            }\n            return this;\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n           */\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.Builder getExtensionValueBuilder() {\n            return getExtensionValueFieldBuilder().getBuilder();\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n           */\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtensionOrBuilder getExtensionValueOrBuilder() {\n            if ((valueCase_ == 9) && (extensionValueBuilder_ != null)) {\n              return extensionValueBuilder_.getMessageOrBuilder();\n            } else {\n              if (valueCase_ == 9) {\n                return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) value_;\n              }\n              return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.getDefaultInstance();\n            }\n      
    }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension extension_value = 9;</code>\n           */\n          private com.google.protobuf.SingleFieldBuilderV3<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtensionOrBuilder> \n              getExtensionValueFieldBuilder() {\n            if (extensionValueBuilder_ == null) {\n              if (!(valueCase_ == 9)) {\n                value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.getDefaultInstance();\n              }\n              extensionValueBuilder_ = new com.google.protobuf.SingleFieldBuilderV3<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtensionOrBuilder>(\n                      (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.ParameterValueExtension) value_,\n                      getParentForChildren(),\n                      isClean());\n              value_ = null;\n            }\n            valueCase_ = 9;\n            onChanged();;\n            return extensionValueBuilder_;\n          }\n          public final Builder setUnknownFields(\n              final com.google.protobuf.UnknownFieldSet unknownFields) {\n            return super.setUnknownFields(unknownFields);\n          }\n\n          public final Builder mergeUnknownFields(\n              final com.google.protobuf.UnknownFieldSet unknownFields) {\n            return 
super.mergeUnknownFields(unknownFields);\n          }\n\n\n          // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.Template.Parameter)\n        }\n\n        // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.Template.Parameter)\n        private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter DEFAULT_INSTANCE;\n        static {\n          DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter();\n        }\n\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter getDefaultInstance() {\n          return DEFAULT_INSTANCE;\n        }\n\n        @java.lang.Deprecated public static final com.google.protobuf.Parser<Parameter>\n            PARSER = new com.google.protobuf.AbstractParser<Parameter>() {\n          public Parameter parsePartialFrom(\n              com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n              return new Parameter(input, extensionRegistry);\n          }\n        };\n\n        public static com.google.protobuf.Parser<Parameter> parser() {\n          return PARSER;\n        }\n\n        @java.lang.Override\n        public com.google.protobuf.Parser<Parameter> getParserForType() {\n          return PARSER;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter getDefaultInstanceForType() {\n          return DEFAULT_INSTANCE;\n        }\n\n      }\n\n      private int bitField0_;\n      public static final int VERSION_FIELD_NUMBER = 1;\n      private volatile java.lang.Object version_;\n      /**\n       * <pre>\n       * The version of the Template to prevent mismatches\n       * </pre>\n       *\n       * <code>optional string version = 1;</code>\n       */\n      public boolean hasVersion() {\n       
 return ((bitField0_ & 0x00000001) == 0x00000001);\n      }\n      /**\n       * <pre>\n       * The version of the Template to prevent mismatches\n       * </pre>\n       *\n       * <code>optional string version = 1;</code>\n       */\n      public java.lang.String getVersion() {\n        java.lang.Object ref = version_;\n        if (ref instanceof java.lang.String) {\n          return (java.lang.String) ref;\n        } else {\n          com.google.protobuf.ByteString bs = \n              (com.google.protobuf.ByteString) ref;\n          java.lang.String s = bs.toStringUtf8();\n          if (bs.isValidUtf8()) {\n            version_ = s;\n          }\n          return s;\n        }\n      }\n      /**\n       * <pre>\n       * The version of the Template to prevent mismatches\n       * </pre>\n       *\n       * <code>optional string version = 1;</code>\n       */\n      public com.google.protobuf.ByteString\n          getVersionBytes() {\n        java.lang.Object ref = version_;\n        if (ref instanceof java.lang.String) {\n          com.google.protobuf.ByteString b = \n              com.google.protobuf.ByteString.copyFromUtf8(\n                  (java.lang.String) ref);\n          version_ = b;\n          return b;\n        } else {\n          return (com.google.protobuf.ByteString) ref;\n        }\n      }\n\n      public static final int METRICS_FIELD_NUMBER = 2;\n      private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric> metrics_;\n      /**\n       * <pre>\n       * Each metric includes a name, datatype, and optionally a value\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric> getMetricsList() {\n        return metrics_;\n      }\n      /**\n       * <pre>\n       * Each metric includes a name, datatype, and optionally a value\n       * </pre>\n       *\n       * 
<code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder> \n          getMetricsOrBuilderList() {\n        return metrics_;\n      }\n      /**\n       * <pre>\n       * Each metric includes a name, datatype, and optionally a value\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public int getMetricsCount() {\n        return metrics_.size();\n      }\n      /**\n       * <pre>\n       * Each metric includes a name, datatype, and optionally a value\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric getMetrics(int index) {\n        return metrics_.get(index);\n      }\n      /**\n       * <pre>\n       * Each metric includes a name, datatype, and optionally a value\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder getMetricsOrBuilder(\n          int index) {\n        return metrics_.get(index);\n      }\n\n      public static final int PARAMETERS_FIELD_NUMBER = 3;\n      private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter> parameters_;\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n       */\n      public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter> getParametersList() {\n        return parameters_;\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n       */\n      public java.util.List<? 
extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.ParameterOrBuilder> \n          getParametersOrBuilderList() {\n        return parameters_;\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n       */\n      public int getParametersCount() {\n        return parameters_.size();\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter getParameters(int index) {\n        return parameters_.get(index);\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.ParameterOrBuilder getParametersOrBuilder(\n          int index) {\n        return parameters_.get(index);\n      }\n\n      public static final int TEMPLATE_REF_FIELD_NUMBER = 4;\n      private volatile java.lang.Object templateRef_;\n      /**\n       * <pre>\n       * Reference to a template if this is extending a Template or an instance - must exist if an instance\n       * </pre>\n       *\n       * <code>optional string template_ref = 4;</code>\n       */\n      public boolean hasTemplateRef() {\n        return ((bitField0_ & 0x00000002) == 0x00000002);\n      }\n      /**\n       * <pre>\n       * Reference to a template if this is extending a Template or an instance - must exist if an instance\n       * </pre>\n       *\n       * <code>optional string template_ref = 4;</code>\n       */\n      public java.lang.String getTemplateRef() {\n        java.lang.Object ref = templateRef_;\n        if (ref instanceof java.lang.String) {\n          return (java.lang.String) ref;\n        } else {\n          com.google.protobuf.ByteString bs = \n              (com.google.protobuf.ByteString) ref;\n          java.lang.String s 
= bs.toStringUtf8();\n          if (bs.isValidUtf8()) {\n            templateRef_ = s;\n          }\n          return s;\n        }\n      }\n      /**\n       * <pre>\n       * Reference to a template if this is extending a Template or an instance - must exist if an instance\n       * </pre>\n       *\n       * <code>optional string template_ref = 4;</code>\n       */\n      public com.google.protobuf.ByteString\n          getTemplateRefBytes() {\n        java.lang.Object ref = templateRef_;\n        if (ref instanceof java.lang.String) {\n          com.google.protobuf.ByteString b = \n              com.google.protobuf.ByteString.copyFromUtf8(\n                  (java.lang.String) ref);\n          templateRef_ = b;\n          return b;\n        } else {\n          return (com.google.protobuf.ByteString) ref;\n        }\n      }\n\n      public static final int IS_DEFINITION_FIELD_NUMBER = 5;\n      private boolean isDefinition_;\n      /**\n       * <code>optional bool is_definition = 5;</code>\n       */\n      public boolean hasIsDefinition() {\n        return ((bitField0_ & 0x00000004) == 0x00000004);\n      }\n      /**\n       * <code>optional bool is_definition = 5;</code>\n       */\n      public boolean getIsDefinition() {\n        return isDefinition_;\n      }\n\n      private byte memoizedIsInitialized = -1;\n      public final boolean isInitialized() {\n        byte isInitialized = memoizedIsInitialized;\n        if (isInitialized == 1) return true;\n        if (isInitialized == 0) return false;\n\n        for (int i = 0; i < getMetricsCount(); i++) {\n          if (!getMetrics(i).isInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        for (int i = 0; i < getParametersCount(); i++) {\n          if (!getParameters(i).isInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        if (!extensionsAreInitialized()) {\n          
memoizedIsInitialized = 0;\n          return false;\n        }\n        memoizedIsInitialized = 1;\n        return true;\n      }\n\n      public void writeTo(com.google.protobuf.CodedOutputStream output)\n                          throws java.io.IOException {\n        com.google.protobuf.GeneratedMessageV3\n          .ExtendableMessage<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template>.ExtensionWriter\n            extensionWriter = newExtensionWriter();\n        if (((bitField0_ & 0x00000001) == 0x00000001)) {\n          com.google.protobuf.GeneratedMessageV3.writeString(output, 1, version_);\n        }\n        for (int i = 0; i < metrics_.size(); i++) {\n          output.writeMessage(2, metrics_.get(i));\n        }\n        for (int i = 0; i < parameters_.size(); i++) {\n          output.writeMessage(3, parameters_.get(i));\n        }\n        if (((bitField0_ & 0x00000002) == 0x00000002)) {\n          com.google.protobuf.GeneratedMessageV3.writeString(output, 4, templateRef_);\n        }\n        if (((bitField0_ & 0x00000004) == 0x00000004)) {\n          output.writeBool(5, isDefinition_);\n        }\n        extensionWriter.writeUntil(536870912, output);\n        unknownFields.writeTo(output);\n      }\n\n      public int getSerializedSize() {\n        int size = memoizedSize;\n        if (size != -1) return size;\n\n        size = 0;\n        if (((bitField0_ & 0x00000001) == 0x00000001)) {\n          size += com.google.protobuf.GeneratedMessageV3.computeStringSize(1, version_);\n        }\n        for (int i = 0; i < metrics_.size(); i++) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(2, metrics_.get(i));\n        }\n        for (int i = 0; i < parameters_.size(); i++) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(3, parameters_.get(i));\n        }\n        if (((bitField0_ & 0x00000002) == 0x00000002)) {\n          size += 
com.google.protobuf.GeneratedMessageV3.computeStringSize(4, templateRef_);\n        }\n        if (((bitField0_ & 0x00000004) == 0x00000004)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeBoolSize(5, isDefinition_);\n        }\n        size += extensionsSerializedSize();\n        size += unknownFields.getSerializedSize();\n        memoizedSize = size;\n        return size;\n      }\n\n      private static final long serialVersionUID = 0L;\n      @java.lang.Override\n      public boolean equals(final java.lang.Object obj) {\n        if (obj == this) {\n         return true;\n        }\n        if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template)) {\n          return super.equals(obj);\n        }\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) obj;\n\n        boolean result = true;\n        result = result && (hasVersion() == other.hasVersion());\n        if (hasVersion()) {\n          result = result && getVersion()\n              .equals(other.getVersion());\n        }\n        result = result && getMetricsList()\n            .equals(other.getMetricsList());\n        result = result && getParametersList()\n            .equals(other.getParametersList());\n        result = result && (hasTemplateRef() == other.hasTemplateRef());\n        if (hasTemplateRef()) {\n          result = result && getTemplateRef()\n              .equals(other.getTemplateRef());\n        }\n        result = result && (hasIsDefinition() == other.hasIsDefinition());\n        if (hasIsDefinition()) {\n          result = result && (getIsDefinition()\n              == other.getIsDefinition());\n        }\n        result = result && unknownFields.equals(other.unknownFields);\n        result = result &&\n            getExtensionFields().equals(other.getExtensionFields());\n        return result;\n      }\n\n      @java.lang.Override\n      public int 
hashCode() {\n        if (memoizedHashCode != 0) {\n          return memoizedHashCode;\n        }\n        int hash = 41;\n        hash = (19 * hash) + getDescriptorForType().hashCode();\n        if (hasVersion()) {\n          hash = (37 * hash) + VERSION_FIELD_NUMBER;\n          hash = (53 * hash) + getVersion().hashCode();\n        }\n        if (getMetricsCount() > 0) {\n          hash = (37 * hash) + METRICS_FIELD_NUMBER;\n          hash = (53 * hash) + getMetricsList().hashCode();\n        }\n        if (getParametersCount() > 0) {\n          hash = (37 * hash) + PARAMETERS_FIELD_NUMBER;\n          hash = (53 * hash) + getParametersList().hashCode();\n        }\n        if (hasTemplateRef()) {\n          hash = (37 * hash) + TEMPLATE_REF_FIELD_NUMBER;\n          hash = (53 * hash) + getTemplateRef().hashCode();\n        }\n        if (hasIsDefinition()) {\n          hash = (37 * hash) + IS_DEFINITION_FIELD_NUMBER;\n          hash = (53 * hash) + com.google.protobuf.Internal.hashBoolean(\n              getIsDefinition());\n        }\n        hash = hashFields(hash, getExtensionFields());\n        hash = (29 * hash) + unknownFields.hashCode();\n        memoizedHashCode = hash;\n        return hash;\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template parseFrom(\n          com.google.protobuf.ByteString data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template parseFrom(\n          com.google.protobuf.ByteString data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template parseFrom(byte[] data)\n          throws 
com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template parseFrom(\n          byte[] data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template parseFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template parseFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template parseDelimitedFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template parseDelimitedFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template parseFrom(\n          com.google.protobuf.CodedInputStream input)\n          throws java.io.IOException {\n        return 
com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template parseFrom(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n\n      public Builder newBuilderForType() { return newBuilder(); }\n      public static Builder newBuilder() {\n        return DEFAULT_INSTANCE.toBuilder();\n      }\n      public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template prototype) {\n        return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n      }\n      public Builder toBuilder() {\n        return this == DEFAULT_INSTANCE\n            ? new Builder() : new Builder().mergeFrom(this);\n      }\n\n      @java.lang.Override\n      protected Builder newBuilderForType(\n          com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n        Builder builder = new Builder(parent);\n        return builder;\n      }\n      /**\n       * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.Template}\n       */\n      public static final class Builder extends\n          com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template, Builder> implements\n          // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.Template)\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.TemplateOrBuilder {\n        public static final com.google.protobuf.Descriptors.Descriptor\n            getDescriptor() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_descriptor;\n        }\n\n        protected 
com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n            internalGetFieldAccessorTable() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_fieldAccessorTable\n              .ensureFieldAccessorsInitialized(\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Builder.class);\n        }\n\n        // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.newBuilder()\n        private Builder() {\n          maybeForceBuilderInitialization();\n        }\n\n        private Builder(\n            com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n          super(parent);\n          maybeForceBuilderInitialization();\n        }\n        private void maybeForceBuilderInitialization() {\n          if (com.google.protobuf.GeneratedMessageV3\n                  .alwaysUseFieldBuilders) {\n            getMetricsFieldBuilder();\n            getParametersFieldBuilder();\n          }\n        }\n        public Builder clear() {\n          super.clear();\n          version_ = \"\";\n          bitField0_ = (bitField0_ & ~0x00000001);\n          if (metricsBuilder_ == null) {\n            metrics_ = java.util.Collections.emptyList();\n            bitField0_ = (bitField0_ & ~0x00000002);\n          } else {\n            metricsBuilder_.clear();\n          }\n          if (parametersBuilder_ == null) {\n            parameters_ = java.util.Collections.emptyList();\n            bitField0_ = (bitField0_ & ~0x00000004);\n          } else {\n            parametersBuilder_.clear();\n          }\n          templateRef_ = \"\";\n          bitField0_ = (bitField0_ & ~0x00000008);\n          isDefinition_ = false;\n          bitField0_ = (bitField0_ & ~0x00000010);\n          return this;\n        }\n\n        public com.google.protobuf.Descriptors.Descriptor\n            
getDescriptorForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Template_descriptor;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template getDefaultInstanceForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.getDefaultInstance();\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template build() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template result = buildPartial();\n          if (!result.isInitialized()) {\n            throw newUninitializedMessageException(result);\n          }\n          return result;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template buildPartial() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template(this);\n          int from_bitField0_ = bitField0_;\n          int to_bitField0_ = 0;\n          if (((from_bitField0_ & 0x00000001) == 0x00000001)) {\n            to_bitField0_ |= 0x00000001;\n          }\n          result.version_ = version_;\n          if (metricsBuilder_ == null) {\n            if (((bitField0_ & 0x00000002) == 0x00000002)) {\n              metrics_ = java.util.Collections.unmodifiableList(metrics_);\n              bitField0_ = (bitField0_ & ~0x00000002);\n            }\n            result.metrics_ = metrics_;\n          } else {\n            result.metrics_ = metricsBuilder_.build();\n          }\n          if (parametersBuilder_ == null) {\n            if (((bitField0_ & 0x00000004) == 0x00000004)) {\n              parameters_ = java.util.Collections.unmodifiableList(parameters_);\n              bitField0_ = (bitField0_ & ~0x00000004);\n            }\n            result.parameters_ = parameters_;\n          } else {\n            result.parameters_ = parametersBuilder_.build();\n          }\n          if 
(((from_bitField0_ & 0x00000008) == 0x00000008)) {\n            to_bitField0_ |= 0x00000002;\n          }\n          result.templateRef_ = templateRef_;\n          if (((from_bitField0_ & 0x00000010) == 0x00000010)) {\n            to_bitField0_ |= 0x00000004;\n          }\n          result.isDefinition_ = isDefinition_;\n          result.bitField0_ = to_bitField0_;\n          onBuilt();\n          return result;\n        }\n\n        public Builder clone() {\n          return (Builder) super.clone();\n        }\n        public Builder setField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.setField(field, value);\n        }\n        public Builder clearField(\n            com.google.protobuf.Descriptors.FieldDescriptor field) {\n          return (Builder) super.clearField(field);\n        }\n        public Builder clearOneof(\n            com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n          return (Builder) super.clearOneof(oneof);\n        }\n        public Builder setRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            int index, Object value) {\n          return (Builder) super.setRepeatedField(field, index, value);\n        }\n        public Builder addRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.addRepeatedField(field, value);\n        }\n        public <Type> Builder setExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template, Type> extension,\n            Type value) {\n          return (Builder) super.setExtension(extension, value);\n        }\n        public <Type> Builder setExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template, java.util.List<Type>> extension,\n            int index, Type value) {\n          return (Builder) super.setExtension(extension, index, value);\n        }\n        public <Type> Builder addExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template, java.util.List<Type>> extension,\n            Type value) {\n          return (Builder) super.addExtension(extension, value);\n        }\n        public <Type> Builder clearExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template, ?> extension) {\n          return (Builder) super.clearExtension(extension);\n        }\n        public Builder mergeFrom(com.google.protobuf.Message other) {\n          if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) {\n            return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template)other);\n          } else {\n            super.mergeFrom(other);\n            return this;\n          }\n        }\n\n        public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template other) {\n          if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.getDefaultInstance()) return this;\n          if (other.hasVersion()) {\n            bitField0_ |= 0x00000001;\n            version_ = other.version_;\n            onChanged();\n          }\n          if (metricsBuilder_ == null) {\n            if (!other.metrics_.isEmpty()) {\n              if (metrics_.isEmpty()) {\n                metrics_ = other.metrics_;\n                bitField0_ = (bitField0_ & ~0x00000002);\n              } else {\n                ensureMetricsIsMutable();\n                metrics_.addAll(other.metrics_);\n              }\n              onChanged();\n            }\n          } else 
{\n            if (!other.metrics_.isEmpty()) {\n              if (metricsBuilder_.isEmpty()) {\n                metricsBuilder_.dispose();\n                metricsBuilder_ = null;\n                metrics_ = other.metrics_;\n                bitField0_ = (bitField0_ & ~0x00000002);\n                metricsBuilder_ = \n                  com.google.protobuf.GeneratedMessageV3.alwaysUseFieldBuilders ?\n                     getMetricsFieldBuilder() : null;\n              } else {\n                metricsBuilder_.addAllMessages(other.metrics_);\n              }\n            }\n          }\n          if (parametersBuilder_ == null) {\n            if (!other.parameters_.isEmpty()) {\n              if (parameters_.isEmpty()) {\n                parameters_ = other.parameters_;\n                bitField0_ = (bitField0_ & ~0x00000004);\n              } else {\n                ensureParametersIsMutable();\n                parameters_.addAll(other.parameters_);\n              }\n              onChanged();\n            }\n          } else {\n            if (!other.parameters_.isEmpty()) {\n              if (parametersBuilder_.isEmpty()) {\n                parametersBuilder_.dispose();\n                parametersBuilder_ = null;\n                parameters_ = other.parameters_;\n                bitField0_ = (bitField0_ & ~0x00000004);\n                parametersBuilder_ = \n                  com.google.protobuf.GeneratedMessageV3.alwaysUseFieldBuilders ?\n                     getParametersFieldBuilder() : null;\n              } else {\n                parametersBuilder_.addAllMessages(other.parameters_);\n              }\n            }\n          }\n          if (other.hasTemplateRef()) {\n            bitField0_ |= 0x00000008;\n            templateRef_ = other.templateRef_;\n            onChanged();\n          }\n          if (other.hasIsDefinition()) {\n            setIsDefinition(other.getIsDefinition());\n          }\n          this.mergeExtensionFields(other);\n          
this.mergeUnknownFields(other.unknownFields);\n          onChanged();\n          return this;\n        }\n\n        public final boolean isInitialized() {\n          for (int i = 0; i < getMetricsCount(); i++) {\n            if (!getMetrics(i).isInitialized()) {\n              return false;\n            }\n          }\n          for (int i = 0; i < getParametersCount(); i++) {\n            if (!getParameters(i).isInitialized()) {\n              return false;\n            }\n          }\n          if (!extensionsAreInitialized()) {\n            return false;\n          }\n          return true;\n        }\n\n        public Builder mergeFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template parsedMessage = null;\n          try {\n            parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n          } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n            parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) e.getUnfinishedMessage();\n            throw e.unwrapIOException();\n          } finally {\n            if (parsedMessage != null) {\n              mergeFrom(parsedMessage);\n            }\n          }\n          return this;\n        }\n        private int bitField0_;\n\n        private java.lang.Object version_ = \"\";\n        /**\n         * <pre>\n         * The version of the Template to prevent mismatches\n         * </pre>\n         *\n         * <code>optional string version = 1;</code>\n         */\n        public boolean hasVersion() {\n          return ((bitField0_ & 0x00000001) == 0x00000001);\n        }\n        /**\n         * <pre>\n         * The version of the Template to prevent mismatches\n         * </pre>\n         *\n         * <code>optional string version = 1;</code>\n         */\n      
  public java.lang.String getVersion() {\n          java.lang.Object ref = version_;\n          if (!(ref instanceof java.lang.String)) {\n            com.google.protobuf.ByteString bs =\n                (com.google.protobuf.ByteString) ref;\n            java.lang.String s = bs.toStringUtf8();\n            if (bs.isValidUtf8()) {\n              version_ = s;\n            }\n            return s;\n          } else {\n            return (java.lang.String) ref;\n          }\n        }\n        /**\n         * <pre>\n         * The version of the Template to prevent mismatches\n         * </pre>\n         *\n         * <code>optional string version = 1;</code>\n         */\n        public com.google.protobuf.ByteString\n            getVersionBytes() {\n          java.lang.Object ref = version_;\n          if (ref instanceof String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n                    (java.lang.String) ref);\n            version_ = b;\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n        /**\n         * <pre>\n         * The version of the Template to prevent mismatches\n         * </pre>\n         *\n         * <code>optional string version = 1;</code>\n         */\n        public Builder setVersion(\n            java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000001;\n          version_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * The version of the Template to prevent mismatches\n         * </pre>\n         *\n         * <code>optional string version = 1;</code>\n         */\n        public Builder clearVersion() {\n          bitField0_ = (bitField0_ & ~0x00000001);\n          version_ = getDefaultInstance().getVersion();\n          onChanged();\n          return 
this;\n        }\n        /**\n         * <pre>\n         * The version of the Template to prevent mismatches\n         * </pre>\n         *\n         * <code>optional string version = 1;</code>\n         */\n        public Builder setVersionBytes(\n            com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000001;\n          version_ = value;\n          onChanged();\n          return this;\n        }\n\n        private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric> metrics_ =\n          java.util.Collections.emptyList();\n        private void ensureMetricsIsMutable() {\n          if (!((bitField0_ & 0x00000002) == 0x00000002)) {\n            metrics_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric>(metrics_);\n            bitField0_ |= 0x00000002;\n           }\n        }\n\n        private com.google.protobuf.RepeatedFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder> metricsBuilder_;\n\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric> getMetricsList() {\n          if (metricsBuilder_ == null) {\n            return java.util.Collections.unmodifiableList(metrics_);\n          } else {\n            return metricsBuilder_.getMessageList();\n          }\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 
2;</code>\n         */\n        public int getMetricsCount() {\n          if (metricsBuilder_ == null) {\n            return metrics_.size();\n          } else {\n            return metricsBuilder_.getCount();\n          }\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric getMetrics(int index) {\n          if (metricsBuilder_ == null) {\n            return metrics_.get(index);\n          } else {\n            return metricsBuilder_.getMessage(index);\n          }\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public Builder setMetrics(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric value) {\n          if (metricsBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensureMetricsIsMutable();\n            metrics_.set(index, value);\n            onChanged();\n          } else {\n            metricsBuilder_.setMessage(index, value);\n          }\n          return this;\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public Builder setMetrics(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder builderForValue) {\n          if (metricsBuilder_ == null) {\n            ensureMetricsIsMutable();\n            metrics_.set(index, 
builderForValue.build());\n            onChanged();\n          } else {\n            metricsBuilder_.setMessage(index, builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public Builder addMetrics(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric value) {\n          if (metricsBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensureMetricsIsMutable();\n            metrics_.add(value);\n            onChanged();\n          } else {\n            metricsBuilder_.addMessage(value);\n          }\n          return this;\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public Builder addMetrics(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric value) {\n          if (metricsBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensureMetricsIsMutable();\n            metrics_.add(index, value);\n            onChanged();\n          } else {\n            metricsBuilder_.addMessage(index, value);\n          }\n          return this;\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public Builder addMetrics(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder builderForValue) {\n        
  if (metricsBuilder_ == null) {\n            ensureMetricsIsMutable();\n            metrics_.add(builderForValue.build());\n            onChanged();\n          } else {\n            metricsBuilder_.addMessage(builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public Builder addMetrics(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder builderForValue) {\n          if (metricsBuilder_ == null) {\n            ensureMetricsIsMutable();\n            metrics_.add(index, builderForValue.build());\n            onChanged();\n          } else {\n            metricsBuilder_.addMessage(index, builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public Builder addAllMetrics(\n            java.lang.Iterable<? 
extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric> values) {\n          if (metricsBuilder_ == null) {\n            ensureMetricsIsMutable();\n            com.google.protobuf.AbstractMessageLite.Builder.addAll(\n                values, metrics_);\n            onChanged();\n          } else {\n            metricsBuilder_.addAllMessages(values);\n          }\n          return this;\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public Builder clearMetrics() {\n          if (metricsBuilder_ == null) {\n            metrics_ = java.util.Collections.emptyList();\n            bitField0_ = (bitField0_ & ~0x00000002);\n            onChanged();\n          } else {\n            metricsBuilder_.clear();\n          }\n          return this;\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public Builder removeMetrics(int index) {\n          if (metricsBuilder_ == null) {\n            ensureMetricsIsMutable();\n            metrics_.remove(index);\n            onChanged();\n          } else {\n            metricsBuilder_.remove(index);\n          }\n          return this;\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder getMetricsBuilder(\n            int index) {\n          return getMetricsFieldBuilder().getBuilder(index);\n        }\n        /**\n         * <pre>\n         * 
Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder getMetricsOrBuilder(\n            int index) {\n          if (metricsBuilder_ == null) {\n            return metrics_.get(index);  } else {\n            return metricsBuilder_.getMessageOrBuilder(index);\n          }\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder> \n             getMetricsOrBuilderList() {\n          if (metricsBuilder_ != null) {\n            return metricsBuilder_.getMessageOrBuilderList();\n          } else {\n            return java.util.Collections.unmodifiableList(metrics_);\n          }\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder addMetricsBuilder() {\n          return getMetricsFieldBuilder().addBuilder(\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.getDefaultInstance());\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder addMetricsBuilder(\n            int index) {\n          return 
getMetricsFieldBuilder().addBuilder(\n              index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.getDefaultInstance());\n        }\n        /**\n         * <pre>\n         * Each metric includes a name, datatype, and optionally a value\n         * </pre>\n         *\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n         */\n        public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder> \n             getMetricsBuilderList() {\n          return getMetricsFieldBuilder().getBuilderList();\n        }\n        private com.google.protobuf.RepeatedFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder> \n            getMetricsFieldBuilder() {\n          if (metricsBuilder_ == null) {\n            metricsBuilder_ = new com.google.protobuf.RepeatedFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder>(\n                    metrics_,\n                    ((bitField0_ & 0x00000002) == 0x00000002),\n                    getParentForChildren(),\n                    isClean());\n            metrics_ = null;\n          }\n          return metricsBuilder_;\n        }\n\n        private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter> parameters_ =\n          java.util.Collections.emptyList();\n        private void ensureParametersIsMutable() {\n          if (!((bitField0_ & 0x00000004) == 0x00000004)) {\n            parameters_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter>(parameters_);\n            bitField0_ |= 0x00000004;\n           }\n        }\n\n        private 
com.google.protobuf.RepeatedFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.ParameterOrBuilder> parametersBuilder_;\n\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter> getParametersList() {\n          if (parametersBuilder_ == null) {\n            return java.util.Collections.unmodifiableList(parameters_);\n          } else {\n            return parametersBuilder_.getMessageList();\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public int getParametersCount() {\n          if (parametersBuilder_ == null) {\n            return parameters_.size();\n          } else {\n            return parametersBuilder_.getCount();\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter getParameters(int index) {\n          if (parametersBuilder_ == null) {\n            return parameters_.get(index);\n          } else {\n            return parametersBuilder_.getMessage(index);\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public Builder setParameters(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter value) {\n          if (parametersBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            
ensureParametersIsMutable();\n            parameters_.set(index, value);\n            onChanged();\n          } else {\n            parametersBuilder_.setMessage(index, value);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public Builder setParameters(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.Builder builderForValue) {\n          if (parametersBuilder_ == null) {\n            ensureParametersIsMutable();\n            parameters_.set(index, builderForValue.build());\n            onChanged();\n          } else {\n            parametersBuilder_.setMessage(index, builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public Builder addParameters(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter value) {\n          if (parametersBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensureParametersIsMutable();\n            parameters_.add(value);\n            onChanged();\n          } else {\n            parametersBuilder_.addMessage(value);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public Builder addParameters(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter value) {\n          if (parametersBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensureParametersIsMutable();\n            parameters_.add(index, value);\n            onChanged();\n          } else {\n       
     parametersBuilder_.addMessage(index, value);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public Builder addParameters(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.Builder builderForValue) {\n          if (parametersBuilder_ == null) {\n            ensureParametersIsMutable();\n            parameters_.add(builderForValue.build());\n            onChanged();\n          } else {\n            parametersBuilder_.addMessage(builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public Builder addParameters(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.Builder builderForValue) {\n          if (parametersBuilder_ == null) {\n            ensureParametersIsMutable();\n            parameters_.add(index, builderForValue.build());\n            onChanged();\n          } else {\n            parametersBuilder_.addMessage(index, builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public Builder addAllParameters(\n            java.lang.Iterable<? 
extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter> values) {\n          if (parametersBuilder_ == null) {\n            ensureParametersIsMutable();\n            com.google.protobuf.AbstractMessageLite.Builder.addAll(\n                values, parameters_);\n            onChanged();\n          } else {\n            parametersBuilder_.addAllMessages(values);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public Builder clearParameters() {\n          if (parametersBuilder_ == null) {\n            parameters_ = java.util.Collections.emptyList();\n            bitField0_ = (bitField0_ & ~0x00000004);\n            onChanged();\n          } else {\n            parametersBuilder_.clear();\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public Builder removeParameters(int index) {\n          if (parametersBuilder_ == null) {\n            ensureParametersIsMutable();\n            parameters_.remove(index);\n            onChanged();\n          } else {\n            parametersBuilder_.remove(index);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.Builder getParametersBuilder(\n            int index) {\n          return getParametersFieldBuilder().getBuilder(index);\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.ParameterOrBuilder getParametersOrBuilder(\n            int index) {\n          if 
(parametersBuilder_ == null) {\n            return parameters_.get(index);  } else {\n            return parametersBuilder_.getMessageOrBuilder(index);\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.ParameterOrBuilder> \n             getParametersOrBuilderList() {\n          if (parametersBuilder_ != null) {\n            return parametersBuilder_.getMessageOrBuilderList();\n          } else {\n            return java.util.Collections.unmodifiableList(parameters_);\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.Builder addParametersBuilder() {\n          return getParametersFieldBuilder().addBuilder(\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.getDefaultInstance());\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.Builder addParametersBuilder(\n            int index) {\n          return getParametersFieldBuilder().addBuilder(\n              index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.getDefaultInstance());\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.Template.Parameter parameters = 3;</code>\n         */\n        public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.Builder> \n             getParametersBuilderList() {\n          return getParametersFieldBuilder().getBuilderList();\n        }\n        private com.google.protobuf.RepeatedFieldBuilderV3<\n  
          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.ParameterOrBuilder> \n            getParametersFieldBuilder() {\n          if (parametersBuilder_ == null) {\n            parametersBuilder_ = new com.google.protobuf.RepeatedFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Parameter.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.ParameterOrBuilder>(\n                    parameters_,\n                    ((bitField0_ & 0x00000004) == 0x00000004),\n                    getParentForChildren(),\n                    isClean());\n            parameters_ = null;\n          }\n          return parametersBuilder_;\n        }\n\n        private java.lang.Object templateRef_ = \"\";\n        /**\n         * <pre>\n         * Reference to a template if this is extending a Template or an instance - must exist if an instance\n         * </pre>\n         *\n         * <code>optional string template_ref = 4;</code>\n         */\n        public boolean hasTemplateRef() {\n          return ((bitField0_ & 0x00000008) == 0x00000008);\n        }\n        /**\n         * <pre>\n         * Reference to a template if this is extending a Template or an instance - must exist if an instance\n         * </pre>\n         *\n         * <code>optional string template_ref = 4;</code>\n         */\n        public java.lang.String getTemplateRef() {\n          java.lang.Object ref = templateRef_;\n          if (!(ref instanceof java.lang.String)) {\n            com.google.protobuf.ByteString bs =\n                (com.google.protobuf.ByteString) ref;\n            java.lang.String s = bs.toStringUtf8();\n            if (bs.isValidUtf8()) {\n              templateRef_ = s;\n            }\n            
return s;\n          } else {\n            return (java.lang.String) ref;\n          }\n        }\n        /**\n         * <pre>\n         * Reference to a template if this is extending a Template or an instance - must exist if an instance\n         * </pre>\n         *\n         * <code>optional string template_ref = 4;</code>\n         */\n        public com.google.protobuf.ByteString\n            getTemplateRefBytes() {\n          java.lang.Object ref = templateRef_;\n          if (ref instanceof String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n                    (java.lang.String) ref);\n            templateRef_ = b;\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n        /**\n         * <pre>\n         * Reference to a template if this is extending a Template or an instance - must exist if an instance\n         * </pre>\n         *\n         * <code>optional string template_ref = 4;</code>\n         */\n        public Builder setTemplateRef(\n            java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000008;\n          templateRef_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Reference to a template if this is extending a Template or an instance - must exist if an instance\n         * </pre>\n         *\n         * <code>optional string template_ref = 4;</code>\n         */\n        public Builder clearTemplateRef() {\n          bitField0_ = (bitField0_ & ~0x00000008);\n          templateRef_ = getDefaultInstance().getTemplateRef();\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Reference to a template if this is extending a Template or an instance - must exist if an instance\n         * </pre>\n         *\n         * 
<code>optional string template_ref = 4;</code>\n         */\n        public Builder setTemplateRefBytes(\n            com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000008;\n          templateRef_ = value;\n          onChanged();\n          return this;\n        }\n\n        private boolean isDefinition_ ;\n        /**\n         * <code>optional bool is_definition = 5;</code>\n         */\n        public boolean hasIsDefinition() {\n          return ((bitField0_ & 0x00000010) == 0x00000010);\n        }\n        /**\n         * <code>optional bool is_definition = 5;</code>\n         */\n        public boolean getIsDefinition() {\n          return isDefinition_;\n        }\n        /**\n         * <code>optional bool is_definition = 5;</code>\n         */\n        public Builder setIsDefinition(boolean value) {\n          bitField0_ |= 0x00000010;\n          isDefinition_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional bool is_definition = 5;</code>\n         */\n        public Builder clearIsDefinition() {\n          bitField0_ = (bitField0_ & ~0x00000010);\n          isDefinition_ = false;\n          onChanged();\n          return this;\n        }\n        public final Builder setUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.setUnknownFields(unknownFields);\n        }\n\n        public final Builder mergeUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.mergeUnknownFields(unknownFields);\n        }\n\n\n        // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.Template)\n      }\n\n      // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.Template)\n      private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template 
DEFAULT_INSTANCE;\n      static {\n        DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template();\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template getDefaultInstance() {\n        return DEFAULT_INSTANCE;\n      }\n\n      @java.lang.Deprecated public static final com.google.protobuf.Parser<Template>\n          PARSER = new com.google.protobuf.AbstractParser<Template>() {\n        public Template parsePartialFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n            return new Template(input, extensionRegistry);\n        }\n      };\n\n      public static com.google.protobuf.Parser<Template> parser() {\n        return PARSER;\n      }\n\n      @java.lang.Override\n      public com.google.protobuf.Parser<Template> getParserForType() {\n        return PARSER;\n      }\n\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template getDefaultInstanceForType() {\n        return DEFAULT_INSTANCE;\n      }\n\n    }\n\n    public interface DataSetOrBuilder extends\n        // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.DataSet)\n        com.google.protobuf.GeneratedMessageV3.\n            ExtendableMessageOrBuilder<DataSet> {\n\n      /**\n       * <code>optional uint64 num_of_columns = 1;</code>\n       */\n      boolean hasNumOfColumns();\n      /**\n       * <code>optional uint64 num_of_columns = 1;</code>\n       */\n      long getNumOfColumns();\n\n      /**\n       * <code>repeated string columns = 2;</code>\n       */\n      java.util.List<java.lang.String>\n          getColumnsList();\n      /**\n       * <code>repeated string columns = 2;</code>\n       */\n      int getColumnsCount();\n      /**\n       * <code>repeated string columns = 2;</code>\n       */\n      java.lang.String 
getColumns(int index);\n      /**\n       * <code>repeated string columns = 2;</code>\n       */\n      com.google.protobuf.ByteString\n          getColumnsBytes(int index);\n\n      /**\n       * <code>repeated uint32 types = 3;</code>\n       */\n      java.util.List<java.lang.Integer> getTypesList();\n      /**\n       * <code>repeated uint32 types = 3;</code>\n       */\n      int getTypesCount();\n      /**\n       * <code>repeated uint32 types = 3;</code>\n       */\n      int getTypes(int index);\n\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n       */\n      java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row> \n          getRowsList();\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row getRows(int index);\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n       */\n      int getRowsCount();\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n       */\n      java.util.List<? 
extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.RowOrBuilder> \n          getRowsOrBuilderList();\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.RowOrBuilder getRowsOrBuilder(\n          int index);\n    }\n    /**\n     * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.DataSet}\n     */\n    public  static final class DataSet extends\n        com.google.protobuf.GeneratedMessageV3.ExtendableMessage<\n          DataSet> implements\n        // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.DataSet)\n        DataSetOrBuilder {\n      // Use DataSet.newBuilder() to construct.\n      private DataSet(com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet, ?> builder) {\n        super(builder);\n      }\n      private DataSet() {\n        numOfColumns_ = 0L;\n        columns_ = com.google.protobuf.LazyStringArrayList.EMPTY;\n        types_ = java.util.Collections.emptyList();\n        rows_ = java.util.Collections.emptyList();\n      }\n\n      @java.lang.Override\n      public final com.google.protobuf.UnknownFieldSet\n      getUnknownFields() {\n        return this.unknownFields;\n      }\n      private DataSet(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        this();\n        int mutable_bitField0_ = 0;\n        com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n            com.google.protobuf.UnknownFieldSet.newBuilder();\n        try {\n          boolean done = false;\n          while (!done) {\n            int tag = input.readTag();\n            switch (tag) {\n              case 0:\n                done = true;\n                break;\n              default: 
{\n                if (!parseUnknownField(input, unknownFields,\n                                       extensionRegistry, tag)) {\n                  done = true;\n                }\n                break;\n              }\n              case 8: {\n                bitField0_ |= 0x00000001;\n                numOfColumns_ = input.readUInt64();\n                break;\n              }\n              case 18: {\n                com.google.protobuf.ByteString bs = input.readBytes();\n                if (!((mutable_bitField0_ & 0x00000002) == 0x00000002)) {\n                  columns_ = new com.google.protobuf.LazyStringArrayList();\n                  mutable_bitField0_ |= 0x00000002;\n                }\n                columns_.add(bs);\n                break;\n              }\n              case 24: {\n                if (!((mutable_bitField0_ & 0x00000004) == 0x00000004)) {\n                  types_ = new java.util.ArrayList<java.lang.Integer>();\n                  mutable_bitField0_ |= 0x00000004;\n                }\n                types_.add(input.readUInt32());\n                break;\n              }\n              case 26: {\n                int length = input.readRawVarint32();\n                int limit = input.pushLimit(length);\n                if (!((mutable_bitField0_ & 0x00000004) == 0x00000004) && input.getBytesUntilLimit() > 0) {\n                  types_ = new java.util.ArrayList<java.lang.Integer>();\n                  mutable_bitField0_ |= 0x00000004;\n                }\n                while (input.getBytesUntilLimit() > 0) {\n                  types_.add(input.readUInt32());\n                }\n                input.popLimit(limit);\n                break;\n              }\n              case 34: {\n                if (!((mutable_bitField0_ & 0x00000008) == 0x00000008)) {\n                  rows_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row>();\n                  mutable_bitField0_ |= 0x00000008;\n         
       }\n                rows_.add(\n                    input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.PARSER, extensionRegistry));\n                break;\n              }\n            }\n          }\n        } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n          throw e.setUnfinishedMessage(this);\n        } catch (java.io.IOException e) {\n          throw new com.google.protobuf.InvalidProtocolBufferException(\n              e).setUnfinishedMessage(this);\n        } finally {\n          if (((mutable_bitField0_ & 0x00000002) == 0x00000002)) {\n            columns_ = columns_.getUnmodifiableView();\n          }\n          if (((mutable_bitField0_ & 0x00000004) == 0x00000004)) {\n            types_ = java.util.Collections.unmodifiableList(types_);\n          }\n          if (((mutable_bitField0_ & 0x00000008) == 0x00000008)) {\n            rows_ = java.util.Collections.unmodifiableList(rows_);\n          }\n          this.unknownFields = unknownFields.build();\n          makeExtensionsImmutable();\n        }\n      }\n      public static final com.google.protobuf.Descriptors.Descriptor\n          getDescriptor() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_descriptor;\n      }\n\n      protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n          internalGetFieldAccessorTable() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_fieldAccessorTable\n            .ensureFieldAccessorsInitialized(\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Builder.class);\n      }\n\n      public interface DataSetValueOrBuilder extends\n          // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue)\n          
com.google.protobuf.MessageOrBuilder {\n\n        /**\n         * <code>optional uint32 int_value = 1;</code>\n         */\n        boolean hasIntValue();\n        /**\n         * <code>optional uint32 int_value = 1;</code>\n         */\n        int getIntValue();\n\n        /**\n         * <code>optional uint64 long_value = 2;</code>\n         */\n        boolean hasLongValue();\n        /**\n         * <code>optional uint64 long_value = 2;</code>\n         */\n        long getLongValue();\n\n        /**\n         * <code>optional float float_value = 3;</code>\n         */\n        boolean hasFloatValue();\n        /**\n         * <code>optional float float_value = 3;</code>\n         */\n        float getFloatValue();\n\n        /**\n         * <code>optional double double_value = 4;</code>\n         */\n        boolean hasDoubleValue();\n        /**\n         * <code>optional double double_value = 4;</code>\n         */\n        double getDoubleValue();\n\n        /**\n         * <code>optional bool boolean_value = 5;</code>\n         */\n        boolean hasBooleanValue();\n        /**\n         * <code>optional bool boolean_value = 5;</code>\n         */\n        boolean getBooleanValue();\n\n        /**\n         * <code>optional string string_value = 6;</code>\n         */\n        boolean hasStringValue();\n        /**\n         * <code>optional string string_value = 6;</code>\n         */\n        java.lang.String getStringValue();\n        /**\n         * <code>optional string string_value = 6;</code>\n         */\n        com.google.protobuf.ByteString\n            getStringValueBytes();\n\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n         */\n        boolean hasExtensionValue();\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n         */\n        
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension getExtensionValue();\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n         */\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtensionOrBuilder getExtensionValueOrBuilder();\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.ValueCase getValueCase();\n      }\n      /**\n       * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue}\n       */\n      public  static final class DataSetValue extends\n          com.google.protobuf.GeneratedMessageV3 implements\n          // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue)\n          DataSetValueOrBuilder {\n        // Use DataSetValue.newBuilder() to construct.\n        private DataSetValue(com.google.protobuf.GeneratedMessageV3.Builder<?> builder) {\n          super(builder);\n        }\n        private DataSetValue() {\n        }\n\n        @java.lang.Override\n        public final com.google.protobuf.UnknownFieldSet\n        getUnknownFields() {\n          return this.unknownFields;\n        }\n        private DataSetValue(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          this();\n          int mutable_bitField0_ = 0;\n          com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n              com.google.protobuf.UnknownFieldSet.newBuilder();\n          try {\n            boolean done = false;\n            while (!done) {\n              int tag = input.readTag();\n              switch (tag) {\n                case 0:\n                  done = true;\n                  break;\n                
default: {\n                  if (!parseUnknownField(input, unknownFields,\n                                         extensionRegistry, tag)) {\n                    done = true;\n                  }\n                  break;\n                }\n                case 8: {\n                  valueCase_ = 1;\n                  value_ = input.readUInt32();\n                  break;\n                }\n                case 16: {\n                  valueCase_ = 2;\n                  value_ = input.readUInt64();\n                  break;\n                }\n                case 29: {\n                  valueCase_ = 3;\n                  value_ = input.readFloat();\n                  break;\n                }\n                case 33: {\n                  valueCase_ = 4;\n                  value_ = input.readDouble();\n                  break;\n                }\n                case 40: {\n                  valueCase_ = 5;\n                  value_ = input.readBool();\n                  break;\n                }\n                case 50: {\n                  com.google.protobuf.ByteString bs = input.readBytes();\n                  valueCase_ = 6;\n                  value_ = bs;\n                  break;\n                }\n                case 58: {\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.Builder subBuilder = null;\n                  if (valueCase_ == 7) {\n                    subBuilder = ((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) value_).toBuilder();\n                  }\n                  value_ =\n                      input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.PARSER, extensionRegistry);\n                  if (subBuilder != null) {\n                    subBuilder.mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) value_);\n   
                 value_ = subBuilder.buildPartial();\n                  }\n                  valueCase_ = 7;\n                  break;\n                }\n              }\n            }\n          } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n            throw e.setUnfinishedMessage(this);\n          } catch (java.io.IOException e) {\n            throw new com.google.protobuf.InvalidProtocolBufferException(\n                e).setUnfinishedMessage(this);\n          } finally {\n            this.unknownFields = unknownFields.build();\n            makeExtensionsImmutable();\n          }\n        }\n        public static final com.google.protobuf.Descriptors.Descriptor\n            getDescriptor() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_descriptor;\n        }\n\n        protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n            internalGetFieldAccessorTable() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_fieldAccessorTable\n              .ensureFieldAccessorsInitialized(\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.Builder.class);\n        }\n\n        public interface DataSetValueExtensionOrBuilder extends\n            // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension)\n            com.google.protobuf.GeneratedMessageV3.\n                ExtendableMessageOrBuilder<DataSetValueExtension> {\n        }\n        /**\n         * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension}\n         */\n        public  static final class DataSetValueExtension extends\n            
com.google.protobuf.GeneratedMessageV3.ExtendableMessage<\n              DataSetValueExtension> implements\n            // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension)\n            DataSetValueExtensionOrBuilder {\n          // Use DataSetValueExtension.newBuilder() to construct.\n          private DataSetValueExtension(com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension, ?> builder) {\n            super(builder);\n          }\n          private DataSetValueExtension() {\n          }\n\n          @java.lang.Override\n          public final com.google.protobuf.UnknownFieldSet\n          getUnknownFields() {\n            return this.unknownFields;\n          }\n          private DataSetValueExtension(\n              com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n            this();\n            com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n                com.google.protobuf.UnknownFieldSet.newBuilder();\n            try {\n              boolean done = false;\n              while (!done) {\n                int tag = input.readTag();\n                switch (tag) {\n                  case 0:\n                    done = true;\n                    break;\n                  default: {\n                    if (!parseUnknownField(input, unknownFields,\n                                           extensionRegistry, tag)) {\n                      done = true;\n                    }\n                    break;\n                  }\n                }\n              }\n            } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n              throw e.setUnfinishedMessage(this);\n            } catch (java.io.IOException e) {\n     
         throw new com.google.protobuf.InvalidProtocolBufferException(\n                  e).setUnfinishedMessage(this);\n            } finally {\n              this.unknownFields = unknownFields.build();\n              makeExtensionsImmutable();\n            }\n          }\n          public static final com.google.protobuf.Descriptors.Descriptor\n              getDescriptor() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_descriptor;\n          }\n\n          protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n              internalGetFieldAccessorTable() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_fieldAccessorTable\n                .ensureFieldAccessorsInitialized(\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.Builder.class);\n          }\n\n          private byte memoizedIsInitialized = -1;\n          public final boolean isInitialized() {\n            byte isInitialized = memoizedIsInitialized;\n            if (isInitialized == 1) return true;\n            if (isInitialized == 0) return false;\n\n            if (!extensionsAreInitialized()) {\n              memoizedIsInitialized = 0;\n              return false;\n            }\n            memoizedIsInitialized = 1;\n            return true;\n          }\n\n          public void writeTo(com.google.protobuf.CodedOutputStream output)\n                              throws java.io.IOException {\n            com.google.protobuf.GeneratedMessageV3\n              .ExtendableMessage<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension>.ExtensionWriter\n                
extensionWriter = newExtensionWriter();\n            extensionWriter.writeUntil(536870912, output);\n            unknownFields.writeTo(output);\n          }\n\n          public int getSerializedSize() {\n            int size = memoizedSize;\n            if (size != -1) return size;\n\n            size = 0;\n            size += extensionsSerializedSize();\n            size += unknownFields.getSerializedSize();\n            memoizedSize = size;\n            return size;\n          }\n\n          private static final long serialVersionUID = 0L;\n          @java.lang.Override\n          public boolean equals(final java.lang.Object obj) {\n            if (obj == this) {\n             return true;\n            }\n            if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension)) {\n              return super.equals(obj);\n            }\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) obj;\n\n            boolean result = true;\n            result = result && unknownFields.equals(other.unknownFields);\n            result = result &&\n                getExtensionFields().equals(other.getExtensionFields());\n            return result;\n          }\n\n          @java.lang.Override\n          public int hashCode() {\n            if (memoizedHashCode != 0) {\n              return memoizedHashCode;\n            }\n            int hash = 41;\n            hash = (19 * hash) + getDescriptorForType().hashCode();\n            hash = hashFields(hash, getExtensionFields());\n            hash = (29 * hash) + unknownFields.hashCode();\n            memoizedHashCode = hash;\n            return hash;\n          }\n\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension parseFrom(\n              
com.google.protobuf.ByteString data)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n            return PARSER.parseFrom(data);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension parseFrom(\n              com.google.protobuf.ByteString data,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n            return PARSER.parseFrom(data, extensionRegistry);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension parseFrom(byte[] data)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n            return PARSER.parseFrom(data);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension parseFrom(\n              byte[] data,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n            return PARSER.parseFrom(data, extensionRegistry);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension parseFrom(java.io.InputStream input)\n              throws java.io.IOException {\n            return com.google.protobuf.GeneratedMessageV3\n                .parseWithIOException(PARSER, input);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension parseFrom(\n              java.io.InputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws java.io.IOException {\n            return com.google.protobuf.GeneratedMessageV3\n                .parseWithIOException(PARSER, input, extensionRegistry);\n          }\n  
        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension parseDelimitedFrom(java.io.InputStream input)\n              throws java.io.IOException {\n            return com.google.protobuf.GeneratedMessageV3\n                .parseDelimitedWithIOException(PARSER, input);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension parseDelimitedFrom(\n              java.io.InputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws java.io.IOException {\n            return com.google.protobuf.GeneratedMessageV3\n                .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension parseFrom(\n              com.google.protobuf.CodedInputStream input)\n              throws java.io.IOException {\n            return com.google.protobuf.GeneratedMessageV3\n                .parseWithIOException(PARSER, input);\n          }\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension parseFrom(\n              com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws java.io.IOException {\n            return com.google.protobuf.GeneratedMessageV3\n                .parseWithIOException(PARSER, input, extensionRegistry);\n          }\n\n          public Builder newBuilderForType() { return newBuilder(); }\n          public static Builder newBuilder() {\n            return DEFAULT_INSTANCE.toBuilder();\n          }\n          public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension prototype) {\n            return 
DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n          }\n          public Builder toBuilder() {\n            return this == DEFAULT_INSTANCE\n                ? new Builder() : new Builder().mergeFrom(this);\n          }\n\n          @java.lang.Override\n          protected Builder newBuilderForType(\n              com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n            Builder builder = new Builder(parent);\n            return builder;\n          }\n          /**\n           * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension}\n           */\n          public static final class Builder extends\n              com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension, Builder> implements\n              // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension)\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtensionOrBuilder {\n            public static final com.google.protobuf.Descriptors.Descriptor\n                getDescriptor() {\n              return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_descriptor;\n            }\n\n            protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n                internalGetFieldAccessorTable() {\n              return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_fieldAccessorTable\n                  .ensureFieldAccessorsInitialized(\n                      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.class, 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.Builder.class);\n            }\n\n            // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.newBuilder()\n            private Builder() {\n              maybeForceBuilderInitialization();\n            }\n\n            private Builder(\n                com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n              super(parent);\n              maybeForceBuilderInitialization();\n            }\n            private void maybeForceBuilderInitialization() {\n              if (com.google.protobuf.GeneratedMessageV3\n                      .alwaysUseFieldBuilders) {\n              }\n            }\n            public Builder clear() {\n              super.clear();\n              return this;\n            }\n\n            public com.google.protobuf.Descriptors.Descriptor\n                getDescriptorForType() {\n              return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_descriptor;\n            }\n\n            public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension getDefaultInstanceForType() {\n              return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.getDefaultInstance();\n            }\n\n            public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension build() {\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension result = buildPartial();\n              if (!result.isInitialized()) {\n                throw newUninitializedMessageException(result);\n              }\n              return result;\n            }\n\n            public 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension buildPartial() {\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension(this);\n              onBuilt();\n              return result;\n            }\n\n            public Builder clone() {\n              return (Builder) super.clone();\n            }\n            public Builder setField(\n                com.google.protobuf.Descriptors.FieldDescriptor field,\n                Object value) {\n              return (Builder) super.setField(field, value);\n            }\n            public Builder clearField(\n                com.google.protobuf.Descriptors.FieldDescriptor field) {\n              return (Builder) super.clearField(field);\n            }\n            public Builder clearOneof(\n                com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n              return (Builder) super.clearOneof(oneof);\n            }\n            public Builder setRepeatedField(\n                com.google.protobuf.Descriptors.FieldDescriptor field,\n                int index, Object value) {\n              return (Builder) super.setRepeatedField(field, index, value);\n            }\n            public Builder addRepeatedField(\n                com.google.protobuf.Descriptors.FieldDescriptor field,\n                Object value) {\n              return (Builder) super.addRepeatedField(field, value);\n            }\n            public <Type> Builder setExtension(\n                com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension, Type> extension,\n                Type value) {\n              return (Builder) super.setExtension(extension, value);\n            }\n            public <Type> Builder 
setExtension(\n                com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension, java.util.List<Type>> extension,\n                int index, Type value) {\n              return (Builder) super.setExtension(extension, index, value);\n            }\n            public <Type> Builder addExtension(\n                com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension, java.util.List<Type>> extension,\n                Type value) {\n              return (Builder) super.addExtension(extension, value);\n            }\n            public <Type> Builder clearExtension(\n                com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension, ?> extension) {\n              return (Builder) super.clearExtension(extension);\n            }\n            public Builder mergeFrom(com.google.protobuf.Message other) {\n              if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) {\n                return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension)other);\n              } else {\n                super.mergeFrom(other);\n                return this;\n              }\n            }\n\n            public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension other) {\n              if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.getDefaultInstance()) return this;\n              this.mergeExtensionFields(other);\n              this.mergeUnknownFields(other.unknownFields);\n              
onChanged();\n              return this;\n            }\n\n            public final boolean isInitialized() {\n              if (!extensionsAreInitialized()) {\n                return false;\n              }\n              return true;\n            }\n\n            public Builder mergeFrom(\n                com.google.protobuf.CodedInputStream input,\n                com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n                throws java.io.IOException {\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension parsedMessage = null;\n              try {\n                parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n              } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n                parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) e.getUnfinishedMessage();\n                throw e.unwrapIOException();\n              } finally {\n                if (parsedMessage != null) {\n                  mergeFrom(parsedMessage);\n                }\n              }\n              return this;\n            }\n            public final Builder setUnknownFields(\n                final com.google.protobuf.UnknownFieldSet unknownFields) {\n              return super.setUnknownFields(unknownFields);\n            }\n\n            public final Builder mergeUnknownFields(\n                final com.google.protobuf.UnknownFieldSet unknownFields) {\n              return super.mergeUnknownFields(unknownFields);\n            }\n\n\n            // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension)\n          }\n\n          // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension)\n          private static final 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension DEFAULT_INSTANCE;\n          static {\n            DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension();\n          }\n\n          public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension getDefaultInstance() {\n            return DEFAULT_INSTANCE;\n          }\n\n          @java.lang.Deprecated public static final com.google.protobuf.Parser<DataSetValueExtension>\n              PARSER = new com.google.protobuf.AbstractParser<DataSetValueExtension>() {\n            public DataSetValueExtension parsePartialFrom(\n                com.google.protobuf.CodedInputStream input,\n                com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n                throws com.google.protobuf.InvalidProtocolBufferException {\n                return new DataSetValueExtension(input, extensionRegistry);\n            }\n          };\n\n          public static com.google.protobuf.Parser<DataSetValueExtension> parser() {\n            return PARSER;\n          }\n\n          @java.lang.Override\n          public com.google.protobuf.Parser<DataSetValueExtension> getParserForType() {\n            return PARSER;\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension getDefaultInstanceForType() {\n            return DEFAULT_INSTANCE;\n          }\n\n        }\n\n        private int bitField0_;\n        private int valueCase_ = 0;\n        private java.lang.Object value_;\n        public enum ValueCase\n            implements com.google.protobuf.Internal.EnumLite {\n          INT_VALUE(1),\n          LONG_VALUE(2),\n          FLOAT_VALUE(3),\n          DOUBLE_VALUE(4),\n          BOOLEAN_VALUE(5),\n          STRING_VALUE(6),\n          EXTENSION_VALUE(7),\n          VALUE_NOT_SET(0);\n          private 
final int value;\n          private ValueCase(int value) {\n            this.value = value;\n          }\n          /**\n           * @deprecated Use {@link #forNumber(int)} instead.\n           */\n          @java.lang.Deprecated\n          public static ValueCase valueOf(int value) {\n            return forNumber(value);\n          }\n\n          public static ValueCase forNumber(int value) {\n            switch (value) {\n              case 1: return INT_VALUE;\n              case 2: return LONG_VALUE;\n              case 3: return FLOAT_VALUE;\n              case 4: return DOUBLE_VALUE;\n              case 5: return BOOLEAN_VALUE;\n              case 6: return STRING_VALUE;\n              case 7: return EXTENSION_VALUE;\n              case 0: return VALUE_NOT_SET;\n              default: return null;\n            }\n          }\n          public int getNumber() {\n            return this.value;\n          }\n        };\n\n        public ValueCase\n        getValueCase() {\n          return ValueCase.forNumber(\n              valueCase_);\n        }\n\n        public static final int INT_VALUE_FIELD_NUMBER = 1;\n        /**\n         * <code>optional uint32 int_value = 1;</code>\n         */\n        public boolean hasIntValue() {\n          return valueCase_ == 1;\n        }\n        /**\n         * <code>optional uint32 int_value = 1;</code>\n         */\n        public int getIntValue() {\n          if (valueCase_ == 1) {\n            return (java.lang.Integer) value_;\n          }\n          return 0;\n        }\n\n        public static final int LONG_VALUE_FIELD_NUMBER = 2;\n        /**\n         * <code>optional uint64 long_value = 2;</code>\n         */\n        public boolean hasLongValue() {\n          return valueCase_ == 2;\n        }\n        /**\n         * <code>optional uint64 long_value = 2;</code>\n         */\n        public long getLongValue() {\n          if (valueCase_ == 2) {\n            return (java.lang.Long) value_;\n          }\n       
   return 0L;\n        }\n\n        public static final int FLOAT_VALUE_FIELD_NUMBER = 3;\n        /**\n         * <code>optional float float_value = 3;</code>\n         */\n        public boolean hasFloatValue() {\n          return valueCase_ == 3;\n        }\n        /**\n         * <code>optional float float_value = 3;</code>\n         */\n        public float getFloatValue() {\n          if (valueCase_ == 3) {\n            return (java.lang.Float) value_;\n          }\n          return 0F;\n        }\n\n        public static final int DOUBLE_VALUE_FIELD_NUMBER = 4;\n        /**\n         * <code>optional double double_value = 4;</code>\n         */\n        public boolean hasDoubleValue() {\n          return valueCase_ == 4;\n        }\n        /**\n         * <code>optional double double_value = 4;</code>\n         */\n        public double getDoubleValue() {\n          if (valueCase_ == 4) {\n            return (java.lang.Double) value_;\n          }\n          return 0D;\n        }\n\n        public static final int BOOLEAN_VALUE_FIELD_NUMBER = 5;\n        /**\n         * <code>optional bool boolean_value = 5;</code>\n         */\n        public boolean hasBooleanValue() {\n          return valueCase_ == 5;\n        }\n        /**\n         * <code>optional bool boolean_value = 5;</code>\n         */\n        public boolean getBooleanValue() {\n          if (valueCase_ == 5) {\n            return (java.lang.Boolean) value_;\n          }\n          return false;\n        }\n\n        public static final int STRING_VALUE_FIELD_NUMBER = 6;\n        /**\n         * <code>optional string string_value = 6;</code>\n         */\n        public boolean hasStringValue() {\n          return valueCase_ == 6;\n        }\n        /**\n         * <code>optional string string_value = 6;</code>\n         */\n        public java.lang.String getStringValue() {\n          java.lang.Object ref = \"\";\n          if (valueCase_ == 6) {\n            ref = value_;\n          }\n    
      if (ref instanceof java.lang.String) {\n            return (java.lang.String) ref;\n          } else {\n            com.google.protobuf.ByteString bs = \n                (com.google.protobuf.ByteString) ref;\n            java.lang.String s = bs.toStringUtf8();\n            if (bs.isValidUtf8() && (valueCase_ == 6)) {\n              value_ = s;\n            }\n            return s;\n          }\n        }\n        /**\n         * <code>optional string string_value = 6;</code>\n         */\n        public com.google.protobuf.ByteString\n            getStringValueBytes() {\n          java.lang.Object ref = \"\";\n          if (valueCase_ == 6) {\n            ref = value_;\n          }\n          if (ref instanceof java.lang.String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n                    (java.lang.String) ref);\n            if (valueCase_ == 6) {\n              value_ = b;\n            }\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n\n        public static final int EXTENSION_VALUE_FIELD_NUMBER = 7;\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n         */\n        public boolean hasExtensionValue() {\n          return valueCase_ == 7;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension getExtensionValue() {\n          if (valueCase_ == 7) {\n             return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) value_;\n          }\n          return 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.getDefaultInstance();\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtensionOrBuilder getExtensionValueOrBuilder() {\n          if (valueCase_ == 7) {\n             return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) value_;\n          }\n          return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.getDefaultInstance();\n        }\n\n        private byte memoizedIsInitialized = -1;\n        public final boolean isInitialized() {\n          byte isInitialized = memoizedIsInitialized;\n          if (isInitialized == 1) return true;\n          if (isInitialized == 0) return false;\n\n          if (hasExtensionValue()) {\n            if (!getExtensionValue().isInitialized()) {\n              memoizedIsInitialized = 0;\n              return false;\n            }\n          }\n          memoizedIsInitialized = 1;\n          return true;\n        }\n\n        public void writeTo(com.google.protobuf.CodedOutputStream output)\n                            throws java.io.IOException {\n          if (valueCase_ == 1) {\n            output.writeUInt32(\n                1, (int)((java.lang.Integer) value_));\n          }\n          if (valueCase_ == 2) {\n            output.writeUInt64(\n                2, (long)((java.lang.Long) value_));\n          }\n          if (valueCase_ == 3) {\n            output.writeFloat(\n                3, (float)((java.lang.Float) value_));\n          }\n          if (valueCase_ == 4) {\n            output.writeDouble(\n                4, (double)((java.lang.Double) value_));\n          }\n          if (valueCase_ == 5) {\n            
output.writeBool(\n                5, (boolean)((java.lang.Boolean) value_));\n          }\n          if (valueCase_ == 6) {\n            com.google.protobuf.GeneratedMessageV3.writeString(output, 6, value_);\n          }\n          if (valueCase_ == 7) {\n            output.writeMessage(7, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) value_);\n          }\n          unknownFields.writeTo(output);\n        }\n\n        public int getSerializedSize() {\n          int size = memoizedSize;\n          if (size != -1) return size;\n\n          size = 0;\n          if (valueCase_ == 1) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeUInt32Size(\n                  1, (int)((java.lang.Integer) value_));\n          }\n          if (valueCase_ == 2) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeUInt64Size(\n                  2, (long)((java.lang.Long) value_));\n          }\n          if (valueCase_ == 3) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeFloatSize(\n                  3, (float)((java.lang.Float) value_));\n          }\n          if (valueCase_ == 4) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeDoubleSize(\n                  4, (double)((java.lang.Double) value_));\n          }\n          if (valueCase_ == 5) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeBoolSize(\n                  5, (boolean)((java.lang.Boolean) value_));\n          }\n          if (valueCase_ == 6) {\n            size += com.google.protobuf.GeneratedMessageV3.computeStringSize(6, value_);\n          }\n          if (valueCase_ == 7) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeMessageSize(7, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) value_);\n          }\n          
size += unknownFields.getSerializedSize();\n          memoizedSize = size;\n          return size;\n        }\n\n        private static final long serialVersionUID = 0L;\n        @java.lang.Override\n        public boolean equals(final java.lang.Object obj) {\n          if (obj == this) {\n           return true;\n          }\n          if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue)) {\n            return super.equals(obj);\n          }\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue) obj;\n\n          boolean result = true;\n          result = result && getValueCase().equals(\n              other.getValueCase());\n          if (!result) return false;\n          switch (valueCase_) {\n            case 1:\n              result = result && (getIntValue()\n                  == other.getIntValue());\n              break;\n            case 2:\n              result = result && (getLongValue()\n                  == other.getLongValue());\n              break;\n            case 3:\n              result = result && (\n                  java.lang.Float.floatToIntBits(getFloatValue())\n                  == java.lang.Float.floatToIntBits(\n                      other.getFloatValue()));\n              break;\n            case 4:\n              result = result && (\n                  java.lang.Double.doubleToLongBits(getDoubleValue())\n                  == java.lang.Double.doubleToLongBits(\n                      other.getDoubleValue()));\n              break;\n            case 5:\n              result = result && (getBooleanValue()\n                  == other.getBooleanValue());\n              break;\n            case 6:\n              result = result && getStringValue()\n                  .equals(other.getStringValue());\n              break;\n            case 7:\n              result = result && getExtensionValue()\n   
               .equals(other.getExtensionValue());\n              break;\n            case 0:\n            default:\n          }\n          result = result && unknownFields.equals(other.unknownFields);\n          return result;\n        }\n\n        @java.lang.Override\n        public int hashCode() {\n          if (memoizedHashCode != 0) {\n            return memoizedHashCode;\n          }\n          int hash = 41;\n          hash = (19 * hash) + getDescriptorForType().hashCode();\n          switch (valueCase_) {\n            case 1:\n              hash = (37 * hash) + INT_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + getIntValue();\n              break;\n            case 2:\n              hash = (37 * hash) + LONG_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n                  getLongValue());\n              break;\n            case 3:\n              hash = (37 * hash) + FLOAT_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + java.lang.Float.floatToIntBits(\n                  getFloatValue());\n              break;\n            case 4:\n              hash = (37 * hash) + DOUBLE_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n                  java.lang.Double.doubleToLongBits(getDoubleValue()));\n              break;\n            case 5:\n              hash = (37 * hash) + BOOLEAN_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + com.google.protobuf.Internal.hashBoolean(\n                  getBooleanValue());\n              break;\n            case 6:\n              hash = (37 * hash) + STRING_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + getStringValue().hashCode();\n              break;\n            case 7:\n              hash = (37 * hash) + EXTENSION_VALUE_FIELD_NUMBER;\n              hash = (53 * hash) + getExtensionValue().hashCode();\n              break;\n            case 0:\n            default:\n          }\n          hash = 
(29 * hash) + unknownFields.hashCode();\n          memoizedHashCode = hash;\n          return hash;\n        }\n\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue parseFrom(\n            com.google.protobuf.ByteString data)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue parseFrom(\n            com.google.protobuf.ByteString data,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue parseFrom(byte[] data)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue parseFrom(\n            byte[] data,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue parseFrom(java.io.InputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue parseFrom(\n            java.io.InputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              
.parseWithIOException(PARSER, input, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue parseDelimitedFrom(java.io.InputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseDelimitedWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue parseDelimitedFrom(\n            java.io.InputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue parseFrom(\n            com.google.protobuf.CodedInputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue parseFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input, extensionRegistry);\n        }\n\n        public Builder newBuilderForType() { return newBuilder(); }\n        public static Builder newBuilder() {\n          return DEFAULT_INSTANCE.toBuilder();\n        }\n        public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue prototype) {\n          return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n        }\n        public Builder toBuilder() {\n          return this == 
DEFAULT_INSTANCE\n              ? new Builder() : new Builder().mergeFrom(this);\n        }\n\n        @java.lang.Override\n        protected Builder newBuilderForType(\n            com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n          Builder builder = new Builder(parent);\n          return builder;\n        }\n        /**\n         * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue}\n         */\n        public static final class Builder extends\n            com.google.protobuf.GeneratedMessageV3.Builder<Builder> implements\n            // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue)\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValueOrBuilder {\n          public static final com.google.protobuf.Descriptors.Descriptor\n              getDescriptor() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_descriptor;\n          }\n\n          protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n              internalGetFieldAccessorTable() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_fieldAccessorTable\n                .ensureFieldAccessorsInitialized(\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.Builder.class);\n          }\n\n          // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.newBuilder()\n          private Builder() {\n            maybeForceBuilderInitialization();\n          }\n\n          private Builder(\n              com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n            super(parent);\n            maybeForceBuilderInitialization();\n          }\n       
   private void maybeForceBuilderInitialization() {\n            if (com.google.protobuf.GeneratedMessageV3\n                    .alwaysUseFieldBuilders) {\n            }\n          }\n          public Builder clear() {\n            super.clear();\n            valueCase_ = 0;\n            value_ = null;\n            return this;\n          }\n\n          public com.google.protobuf.Descriptors.Descriptor\n              getDescriptorForType() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_descriptor;\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue getDefaultInstanceForType() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.getDefaultInstance();\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue build() {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue result = buildPartial();\n            if (!result.isInitialized()) {\n              throw newUninitializedMessageException(result);\n            }\n            return result;\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue buildPartial() {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue(this);\n            int from_bitField0_ = bitField0_;\n            int to_bitField0_ = 0;\n            if (valueCase_ == 1) {\n              result.value_ = value_;\n            }\n            if (valueCase_ == 2) {\n              result.value_ = value_;\n            }\n            if (valueCase_ == 3) {\n              result.value_ = value_;\n            }\n            if (valueCase_ == 4) {\n              result.value_ = value_;\n            }\n            if (valueCase_ == 5) {\n            
  result.value_ = value_;\n            }\n            if (valueCase_ == 6) {\n              result.value_ = value_;\n            }\n            if (valueCase_ == 7) {\n              if (extensionValueBuilder_ == null) {\n                result.value_ = value_;\n              } else {\n                result.value_ = extensionValueBuilder_.build();\n              }\n            }\n            result.bitField0_ = to_bitField0_;\n            result.valueCase_ = valueCase_;\n            onBuilt();\n            return result;\n          }\n\n          public Builder clone() {\n            return (Builder) super.clone();\n          }\n          public Builder setField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              Object value) {\n            return (Builder) super.setField(field, value);\n          }\n          public Builder clearField(\n              com.google.protobuf.Descriptors.FieldDescriptor field) {\n            return (Builder) super.clearField(field);\n          }\n          public Builder clearOneof(\n              com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n            return (Builder) super.clearOneof(oneof);\n          }\n          public Builder setRepeatedField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              int index, Object value) {\n            return (Builder) super.setRepeatedField(field, index, value);\n          }\n          public Builder addRepeatedField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              Object value) {\n            return (Builder) super.addRepeatedField(field, value);\n          }\n          public Builder mergeFrom(com.google.protobuf.Message other) {\n            if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue) {\n              return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue)other);\n            } else {\n            
  super.mergeFrom(other);\n              return this;\n            }\n          }\n\n          public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue other) {\n            if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.getDefaultInstance()) return this;\n            switch (other.getValueCase()) {\n              case INT_VALUE: {\n                setIntValue(other.getIntValue());\n                break;\n              }\n              case LONG_VALUE: {\n                setLongValue(other.getLongValue());\n                break;\n              }\n              case FLOAT_VALUE: {\n                setFloatValue(other.getFloatValue());\n                break;\n              }\n              case DOUBLE_VALUE: {\n                setDoubleValue(other.getDoubleValue());\n                break;\n              }\n              case BOOLEAN_VALUE: {\n                setBooleanValue(other.getBooleanValue());\n                break;\n              }\n              case STRING_VALUE: {\n                valueCase_ = 6;\n                value_ = other.value_;\n                onChanged();\n                break;\n              }\n              case EXTENSION_VALUE: {\n                mergeExtensionValue(other.getExtensionValue());\n                break;\n              }\n              case VALUE_NOT_SET: {\n                break;\n              }\n            }\n            this.mergeUnknownFields(other.unknownFields);\n            onChanged();\n            return this;\n          }\n\n          public final boolean isInitialized() {\n            if (hasExtensionValue()) {\n              if (!getExtensionValue().isInitialized()) {\n                return false;\n              }\n            }\n            return true;\n          }\n\n          public Builder mergeFrom(\n              com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite 
extensionRegistry)\n              throws java.io.IOException {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue parsedMessage = null;\n            try {\n              parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n            } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n              parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue) e.getUnfinishedMessage();\n              throw e.unwrapIOException();\n            } finally {\n              if (parsedMessage != null) {\n                mergeFrom(parsedMessage);\n              }\n            }\n            return this;\n          }\n          private int valueCase_ = 0;\n          private java.lang.Object value_;\n          public ValueCase\n              getValueCase() {\n            return ValueCase.forNumber(\n                valueCase_);\n          }\n\n          public Builder clearValue() {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n            return this;\n          }\n\n          private int bitField0_;\n\n          /**\n           * <code>optional uint32 int_value = 1;</code>\n           */\n          public boolean hasIntValue() {\n            return valueCase_ == 1;\n          }\n          /**\n           * <code>optional uint32 int_value = 1;</code>\n           */\n          public int getIntValue() {\n            if (valueCase_ == 1) {\n              return (java.lang.Integer) value_;\n            }\n            return 0;\n          }\n          /**\n           * <code>optional uint32 int_value = 1;</code>\n           */\n          public Builder setIntValue(int value) {\n            valueCase_ = 1;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional uint32 int_value = 1;</code>\n           */\n          public Builder clearIntValue() {\n            if 
(valueCase_ == 1) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n            return this;\n          }\n\n          /**\n           * <code>optional uint64 long_value = 2;</code>\n           */\n          public boolean hasLongValue() {\n            return valueCase_ == 2;\n          }\n          /**\n           * <code>optional uint64 long_value = 2;</code>\n           */\n          public long getLongValue() {\n            if (valueCase_ == 2) {\n              return (java.lang.Long) value_;\n            }\n            return 0L;\n          }\n          /**\n           * <code>optional uint64 long_value = 2;</code>\n           */\n          public Builder setLongValue(long value) {\n            valueCase_ = 2;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional uint64 long_value = 2;</code>\n           */\n          public Builder clearLongValue() {\n            if (valueCase_ == 2) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n            return this;\n          }\n\n          /**\n           * <code>optional float float_value = 3;</code>\n           */\n          public boolean hasFloatValue() {\n            return valueCase_ == 3;\n          }\n          /**\n           * <code>optional float float_value = 3;</code>\n           */\n          public float getFloatValue() {\n            if (valueCase_ == 3) {\n              return (java.lang.Float) value_;\n            }\n            return 0F;\n          }\n          /**\n           * <code>optional float float_value = 3;</code>\n           */\n          public Builder setFloatValue(float value) {\n            valueCase_ = 3;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional float float_value = 3;</code>\n           */\n  
        public Builder clearFloatValue() {\n            if (valueCase_ == 3) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n            return this;\n          }\n\n          /**\n           * <code>optional double double_value = 4;</code>\n           */\n          public boolean hasDoubleValue() {\n            return valueCase_ == 4;\n          }\n          /**\n           * <code>optional double double_value = 4;</code>\n           */\n          public double getDoubleValue() {\n            if (valueCase_ == 4) {\n              return (java.lang.Double) value_;\n            }\n            return 0D;\n          }\n          /**\n           * <code>optional double double_value = 4;</code>\n           */\n          public Builder setDoubleValue(double value) {\n            valueCase_ = 4;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional double double_value = 4;</code>\n           */\n          public Builder clearDoubleValue() {\n            if (valueCase_ == 4) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n            return this;\n          }\n\n          /**\n           * <code>optional bool boolean_value = 5;</code>\n           */\n          public boolean hasBooleanValue() {\n            return valueCase_ == 5;\n          }\n          /**\n           * <code>optional bool boolean_value = 5;</code>\n           */\n          public boolean getBooleanValue() {\n            if (valueCase_ == 5) {\n              return (java.lang.Boolean) value_;\n            }\n            return false;\n          }\n          /**\n           * <code>optional bool boolean_value = 5;</code>\n           */\n          public Builder setBooleanValue(boolean value) {\n            valueCase_ = 5;\n            value_ = value;\n            onChanged();\n            return this;\n     
     }\n          /**\n           * <code>optional bool boolean_value = 5;</code>\n           */\n          public Builder clearBooleanValue() {\n            if (valueCase_ == 5) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n            return this;\n          }\n\n          /**\n           * <code>optional string string_value = 6;</code>\n           */\n          public boolean hasStringValue() {\n            return valueCase_ == 6;\n          }\n          /**\n           * <code>optional string string_value = 6;</code>\n           */\n          public java.lang.String getStringValue() {\n            java.lang.Object ref = \"\";\n            if (valueCase_ == 6) {\n              ref = value_;\n            }\n            if (!(ref instanceof java.lang.String)) {\n              com.google.protobuf.ByteString bs =\n                  (com.google.protobuf.ByteString) ref;\n              java.lang.String s = bs.toStringUtf8();\n              if (valueCase_ == 6) {\n                if (bs.isValidUtf8()) {\n                  value_ = s;\n                }\n              }\n              return s;\n            } else {\n              return (java.lang.String) ref;\n            }\n          }\n          /**\n           * <code>optional string string_value = 6;</code>\n           */\n          public com.google.protobuf.ByteString\n              getStringValueBytes() {\n            java.lang.Object ref = \"\";\n            if (valueCase_ == 6) {\n              ref = value_;\n            }\n            if (ref instanceof String) {\n              com.google.protobuf.ByteString b = \n                  com.google.protobuf.ByteString.copyFromUtf8(\n                      (java.lang.String) ref);\n              if (valueCase_ == 6) {\n                value_ = b;\n              }\n              return b;\n            } else {\n              return (com.google.protobuf.ByteString) ref;\n            }\n          }\n          
/**\n           * <code>optional string string_value = 6;</code>\n           */\n          public Builder setStringValue(\n              java.lang.String value) {\n            if (value == null) {\n    throw new NullPointerException();\n  }\n  valueCase_ = 6;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n          /**\n           * <code>optional string string_value = 6;</code>\n           */\n          public Builder clearStringValue() {\n            if (valueCase_ == 6) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n            return this;\n          }\n          /**\n           * <code>optional string string_value = 6;</code>\n           */\n          public Builder setStringValueBytes(\n              com.google.protobuf.ByteString value) {\n            if (value == null) {\n    throw new NullPointerException();\n  }\n  valueCase_ = 6;\n            value_ = value;\n            onChanged();\n            return this;\n          }\n\n          private com.google.protobuf.SingleFieldBuilderV3<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtensionOrBuilder> extensionValueBuilder_;\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n           */\n          public boolean hasExtensionValue() {\n            return valueCase_ == 7;\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n           */\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension 
getExtensionValue() {\n            if (extensionValueBuilder_ == null) {\n              if (valueCase_ == 7) {\n                return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) value_;\n              }\n              return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.getDefaultInstance();\n            } else {\n              if (valueCase_ == 7) {\n                return extensionValueBuilder_.getMessage();\n              }\n              return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.getDefaultInstance();\n            }\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n           */\n          public Builder setExtensionValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension value) {\n            if (extensionValueBuilder_ == null) {\n              if (value == null) {\n                throw new NullPointerException();\n              }\n              value_ = value;\n              onChanged();\n            } else {\n              extensionValueBuilder_.setMessage(value);\n            }\n            valueCase_ = 7;\n            return this;\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n           */\n          public Builder setExtensionValue(\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.Builder builderForValue) {\n            if (extensionValueBuilder_ == null) {\n              value_ = builderForValue.build();\n              onChanged();\n            } else {\n              extensionValueBuilder_.setMessage(builderForValue.build());\n            }\n            valueCase_ = 7;\n   
         return this;\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n           */\n          public Builder mergeExtensionValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension value) {\n            if (extensionValueBuilder_ == null) {\n              if (valueCase_ == 7 &&\n                  value_ != org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.getDefaultInstance()) {\n                value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.newBuilder((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) value_)\n                    .mergeFrom(value).buildPartial();\n              } else {\n                value_ = value;\n              }\n              onChanged();\n            } else {\n              if (valueCase_ == 7) {\n                extensionValueBuilder_.mergeFrom(value);\n              }\n              extensionValueBuilder_.setMessage(value);\n            }\n            valueCase_ = 7;\n            return this;\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n           */\n          public Builder clearExtensionValue() {\n            if (extensionValueBuilder_ == null) {\n              if (valueCase_ == 7) {\n                valueCase_ = 0;\n                value_ = null;\n                onChanged();\n              }\n            } else {\n              if (valueCase_ == 7) {\n                valueCase_ = 0;\n                value_ = null;\n              }\n              extensionValueBuilder_.clear();\n            }\n            return this;\n          }\n          /**\n           * <code>optional 
.org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n           */\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.Builder getExtensionValueBuilder() {\n            return getExtensionValueFieldBuilder().getBuilder();\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n           */\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtensionOrBuilder getExtensionValueOrBuilder() {\n            if ((valueCase_ == 7) && (extensionValueBuilder_ != null)) {\n              return extensionValueBuilder_.getMessageOrBuilder();\n            } else {\n              if (valueCase_ == 7) {\n                return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) value_;\n              }\n              return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.getDefaultInstance();\n            }\n          }\n          /**\n           * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension extension_value = 7;</code>\n           */\n          private com.google.protobuf.SingleFieldBuilderV3<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtensionOrBuilder> \n              getExtensionValueFieldBuilder() {\n            if (extensionValueBuilder_ == null) {\n              if (!(valueCase_ == 7)) {\n                value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.getDefaultInstance();\n 
             }\n              extensionValueBuilder_ = new com.google.protobuf.SingleFieldBuilderV3<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtensionOrBuilder>(\n                      (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.DataSetValueExtension) value_,\n                      getParentForChildren(),\n                      isClean());\n              value_ = null;\n            }\n            valueCase_ = 7;\n            onChanged();\n            return extensionValueBuilder_;\n          }\n          public final Builder setUnknownFields(\n              final com.google.protobuf.UnknownFieldSet unknownFields) {\n            return super.setUnknownFields(unknownFields);\n          }\n\n          public final Builder mergeUnknownFields(\n              final com.google.protobuf.UnknownFieldSet unknownFields) {\n            return super.mergeUnknownFields(unknownFields);\n          }\n\n\n          // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue)\n        }\n\n        // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue)\n        private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue DEFAULT_INSTANCE;\n        static {\n          DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue();\n        }\n\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue getDefaultInstance() {\n          return DEFAULT_INSTANCE;\n        }\n\n        @java.lang.Deprecated public static final com.google.protobuf.Parser<DataSetValue>\n            PARSER = new 
com.google.protobuf.AbstractParser<DataSetValue>() {\n          public DataSetValue parsePartialFrom(\n              com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n              return new DataSetValue(input, extensionRegistry);\n          }\n        };\n\n        public static com.google.protobuf.Parser<DataSetValue> parser() {\n          return PARSER;\n        }\n\n        @java.lang.Override\n        public com.google.protobuf.Parser<DataSetValue> getParserForType() {\n          return PARSER;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue getDefaultInstanceForType() {\n          return DEFAULT_INSTANCE;\n        }\n\n      }\n\n      public interface RowOrBuilder extends\n          // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.DataSet.Row)\n          com.google.protobuf.GeneratedMessageV3.\n              ExtendableMessageOrBuilder<Row> {\n\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n         */\n        java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue> \n            getElementsList();\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n         */\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue getElements(int index);\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n         */\n        int getElementsCount();\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n         */\n        java.util.List<? 
extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValueOrBuilder> \n            getElementsOrBuilderList();\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n         */\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValueOrBuilder getElementsOrBuilder(\n            int index);\n      }\n      /**\n       * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.DataSet.Row}\n       */\n      public  static final class Row extends\n          com.google.protobuf.GeneratedMessageV3.ExtendableMessage<\n            Row> implements\n          // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.DataSet.Row)\n          RowOrBuilder {\n        // Use Row.newBuilder() to construct.\n        private Row(com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row, ?> builder) {\n          super(builder);\n        }\n        private Row() {\n          elements_ = java.util.Collections.emptyList();\n        }\n\n        @java.lang.Override\n        public final com.google.protobuf.UnknownFieldSet\n        getUnknownFields() {\n          return this.unknownFields;\n        }\n        private Row(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          this();\n          int mutable_bitField0_ = 0;\n          com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n              com.google.protobuf.UnknownFieldSet.newBuilder();\n          try {\n            boolean done = false;\n            while (!done) {\n              int tag = input.readTag();\n              switch (tag) {\n                case 0:\n                  done = true;\n                  break;\n                default: {\n                  if 
(!parseUnknownField(input, unknownFields,\n                                         extensionRegistry, tag)) {\n                    done = true;\n                  }\n                  break;\n                }\n                case 10: {\n                  if (!((mutable_bitField0_ & 0x00000001) == 0x00000001)) {\n                    elements_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue>();\n                    mutable_bitField0_ |= 0x00000001;\n                  }\n                  elements_.add(\n                      input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.PARSER, extensionRegistry));\n                  break;\n                }\n              }\n            }\n          } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n            throw e.setUnfinishedMessage(this);\n          } catch (java.io.IOException e) {\n            throw new com.google.protobuf.InvalidProtocolBufferException(\n                e).setUnfinishedMessage(this);\n          } finally {\n            if (((mutable_bitField0_ & 0x00000001) == 0x00000001)) {\n              elements_ = java.util.Collections.unmodifiableList(elements_);\n            }\n            this.unknownFields = unknownFields.build();\n            makeExtensionsImmutable();\n          }\n        }\n        public static final com.google.protobuf.Descriptors.Descriptor\n            getDescriptor() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_Row_descriptor;\n        }\n\n        protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n            internalGetFieldAccessorTable() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_Row_fieldAccessorTable\n              .ensureFieldAccessorsInitialized(\n                  
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.Builder.class);\n        }\n\n        public static final int ELEMENTS_FIELD_NUMBER = 1;\n        private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue> elements_;\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n         */\n        public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue> getElementsList() {\n          return elements_;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n         */\n        public java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValueOrBuilder> \n            getElementsOrBuilderList() {\n          return elements_;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n         */\n        public int getElementsCount() {\n          return elements_.size();\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue getElements(int index) {\n          return elements_.get(index);\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValueOrBuilder getElementsOrBuilder(\n            int index) {\n          return elements_.get(index);\n        }\n\n        private byte memoizedIsInitialized = -1;\n        public final boolean isInitialized() {\n          byte isInitialized = memoizedIsInitialized;\n          if (isInitialized == 1) return 
true;\n          if (isInitialized == 0) return false;\n\n          for (int i = 0; i < getElementsCount(); i++) {\n            if (!getElements(i).isInitialized()) {\n              memoizedIsInitialized = 0;\n              return false;\n            }\n          }\n          if (!extensionsAreInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n          memoizedIsInitialized = 1;\n          return true;\n        }\n\n        public void writeTo(com.google.protobuf.CodedOutputStream output)\n                            throws java.io.IOException {\n          com.google.protobuf.GeneratedMessageV3\n            .ExtendableMessage<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row>.ExtensionWriter\n              extensionWriter = newExtensionWriter();\n          for (int i = 0; i < elements_.size(); i++) {\n            output.writeMessage(1, elements_.get(i));\n          }\n          extensionWriter.writeUntil(536870912, output);\n          unknownFields.writeTo(output);\n        }\n\n        public int getSerializedSize() {\n          int size = memoizedSize;\n          if (size != -1) return size;\n\n          size = 0;\n          for (int i = 0; i < elements_.size(); i++) {\n            size += com.google.protobuf.CodedOutputStream\n              .computeMessageSize(1, elements_.get(i));\n          }\n          size += extensionsSerializedSize();\n          size += unknownFields.getSerializedSize();\n          memoizedSize = size;\n          return size;\n        }\n\n        private static final long serialVersionUID = 0L;\n        @java.lang.Override\n        public boolean equals(final java.lang.Object obj) {\n          if (obj == this) {\n           return true;\n          }\n          if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row)) {\n            return super.equals(obj);\n          }\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row other = 
(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row) obj;\n\n          boolean result = true;\n          result = result && getElementsList()\n              .equals(other.getElementsList());\n          result = result && unknownFields.equals(other.unknownFields);\n          result = result &&\n              getExtensionFields().equals(other.getExtensionFields());\n          return result;\n        }\n\n        @java.lang.Override\n        public int hashCode() {\n          if (memoizedHashCode != 0) {\n            return memoizedHashCode;\n          }\n          int hash = 41;\n          hash = (19 * hash) + getDescriptorForType().hashCode();\n          if (getElementsCount() > 0) {\n            hash = (37 * hash) + ELEMENTS_FIELD_NUMBER;\n            hash = (53 * hash) + getElementsList().hashCode();\n          }\n          hash = hashFields(hash, getExtensionFields());\n          hash = (29 * hash) + unknownFields.hashCode();\n          memoizedHashCode = hash;\n          return hash;\n        }\n\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row parseFrom(\n            com.google.protobuf.ByteString data)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row parseFrom(\n            com.google.protobuf.ByteString data,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row parseFrom(byte[] data)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row parseFrom(\n      
      byte[] data,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row parseFrom(java.io.InputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row parseFrom(\n            java.io.InputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row parseDelimitedFrom(java.io.InputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseDelimitedWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row parseDelimitedFrom(\n            java.io.InputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row parseFrom(\n            com.google.protobuf.CodedInputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input);\n        }\n        public static 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row parseFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input, extensionRegistry);\n        }\n\n        public Builder newBuilderForType() { return newBuilder(); }\n        public static Builder newBuilder() {\n          return DEFAULT_INSTANCE.toBuilder();\n        }\n        public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row prototype) {\n          return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n        }\n        public Builder toBuilder() {\n          return this == DEFAULT_INSTANCE\n              ? new Builder() : new Builder().mergeFrom(this);\n        }\n\n        @java.lang.Override\n        protected Builder newBuilderForType(\n            com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n          Builder builder = new Builder(parent);\n          return builder;\n        }\n        /**\n         * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.DataSet.Row}\n         */\n        public static final class Builder extends\n            com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row, Builder> implements\n            // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.DataSet.Row)\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.RowOrBuilder {\n          public static final com.google.protobuf.Descriptors.Descriptor\n              getDescriptor() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_Row_descriptor;\n          }\n\n          protected 
com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n              internalGetFieldAccessorTable() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_Row_fieldAccessorTable\n                .ensureFieldAccessorsInitialized(\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.Builder.class);\n          }\n\n          // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.newBuilder()\n          private Builder() {\n            maybeForceBuilderInitialization();\n          }\n\n          private Builder(\n              com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n            super(parent);\n            maybeForceBuilderInitialization();\n          }\n          private void maybeForceBuilderInitialization() {\n            if (com.google.protobuf.GeneratedMessageV3\n                    .alwaysUseFieldBuilders) {\n              getElementsFieldBuilder();\n            }\n          }\n          public Builder clear() {\n            super.clear();\n            if (elementsBuilder_ == null) {\n              elements_ = java.util.Collections.emptyList();\n              bitField0_ = (bitField0_ & ~0x00000001);\n            } else {\n              elementsBuilder_.clear();\n            }\n            return this;\n          }\n\n          public com.google.protobuf.Descriptors.Descriptor\n              getDescriptorForType() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_Row_descriptor;\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row getDefaultInstanceForType() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.getDefaultInstance();\n          }\n\n          public 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row build() {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row result = buildPartial();\n            if (!result.isInitialized()) {\n              throw newUninitializedMessageException(result);\n            }\n            return result;\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row buildPartial() {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row(this);\n            int from_bitField0_ = bitField0_;\n            if (elementsBuilder_ == null) {\n              if (((bitField0_ & 0x00000001) == 0x00000001)) {\n                elements_ = java.util.Collections.unmodifiableList(elements_);\n                bitField0_ = (bitField0_ & ~0x00000001);\n              }\n              result.elements_ = elements_;\n            } else {\n              result.elements_ = elementsBuilder_.build();\n            }\n            onBuilt();\n            return result;\n          }\n\n          public Builder clone() {\n            return (Builder) super.clone();\n          }\n          public Builder setField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              Object value) {\n            return (Builder) super.setField(field, value);\n          }\n          public Builder clearField(\n              com.google.protobuf.Descriptors.FieldDescriptor field) {\n            return (Builder) super.clearField(field);\n          }\n          public Builder clearOneof(\n              com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n            return (Builder) super.clearOneof(oneof);\n          }\n          public Builder setRepeatedField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              int index, Object value) {\n            return (Builder) super.setRepeatedField(field, index, 
value);\n          }\n          public Builder addRepeatedField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              Object value) {\n            return (Builder) super.addRepeatedField(field, value);\n          }\n          public <Type> Builder setExtension(\n              com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row, Type> extension,\n              Type value) {\n            return (Builder) super.setExtension(extension, value);\n          }\n          public <Type> Builder setExtension(\n              com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row, java.util.List<Type>> extension,\n              int index, Type value) {\n            return (Builder) super.setExtension(extension, index, value);\n          }\n          public <Type> Builder addExtension(\n              com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row, java.util.List<Type>> extension,\n              Type value) {\n            return (Builder) super.addExtension(extension, value);\n          }\n          public <Type> Builder clearExtension(\n              com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row, ?> extension) {\n            return (Builder) super.clearExtension(extension);\n          }\n          public Builder mergeFrom(com.google.protobuf.Message other) {\n            if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row) {\n              return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row)other);\n            } else {\n              super.mergeFrom(other);\n              return this;\n            }\n          }\n\n          public Builder 
mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row other) {\n            if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.getDefaultInstance()) return this;\n            if (elementsBuilder_ == null) {\n              if (!other.elements_.isEmpty()) {\n                if (elements_.isEmpty()) {\n                  elements_ = other.elements_;\n                  bitField0_ = (bitField0_ & ~0x00000001);\n                } else {\n                  ensureElementsIsMutable();\n                  elements_.addAll(other.elements_);\n                }\n                onChanged();\n              }\n            } else {\n              if (!other.elements_.isEmpty()) {\n                if (elementsBuilder_.isEmpty()) {\n                  elementsBuilder_.dispose();\n                  elementsBuilder_ = null;\n                  elements_ = other.elements_;\n                  bitField0_ = (bitField0_ & ~0x00000001);\n                  elementsBuilder_ = \n                    com.google.protobuf.GeneratedMessageV3.alwaysUseFieldBuilders ?\n                       getElementsFieldBuilder() : null;\n                } else {\n                  elementsBuilder_.addAllMessages(other.elements_);\n                }\n              }\n            }\n            this.mergeExtensionFields(other);\n            this.mergeUnknownFields(other.unknownFields);\n            onChanged();\n            return this;\n          }\n\n          public final boolean isInitialized() {\n            for (int i = 0; i < getElementsCount(); i++) {\n              if (!getElements(i).isInitialized()) {\n                return false;\n              }\n            }\n            if (!extensionsAreInitialized()) {\n              return false;\n            }\n            return true;\n          }\n\n          public Builder mergeFrom(\n              com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n   
           throws java.io.IOException {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row parsedMessage = null;\n            try {\n              parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n            } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n              parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row) e.getUnfinishedMessage();\n              throw e.unwrapIOException();\n            } finally {\n              if (parsedMessage != null) {\n                mergeFrom(parsedMessage);\n              }\n            }\n            return this;\n          }\n          private int bitField0_;\n\n          private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue> elements_ =\n            java.util.Collections.emptyList();\n          private void ensureElementsIsMutable() {\n            if (!((bitField0_ & 0x00000001) == 0x00000001)) {\n              elements_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue>(elements_);\n              bitField0_ |= 0x00000001;\n             }\n          }\n\n          private com.google.protobuf.RepeatedFieldBuilderV3<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValueOrBuilder> elementsBuilder_;\n\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue> getElementsList() {\n            if (elementsBuilder_ == null) {\n              return java.util.Collections.unmodifiableList(elements_);\n            } else {\n              return elementsBuilder_.getMessageList();\n            }\n      
    }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public int getElementsCount() {\n            if (elementsBuilder_ == null) {\n              return elements_.size();\n            } else {\n              return elementsBuilder_.getCount();\n            }\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue getElements(int index) {\n            if (elementsBuilder_ == null) {\n              return elements_.get(index);\n            } else {\n              return elementsBuilder_.getMessage(index);\n            }\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public Builder setElements(\n              int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue value) {\n            if (elementsBuilder_ == null) {\n              if (value == null) {\n                throw new NullPointerException();\n              }\n              ensureElementsIsMutable();\n              elements_.set(index, value);\n              onChanged();\n            } else {\n              elementsBuilder_.setMessage(index, value);\n            }\n            return this;\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public Builder setElements(\n              int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.Builder builderForValue) {\n            if (elementsBuilder_ == null) {\n              ensureElementsIsMutable();\n              elements_.set(index, builderForValue.build());\n              onChanged();\n            } else {\n     
         elementsBuilder_.setMessage(index, builderForValue.build());\n            }\n            return this;\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public Builder addElements(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue value) {\n            if (elementsBuilder_ == null) {\n              if (value == null) {\n                throw new NullPointerException();\n              }\n              ensureElementsIsMutable();\n              elements_.add(value);\n              onChanged();\n            } else {\n              elementsBuilder_.addMessage(value);\n            }\n            return this;\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public Builder addElements(\n              int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue value) {\n            if (elementsBuilder_ == null) {\n              if (value == null) {\n                throw new NullPointerException();\n              }\n              ensureElementsIsMutable();\n              elements_.add(index, value);\n              onChanged();\n            } else {\n              elementsBuilder_.addMessage(index, value);\n            }\n            return this;\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public Builder addElements(\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.Builder builderForValue) {\n            if (elementsBuilder_ == null) {\n              ensureElementsIsMutable();\n              elements_.add(builderForValue.build());\n              onChanged();\n            } else {\n              elementsBuilder_.addMessage(builderForValue.build());\n           
 }\n            return this;\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public Builder addElements(\n              int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.Builder builderForValue) {\n            if (elementsBuilder_ == null) {\n              ensureElementsIsMutable();\n              elements_.add(index, builderForValue.build());\n              onChanged();\n            } else {\n              elementsBuilder_.addMessage(index, builderForValue.build());\n            }\n            return this;\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public Builder addAllElements(\n              java.lang.Iterable<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue> values) {\n            if (elementsBuilder_ == null) {\n              ensureElementsIsMutable();\n              com.google.protobuf.AbstractMessageLite.Builder.addAll(\n                  values, elements_);\n              onChanged();\n            } else {\n              elementsBuilder_.addAllMessages(values);\n            }\n            return this;\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public Builder clearElements() {\n            if (elementsBuilder_ == null) {\n              elements_ = java.util.Collections.emptyList();\n              bitField0_ = (bitField0_ & ~0x00000001);\n              onChanged();\n            } else {\n              elementsBuilder_.clear();\n            }\n            return this;\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public Builder removeElements(int 
index) {\n            if (elementsBuilder_ == null) {\n              ensureElementsIsMutable();\n              elements_.remove(index);\n              onChanged();\n            } else {\n              elementsBuilder_.remove(index);\n            }\n            return this;\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.Builder getElementsBuilder(\n              int index) {\n            return getElementsFieldBuilder().getBuilder(index);\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValueOrBuilder getElementsOrBuilder(\n              int index) {\n            if (elementsBuilder_ == null) {\n              return elements_.get(index);  } else {\n              return elementsBuilder_.getMessageOrBuilder(index);\n            }\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public java.util.List<? 
extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValueOrBuilder> \n               getElementsOrBuilderList() {\n            if (elementsBuilder_ != null) {\n              return elementsBuilder_.getMessageOrBuilderList();\n            } else {\n              return java.util.Collections.unmodifiableList(elements_);\n            }\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.Builder addElementsBuilder() {\n            return getElementsFieldBuilder().addBuilder(\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.getDefaultInstance());\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.Builder addElementsBuilder(\n              int index) {\n            return getElementsFieldBuilder().addBuilder(\n                index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.getDefaultInstance());\n          }\n          /**\n           * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue elements = 1;</code>\n           */\n          public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.Builder> \n               getElementsBuilderList() {\n            return getElementsFieldBuilder().getBuilderList();\n          }\n          private com.google.protobuf.RepeatedFieldBuilderV3<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValueOrBuilder> \n              getElementsFieldBuilder() 
{\n            if (elementsBuilder_ == null) {\n              elementsBuilder_ = new com.google.protobuf.RepeatedFieldBuilderV3<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValue.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.DataSetValueOrBuilder>(\n                      elements_,\n                      ((bitField0_ & 0x00000001) == 0x00000001),\n                      getParentForChildren(),\n                      isClean());\n              elements_ = null;\n            }\n            return elementsBuilder_;\n          }\n          public final Builder setUnknownFields(\n              final com.google.protobuf.UnknownFieldSet unknownFields) {\n            return super.setUnknownFields(unknownFields);\n          }\n\n          public final Builder mergeUnknownFields(\n              final com.google.protobuf.UnknownFieldSet unknownFields) {\n            return super.mergeUnknownFields(unknownFields);\n          }\n\n\n          // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.DataSet.Row)\n        }\n\n        // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.DataSet.Row)\n        private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row DEFAULT_INSTANCE;\n        static {\n          DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row();\n        }\n\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row getDefaultInstance() {\n          return DEFAULT_INSTANCE;\n        }\n\n        @java.lang.Deprecated public static final com.google.protobuf.Parser<Row>\n            PARSER = new com.google.protobuf.AbstractParser<Row>() {\n          public Row parsePartialFrom(\n              com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite 
extensionRegistry)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n              return new Row(input, extensionRegistry);\n          }\n        };\n\n        public static com.google.protobuf.Parser<Row> parser() {\n          return PARSER;\n        }\n\n        @java.lang.Override\n        public com.google.protobuf.Parser<Row> getParserForType() {\n          return PARSER;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row getDefaultInstanceForType() {\n          return DEFAULT_INSTANCE;\n        }\n\n      }\n\n      private int bitField0_;\n      public static final int NUM_OF_COLUMNS_FIELD_NUMBER = 1;\n      private long numOfColumns_;\n      /**\n       * <code>optional uint64 num_of_columns = 1;</code>\n       */\n      public boolean hasNumOfColumns() {\n        return ((bitField0_ & 0x00000001) == 0x00000001);\n      }\n      /**\n       * <code>optional uint64 num_of_columns = 1;</code>\n       */\n      public long getNumOfColumns() {\n        return numOfColumns_;\n      }\n\n      public static final int COLUMNS_FIELD_NUMBER = 2;\n      private com.google.protobuf.LazyStringList columns_;\n      /**\n       * <code>repeated string columns = 2;</code>\n       */\n      public com.google.protobuf.ProtocolStringList\n          getColumnsList() {\n        return columns_;\n      }\n      /**\n       * <code>repeated string columns = 2;</code>\n       */\n      public int getColumnsCount() {\n        return columns_.size();\n      }\n      /**\n       * <code>repeated string columns = 2;</code>\n       */\n      public java.lang.String getColumns(int index) {\n        return columns_.get(index);\n      }\n      /**\n       * <code>repeated string columns = 2;</code>\n       */\n      public com.google.protobuf.ByteString\n          getColumnsBytes(int index) {\n        return columns_.getByteString(index);\n      }\n\n      public static final int TYPES_FIELD_NUMBER = 3;\n      private 
java.util.List<java.lang.Integer> types_;\n      /**\n       * <code>repeated uint32 types = 3;</code>\n       */\n      public java.util.List<java.lang.Integer>\n          getTypesList() {\n        return types_;\n      }\n      /**\n       * <code>repeated uint32 types = 3;</code>\n       */\n      public int getTypesCount() {\n        return types_.size();\n      }\n      /**\n       * <code>repeated uint32 types = 3;</code>\n       */\n      public int getTypes(int index) {\n        return types_.get(index);\n      }\n\n      public static final int ROWS_FIELD_NUMBER = 4;\n      private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row> rows_;\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n       */\n      public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row> getRowsList() {\n        return rows_;\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n       */\n      public java.util.List<? 
extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.RowOrBuilder> \n          getRowsOrBuilderList() {\n        return rows_;\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n       */\n      public int getRowsCount() {\n        return rows_.size();\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row getRows(int index) {\n        return rows_.get(index);\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.RowOrBuilder getRowsOrBuilder(\n          int index) {\n        return rows_.get(index);\n      }\n\n      private byte memoizedIsInitialized = -1;\n      public final boolean isInitialized() {\n        byte isInitialized = memoizedIsInitialized;\n        if (isInitialized == 1) return true;\n        if (isInitialized == 0) return false;\n\n        for (int i = 0; i < getRowsCount(); i++) {\n          if (!getRows(i).isInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        if (!extensionsAreInitialized()) {\n          memoizedIsInitialized = 0;\n          return false;\n        }\n        memoizedIsInitialized = 1;\n        return true;\n      }\n\n      public void writeTo(com.google.protobuf.CodedOutputStream output)\n                          throws java.io.IOException {\n        com.google.protobuf.GeneratedMessageV3\n          .ExtendableMessage<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet>.ExtensionWriter\n            extensionWriter = newExtensionWriter();\n        if (((bitField0_ & 0x00000001) == 0x00000001)) {\n          output.writeUInt64(1, numOfColumns_);\n        }\n        for (int i = 0; i < columns_.size(); i++) {\n       
   com.google.protobuf.GeneratedMessageV3.writeString(output, 2, columns_.getRaw(i));\n        }\n        for (int i = 0; i < types_.size(); i++) {\n          output.writeUInt32(3, types_.get(i));\n        }\n        for (int i = 0; i < rows_.size(); i++) {\n          output.writeMessage(4, rows_.get(i));\n        }\n        extensionWriter.writeUntil(536870912, output);\n        unknownFields.writeTo(output);\n      }\n\n      public int getSerializedSize() {\n        int size = memoizedSize;\n        if (size != -1) return size;\n\n        size = 0;\n        if (((bitField0_ & 0x00000001) == 0x00000001)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeUInt64Size(1, numOfColumns_);\n        }\n        {\n          int dataSize = 0;\n          for (int i = 0; i < columns_.size(); i++) {\n            dataSize += computeStringSizeNoTag(columns_.getRaw(i));\n          }\n          size += dataSize;\n          size += 1 * getColumnsList().size();\n        }\n        {\n          int dataSize = 0;\n          for (int i = 0; i < types_.size(); i++) {\n            dataSize += com.google.protobuf.CodedOutputStream\n              .computeUInt32SizeNoTag(types_.get(i));\n          }\n          size += dataSize;\n          size += 1 * getTypesList().size();\n        }\n        for (int i = 0; i < rows_.size(); i++) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(4, rows_.get(i));\n        }\n        size += extensionsSerializedSize();\n        size += unknownFields.getSerializedSize();\n        memoizedSize = size;\n        return size;\n      }\n\n      private static final long serialVersionUID = 0L;\n      @java.lang.Override\n      public boolean equals(final java.lang.Object obj) {\n        if (obj == this) {\n         return true;\n        }\n        if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet)) {\n          return super.equals(obj);\n        }\n        
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) obj;\n\n        boolean result = true;\n        result = result && (hasNumOfColumns() == other.hasNumOfColumns());\n        if (hasNumOfColumns()) {\n          result = result && (getNumOfColumns()\n              == other.getNumOfColumns());\n        }\n        result = result && getColumnsList()\n            .equals(other.getColumnsList());\n        result = result && getTypesList()\n            .equals(other.getTypesList());\n        result = result && getRowsList()\n            .equals(other.getRowsList());\n        result = result && unknownFields.equals(other.unknownFields);\n        result = result &&\n            getExtensionFields().equals(other.getExtensionFields());\n        return result;\n      }\n\n      @java.lang.Override\n      public int hashCode() {\n        if (memoizedHashCode != 0) {\n          return memoizedHashCode;\n        }\n        int hash = 41;\n        hash = (19 * hash) + getDescriptorForType().hashCode();\n        if (hasNumOfColumns()) {\n          hash = (37 * hash) + NUM_OF_COLUMNS_FIELD_NUMBER;\n          hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n              getNumOfColumns());\n        }\n        if (getColumnsCount() > 0) {\n          hash = (37 * hash) + COLUMNS_FIELD_NUMBER;\n          hash = (53 * hash) + getColumnsList().hashCode();\n        }\n        if (getTypesCount() > 0) {\n          hash = (37 * hash) + TYPES_FIELD_NUMBER;\n          hash = (53 * hash) + getTypesList().hashCode();\n        }\n        if (getRowsCount() > 0) {\n          hash = (37 * hash) + ROWS_FIELD_NUMBER;\n          hash = (53 * hash) + getRowsList().hashCode();\n        }\n        hash = hashFields(hash, getExtensionFields());\n        hash = (29 * hash) + unknownFields.hashCode();\n        memoizedHashCode = hash;\n        return hash;\n      }\n\n      public static 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet parseFrom(\n          com.google.protobuf.ByteString data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet parseFrom(\n          com.google.protobuf.ByteString data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet parseFrom(byte[] data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet parseFrom(\n          byte[] data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet parseFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet parseFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet parseDelimitedFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n 
           .parseDelimitedWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet parseDelimitedFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet parseFrom(\n          com.google.protobuf.CodedInputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet parseFrom(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n\n      public Builder newBuilderForType() { return newBuilder(); }\n      public static Builder newBuilder() {\n        return DEFAULT_INSTANCE.toBuilder();\n      }\n      public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet prototype) {\n        return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n      }\n      public Builder toBuilder() {\n        return this == DEFAULT_INSTANCE\n            ? 
new Builder() : new Builder().mergeFrom(this);\n      }\n\n      @java.lang.Override\n      protected Builder newBuilderForType(\n          com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n        Builder builder = new Builder(parent);\n        return builder;\n      }\n      /**\n       * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.DataSet}\n       */\n      public static final class Builder extends\n          com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet, Builder> implements\n          // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.DataSet)\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSetOrBuilder {\n        public static final com.google.protobuf.Descriptors.Descriptor\n            getDescriptor() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_descriptor;\n        }\n\n        protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n            internalGetFieldAccessorTable() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_fieldAccessorTable\n              .ensureFieldAccessorsInitialized(\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Builder.class);\n        }\n\n        // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.newBuilder()\n        private Builder() {\n          maybeForceBuilderInitialization();\n        }\n\n        private Builder(\n            com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n          super(parent);\n          maybeForceBuilderInitialization();\n        }\n        private void maybeForceBuilderInitialization() {\n          if (com.google.protobuf.GeneratedMessageV3\n                
  .alwaysUseFieldBuilders) {\n            getRowsFieldBuilder();\n          }\n        }\n        public Builder clear() {\n          super.clear();\n          numOfColumns_ = 0L;\n          bitField0_ = (bitField0_ & ~0x00000001);\n          columns_ = com.google.protobuf.LazyStringArrayList.EMPTY;\n          bitField0_ = (bitField0_ & ~0x00000002);\n          types_ = java.util.Collections.emptyList();\n          bitField0_ = (bitField0_ & ~0x00000004);\n          if (rowsBuilder_ == null) {\n            rows_ = java.util.Collections.emptyList();\n            bitField0_ = (bitField0_ & ~0x00000008);\n          } else {\n            rowsBuilder_.clear();\n          }\n          return this;\n        }\n\n        public com.google.protobuf.Descriptors.Descriptor\n            getDescriptorForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_descriptor;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet getDefaultInstanceForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.getDefaultInstance();\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet build() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet result = buildPartial();\n          if (!result.isInitialized()) {\n            throw newUninitializedMessageException(result);\n          }\n          return result;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet buildPartial() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet(this);\n          int from_bitField0_ = bitField0_;\n          int to_bitField0_ = 0;\n          if (((from_bitField0_ & 0x00000001) == 0x00000001)) {\n            to_bitField0_ |= 0x00000001;\n          }\n          result.numOfColumns_ = numOfColumns_;\n          
if (((bitField0_ & 0x00000002) == 0x00000002)) {\n            columns_ = columns_.getUnmodifiableView();\n            bitField0_ = (bitField0_ & ~0x00000002);\n          }\n          result.columns_ = columns_;\n          if (((bitField0_ & 0x00000004) == 0x00000004)) {\n            types_ = java.util.Collections.unmodifiableList(types_);\n            bitField0_ = (bitField0_ & ~0x00000004);\n          }\n          result.types_ = types_;\n          if (rowsBuilder_ == null) {\n            if (((bitField0_ & 0x00000008) == 0x00000008)) {\n              rows_ = java.util.Collections.unmodifiableList(rows_);\n              bitField0_ = (bitField0_ & ~0x00000008);\n            }\n            result.rows_ = rows_;\n          } else {\n            result.rows_ = rowsBuilder_.build();\n          }\n          result.bitField0_ = to_bitField0_;\n          onBuilt();\n          return result;\n        }\n\n        public Builder clone() {\n          return (Builder) super.clone();\n        }\n        public Builder setField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.setField(field, value);\n        }\n        public Builder clearField(\n            com.google.protobuf.Descriptors.FieldDescriptor field) {\n          return (Builder) super.clearField(field);\n        }\n        public Builder clearOneof(\n            com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n          return (Builder) super.clearOneof(oneof);\n        }\n        public Builder setRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            int index, Object value) {\n          return (Builder) super.setRepeatedField(field, index, value);\n        }\n        public Builder addRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.addRepeatedField(field, value);\n        }\n        
public <Type> Builder setExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet, Type> extension,\n            Type value) {\n          return (Builder) super.setExtension(extension, value);\n        }\n        public <Type> Builder setExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet, java.util.List<Type>> extension,\n            int index, Type value) {\n          return (Builder) super.setExtension(extension, index, value);\n        }\n        public <Type> Builder addExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet, java.util.List<Type>> extension,\n            Type value) {\n          return (Builder) super.addExtension(extension, value);\n        }\n        public <Type> Builder clearExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet, ?> extension) {\n          return (Builder) super.clearExtension(extension);\n        }\n        public Builder mergeFrom(com.google.protobuf.Message other) {\n          if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) {\n            return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet)other);\n          } else {\n            super.mergeFrom(other);\n            return this;\n          }\n        }\n\n        public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet other) {\n          if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.getDefaultInstance()) return this;\n          if (other.hasNumOfColumns()) {\n            setNumOfColumns(other.getNumOfColumns());\n          }\n          if (!other.columns_.isEmpty()) 
{\n            if (columns_.isEmpty()) {\n              columns_ = other.columns_;\n              bitField0_ = (bitField0_ & ~0x00000002);\n            } else {\n              ensureColumnsIsMutable();\n              columns_.addAll(other.columns_);\n            }\n            onChanged();\n          }\n          if (!other.types_.isEmpty()) {\n            if (types_.isEmpty()) {\n              types_ = other.types_;\n              bitField0_ = (bitField0_ & ~0x00000004);\n            } else {\n              ensureTypesIsMutable();\n              types_.addAll(other.types_);\n            }\n            onChanged();\n          }\n          if (rowsBuilder_ == null) {\n            if (!other.rows_.isEmpty()) {\n              if (rows_.isEmpty()) {\n                rows_ = other.rows_;\n                bitField0_ = (bitField0_ & ~0x00000008);\n              } else {\n                ensureRowsIsMutable();\n                rows_.addAll(other.rows_);\n              }\n              onChanged();\n            }\n          } else {\n            if (!other.rows_.isEmpty()) {\n              if (rowsBuilder_.isEmpty()) {\n                rowsBuilder_.dispose();\n                rowsBuilder_ = null;\n                rows_ = other.rows_;\n                bitField0_ = (bitField0_ & ~0x00000008);\n                rowsBuilder_ = \n                  com.google.protobuf.GeneratedMessageV3.alwaysUseFieldBuilders ?\n                     getRowsFieldBuilder() : null;\n              } else {\n                rowsBuilder_.addAllMessages(other.rows_);\n              }\n            }\n          }\n          this.mergeExtensionFields(other);\n          this.mergeUnknownFields(other.unknownFields);\n          onChanged();\n          return this;\n        }\n\n        public final boolean isInitialized() {\n          for (int i = 0; i < getRowsCount(); i++) {\n            if (!getRows(i).isInitialized()) {\n              return false;\n            }\n          }\n          if 
(!extensionsAreInitialized()) {\n            return false;\n          }\n          return true;\n        }\n\n        public Builder mergeFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet parsedMessage = null;\n          try {\n            parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n          } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n            parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) e.getUnfinishedMessage();\n            throw e.unwrapIOException();\n          } finally {\n            if (parsedMessage != null) {\n              mergeFrom(parsedMessage);\n            }\n          }\n          return this;\n        }\n        private int bitField0_;\n\n        private long numOfColumns_ ;\n        /**\n         * <code>optional uint64 num_of_columns = 1;</code>\n         */\n        public boolean hasNumOfColumns() {\n          return ((bitField0_ & 0x00000001) == 0x00000001);\n        }\n        /**\n         * <code>optional uint64 num_of_columns = 1;</code>\n         */\n        public long getNumOfColumns() {\n          return numOfColumns_;\n        }\n        /**\n         * <code>optional uint64 num_of_columns = 1;</code>\n         */\n        public Builder setNumOfColumns(long value) {\n          bitField0_ |= 0x00000001;\n          numOfColumns_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional uint64 num_of_columns = 1;</code>\n         */\n        public Builder clearNumOfColumns() {\n          bitField0_ = (bitField0_ & ~0x00000001);\n          numOfColumns_ = 0L;\n          onChanged();\n          return this;\n        }\n\n        private com.google.protobuf.LazyStringList columns_ = 
com.google.protobuf.LazyStringArrayList.EMPTY;\n        private void ensureColumnsIsMutable() {\n          if (!((bitField0_ & 0x00000002) == 0x00000002)) {\n            columns_ = new com.google.protobuf.LazyStringArrayList(columns_);\n            bitField0_ |= 0x00000002;\n           }\n        }\n        /**\n         * <code>repeated string columns = 2;</code>\n         */\n        public com.google.protobuf.ProtocolStringList\n            getColumnsList() {\n          return columns_.getUnmodifiableView();\n        }\n        /**\n         * <code>repeated string columns = 2;</code>\n         */\n        public int getColumnsCount() {\n          return columns_.size();\n        }\n        /**\n         * <code>repeated string columns = 2;</code>\n         */\n        public java.lang.String getColumns(int index) {\n          return columns_.get(index);\n        }\n        /**\n         * <code>repeated string columns = 2;</code>\n         */\n        public com.google.protobuf.ByteString\n            getColumnsBytes(int index) {\n          return columns_.getByteString(index);\n        }\n        /**\n         * <code>repeated string columns = 2;</code>\n         */\n        public Builder setColumns(\n            int index, java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  ensureColumnsIsMutable();\n          columns_.set(index, value);\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>repeated string columns = 2;</code>\n         */\n        public Builder addColumns(\n            java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  ensureColumnsIsMutable();\n          columns_.add(value);\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>repeated string columns = 2;</code>\n         */\n        public Builder addAllColumns(\n            java.lang.Iterable<java.lang.String> 
values) {\n          ensureColumnsIsMutable();\n          com.google.protobuf.AbstractMessageLite.Builder.addAll(\n              values, columns_);\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>repeated string columns = 2;</code>\n         */\n        public Builder clearColumns() {\n          columns_ = com.google.protobuf.LazyStringArrayList.EMPTY;\n          bitField0_ = (bitField0_ & ~0x00000002);\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>repeated string columns = 2;</code>\n         */\n        public Builder addColumnsBytes(\n            com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  ensureColumnsIsMutable();\n          columns_.add(value);\n          onChanged();\n          return this;\n        }\n\n        private java.util.List<java.lang.Integer> types_ = java.util.Collections.emptyList();\n        private void ensureTypesIsMutable() {\n          if (!((bitField0_ & 0x00000004) == 0x00000004)) {\n            types_ = new java.util.ArrayList<java.lang.Integer>(types_);\n            bitField0_ |= 0x00000004;\n           }\n        }\n        /**\n         * <code>repeated uint32 types = 3;</code>\n         */\n        public java.util.List<java.lang.Integer>\n            getTypesList() {\n          return java.util.Collections.unmodifiableList(types_);\n        }\n        /**\n         * <code>repeated uint32 types = 3;</code>\n         */\n        public int getTypesCount() {\n          return types_.size();\n        }\n        /**\n         * <code>repeated uint32 types = 3;</code>\n         */\n        public int getTypes(int index) {\n          return types_.get(index);\n        }\n        /**\n         * <code>repeated uint32 types = 3;</code>\n         */\n        public Builder setTypes(\n            int index, int value) {\n          ensureTypesIsMutable();\n          types_.set(index, 
value);\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>repeated uint32 types = 3;</code>\n         */\n        public Builder addTypes(int value) {\n          ensureTypesIsMutable();\n          types_.add(value);\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>repeated uint32 types = 3;</code>\n         */\n        public Builder addAllTypes(\n            java.lang.Iterable<? extends java.lang.Integer> values) {\n          ensureTypesIsMutable();\n          com.google.protobuf.AbstractMessageLite.Builder.addAll(\n              values, types_);\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>repeated uint32 types = 3;</code>\n         */\n        public Builder clearTypes() {\n          types_ = java.util.Collections.emptyList();\n          bitField0_ = (bitField0_ & ~0x00000004);\n          onChanged();\n          return this;\n        }\n\n        private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row> rows_ =\n          java.util.Collections.emptyList();\n        private void ensureRowsIsMutable() {\n          if (!((bitField0_ & 0x00000008) == 0x00000008)) {\n            rows_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row>(rows_);\n            bitField0_ |= 0x00000008;\n           }\n        }\n\n        private com.google.protobuf.RepeatedFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.RowOrBuilder> rowsBuilder_;\n\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row> getRowsList() {\n          if (rowsBuilder_ == null) {\n            return 
java.util.Collections.unmodifiableList(rows_);\n          } else {\n            return rowsBuilder_.getMessageList();\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public int getRowsCount() {\n          if (rowsBuilder_ == null) {\n            return rows_.size();\n          } else {\n            return rowsBuilder_.getCount();\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row getRows(int index) {\n          if (rowsBuilder_ == null) {\n            return rows_.get(index);\n          } else {\n            return rowsBuilder_.getMessage(index);\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public Builder setRows(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row value) {\n          if (rowsBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensureRowsIsMutable();\n            rows_.set(index, value);\n            onChanged();\n          } else {\n            rowsBuilder_.setMessage(index, value);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public Builder setRows(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.Builder builderForValue) {\n          if (rowsBuilder_ == null) {\n            ensureRowsIsMutable();\n            rows_.set(index, builderForValue.build());\n            onChanged();\n          } else {\n            rowsBuilder_.setMessage(index, builderForValue.build());\n          }\n          return this;\n   
     }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public Builder addRows(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row value) {\n          if (rowsBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensureRowsIsMutable();\n            rows_.add(value);\n            onChanged();\n          } else {\n            rowsBuilder_.addMessage(value);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public Builder addRows(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row value) {\n          if (rowsBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensureRowsIsMutable();\n            rows_.add(index, value);\n            onChanged();\n          } else {\n            rowsBuilder_.addMessage(index, value);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public Builder addRows(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.Builder builderForValue) {\n          if (rowsBuilder_ == null) {\n            ensureRowsIsMutable();\n            rows_.add(builderForValue.build());\n            onChanged();\n          } else {\n            rowsBuilder_.addMessage(builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public Builder addRows(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.Builder builderForValue) {\n          if (rowsBuilder_ == 
null) {\n            ensureRowsIsMutable();\n            rows_.add(index, builderForValue.build());\n            onChanged();\n          } else {\n            rowsBuilder_.addMessage(index, builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public Builder addAllRows(\n            java.lang.Iterable<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row> values) {\n          if (rowsBuilder_ == null) {\n            ensureRowsIsMutable();\n            com.google.protobuf.AbstractMessageLite.Builder.addAll(\n                values, rows_);\n            onChanged();\n          } else {\n            rowsBuilder_.addAllMessages(values);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public Builder clearRows() {\n          if (rowsBuilder_ == null) {\n            rows_ = java.util.Collections.emptyList();\n            bitField0_ = (bitField0_ & ~0x00000008);\n            onChanged();\n          } else {\n            rowsBuilder_.clear();\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public Builder removeRows(int index) {\n          if (rowsBuilder_ == null) {\n            ensureRowsIsMutable();\n            rows_.remove(index);\n            onChanged();\n          } else {\n            rowsBuilder_.remove(index);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.Builder getRowsBuilder(\n            int index) {\n          return 
getRowsFieldBuilder().getBuilder(index);\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.RowOrBuilder getRowsOrBuilder(\n            int index) {\n          if (rowsBuilder_ == null) {\n            return rows_.get(index);  } else {\n            return rowsBuilder_.getMessageOrBuilder(index);\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.RowOrBuilder> \n             getRowsOrBuilderList() {\n          if (rowsBuilder_ != null) {\n            return rowsBuilder_.getMessageOrBuilderList();\n          } else {\n            return java.util.Collections.unmodifiableList(rows_);\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.Builder addRowsBuilder() {\n          return getRowsFieldBuilder().addBuilder(\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.getDefaultInstance());\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.Builder addRowsBuilder(\n            int index) {\n          return getRowsFieldBuilder().addBuilder(\n              index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.getDefaultInstance());\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.DataSet.Row rows = 4;</code>\n         */\n        public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.Builder> \n             
getRowsBuilderList() {\n          return getRowsFieldBuilder().getBuilderList();\n        }\n        private com.google.protobuf.RepeatedFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.RowOrBuilder> \n            getRowsFieldBuilder() {\n          if (rowsBuilder_ == null) {\n            rowsBuilder_ = new com.google.protobuf.RepeatedFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Row.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.RowOrBuilder>(\n                    rows_,\n                    ((bitField0_ & 0x00000008) == 0x00000008),\n                    getParentForChildren(),\n                    isClean());\n            rows_ = null;\n          }\n          return rowsBuilder_;\n        }\n        public final Builder setUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.setUnknownFields(unknownFields);\n        }\n\n        public final Builder mergeUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.mergeUnknownFields(unknownFields);\n        }\n\n\n        // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.DataSet)\n      }\n\n      // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.DataSet)\n      private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet DEFAULT_INSTANCE;\n      static {\n        DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet();\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet getDefaultInstance() {\n        return DEFAULT_INSTANCE;\n      }\n\n      @java.lang.Deprecated public static 
final com.google.protobuf.Parser<DataSet>\n          PARSER = new com.google.protobuf.AbstractParser<DataSet>() {\n        public DataSet parsePartialFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n            return new DataSet(input, extensionRegistry);\n        }\n      };\n\n      public static com.google.protobuf.Parser<DataSet> parser() {\n        return PARSER;\n      }\n\n      @java.lang.Override\n      public com.google.protobuf.Parser<DataSet> getParserForType() {\n        return PARSER;\n      }\n\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet getDefaultInstanceForType() {\n        return DEFAULT_INSTANCE;\n      }\n\n    }\n\n    public interface PropertyValueOrBuilder extends\n        // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.PropertyValue)\n        com.google.protobuf.MessageOrBuilder {\n\n      /**\n       * <code>optional uint32 type = 1;</code>\n       */\n      boolean hasType();\n      /**\n       * <code>optional uint32 type = 1;</code>\n       */\n      int getType();\n\n      /**\n       * <code>optional bool is_null = 2;</code>\n       */\n      boolean hasIsNull();\n      /**\n       * <code>optional bool is_null = 2;</code>\n       */\n      boolean getIsNull();\n\n      /**\n       * <code>optional uint32 int_value = 3;</code>\n       */\n      boolean hasIntValue();\n      /**\n       * <code>optional uint32 int_value = 3;</code>\n       */\n      int getIntValue();\n\n      /**\n       * <code>optional uint64 long_value = 4;</code>\n       */\n      boolean hasLongValue();\n      /**\n       * <code>optional uint64 long_value = 4;</code>\n       */\n      long getLongValue();\n\n      /**\n       * <code>optional float float_value = 5;</code>\n       */\n      boolean hasFloatValue();\n      /**\n       * 
<code>optional float float_value = 5;</code>\n       */\n      float getFloatValue();\n\n      /**\n       * <code>optional double double_value = 6;</code>\n       */\n      boolean hasDoubleValue();\n      /**\n       * <code>optional double double_value = 6;</code>\n       */\n      double getDoubleValue();\n\n      /**\n       * <code>optional bool boolean_value = 7;</code>\n       */\n      boolean hasBooleanValue();\n      /**\n       * <code>optional bool boolean_value = 7;</code>\n       */\n      boolean getBooleanValue();\n\n      /**\n       * <code>optional string string_value = 8;</code>\n       */\n      boolean hasStringValue();\n      /**\n       * <code>optional string string_value = 8;</code>\n       */\n      java.lang.String getStringValue();\n      /**\n       * <code>optional string string_value = 8;</code>\n       */\n      com.google.protobuf.ByteString\n          getStringValueBytes();\n\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n       */\n      boolean hasPropertysetValue();\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet getPropertysetValue();\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder getPropertysetValueOrBuilder();\n\n      /**\n       * <pre>\n       * List of Property Values\n       * </pre>\n       *\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n       */\n      boolean hasPropertysetsValue();\n      /**\n       * <pre>\n       * List of Property Values\n       * </pre>\n       *\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n       */\n      
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList getPropertysetsValue();\n      /**\n       * <pre>\n       * List of Property Values\n       * </pre>\n       *\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetListOrBuilder getPropertysetsValueOrBuilder();\n\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n       */\n      boolean hasExtensionValue();\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension getExtensionValue();\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtensionOrBuilder getExtensionValueOrBuilder();\n\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.ValueCase getValueCase();\n    }\n    /**\n     * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.PropertyValue}\n     */\n    public  static final class PropertyValue extends\n        com.google.protobuf.GeneratedMessageV3 implements\n        // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.PropertyValue)\n        PropertyValueOrBuilder {\n      // Use PropertyValue.newBuilder() to construct.\n      private PropertyValue(com.google.protobuf.GeneratedMessageV3.Builder<?> builder) {\n        super(builder);\n      }\n      private PropertyValue() {\n        type_ = 0;\n        isNull_ = false;\n      }\n\n      @java.lang.Override\n      public final com.google.protobuf.UnknownFieldSet\n      getUnknownFields() {\n        return 
this.unknownFields;\n      }\n      private PropertyValue(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        this();\n        int mutable_bitField0_ = 0;\n        com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n            com.google.protobuf.UnknownFieldSet.newBuilder();\n        try {\n          boolean done = false;\n          while (!done) {\n            int tag = input.readTag();\n            switch (tag) {\n              case 0:\n                done = true;\n                break;\n              default: {\n                if (!parseUnknownField(input, unknownFields,\n                                       extensionRegistry, tag)) {\n                  done = true;\n                }\n                break;\n              }\n              case 8: {\n                bitField0_ |= 0x00000001;\n                type_ = input.readUInt32();\n                break;\n              }\n              case 16: {\n                bitField0_ |= 0x00000002;\n                isNull_ = input.readBool();\n                break;\n              }\n              case 24: {\n                valueCase_ = 3;\n                value_ = input.readUInt32();\n                break;\n              }\n              case 32: {\n                valueCase_ = 4;\n                value_ = input.readUInt64();\n                break;\n              }\n              case 45: {\n                valueCase_ = 5;\n                value_ = input.readFloat();\n                break;\n              }\n              case 49: {\n                valueCase_ = 6;\n                value_ = input.readDouble();\n                break;\n              }\n              case 56: {\n                valueCase_ = 7;\n                value_ = input.readBool();\n                break;\n              }\n              case 66: {\n                
com.google.protobuf.ByteString bs = input.readBytes();\n                valueCase_ = 8;\n                value_ = bs;\n                break;\n              }\n              case 74: {\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder subBuilder = null;\n                if (valueCase_ == 9) {\n                  subBuilder = ((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) value_).toBuilder();\n                }\n                value_ =\n                    input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.PARSER, extensionRegistry);\n                if (subBuilder != null) {\n                  subBuilder.mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) value_);\n                  value_ = subBuilder.buildPartial();\n                }\n                valueCase_ = 9;\n                break;\n              }\n              case 82: {\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.Builder subBuilder = null;\n                if (valueCase_ == 10) {\n                  subBuilder = ((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) value_).toBuilder();\n                }\n                value_ =\n                    input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.PARSER, extensionRegistry);\n                if (subBuilder != null) {\n                  subBuilder.mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) value_);\n                  value_ = subBuilder.buildPartial();\n                }\n                valueCase_ = 10;\n                break;\n              }\n              case 90: {\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.Builder subBuilder = null;\n                if (valueCase_ == 11) {\n                  subBuilder = 
((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) value_).toBuilder();\n                }\n                value_ =\n                    input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.PARSER, extensionRegistry);\n                if (subBuilder != null) {\n                  subBuilder.mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) value_);\n                  value_ = subBuilder.buildPartial();\n                }\n                valueCase_ = 11;\n                break;\n              }\n            }\n          }\n        } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n          throw e.setUnfinishedMessage(this);\n        } catch (java.io.IOException e) {\n          throw new com.google.protobuf.InvalidProtocolBufferException(\n              e).setUnfinishedMessage(this);\n        } finally {\n          this.unknownFields = unknownFields.build();\n          makeExtensionsImmutable();\n        }\n      }\n      public static final com.google.protobuf.Descriptors.Descriptor\n          getDescriptor() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_descriptor;\n      }\n\n      protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n          internalGetFieldAccessorTable() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_fieldAccessorTable\n            .ensureFieldAccessorsInitialized(\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.Builder.class);\n      }\n\n      public interface PropertyValueExtensionOrBuilder extends\n          // 
@@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension)\n          com.google.protobuf.GeneratedMessageV3.\n              ExtendableMessageOrBuilder<PropertyValueExtension> {\n      }\n      /**\n       * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension}\n       */\n      public  static final class PropertyValueExtension extends\n          com.google.protobuf.GeneratedMessageV3.ExtendableMessage<\n            PropertyValueExtension> implements\n          // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension)\n          PropertyValueExtensionOrBuilder {\n        // Use PropertyValueExtension.newBuilder() to construct.\n        private PropertyValueExtension(com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension, ?> builder) {\n          super(builder);\n        }\n        private PropertyValueExtension() {\n        }\n\n        @java.lang.Override\n        public final com.google.protobuf.UnknownFieldSet\n        getUnknownFields() {\n          return this.unknownFields;\n        }\n        private PropertyValueExtension(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          this();\n          com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n              com.google.protobuf.UnknownFieldSet.newBuilder();\n          try {\n            boolean done = false;\n            while (!done) {\n              int tag = input.readTag();\n              switch (tag) {\n                case 0:\n                  done = true;\n                  break;\n                default: {\n                  if (!parseUnknownField(input, unknownFields,\n                    
                     extensionRegistry, tag)) {\n                    done = true;\n                  }\n                  break;\n                }\n              }\n            }\n          } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n            throw e.setUnfinishedMessage(this);\n          } catch (java.io.IOException e) {\n            throw new com.google.protobuf.InvalidProtocolBufferException(\n                e).setUnfinishedMessage(this);\n          } finally {\n            this.unknownFields = unknownFields.build();\n            makeExtensionsImmutable();\n          }\n        }\n        public static final com.google.protobuf.Descriptors.Descriptor\n            getDescriptor() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_descriptor;\n        }\n\n        protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n            internalGetFieldAccessorTable() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_fieldAccessorTable\n              .ensureFieldAccessorsInitialized(\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.Builder.class);\n        }\n\n        private byte memoizedIsInitialized = -1;\n        public final boolean isInitialized() {\n          byte isInitialized = memoizedIsInitialized;\n          if (isInitialized == 1) return true;\n          if (isInitialized == 0) return false;\n\n          if (!extensionsAreInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n          memoizedIsInitialized = 1;\n          return true;\n        }\n\n        public void writeTo(com.google.protobuf.CodedOutputStream output)\n           
                 throws java.io.IOException {\n          com.google.protobuf.GeneratedMessageV3\n            .ExtendableMessage<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension>.ExtensionWriter\n              extensionWriter = newExtensionWriter();\n          extensionWriter.writeUntil(536870912, output);\n          unknownFields.writeTo(output);\n        }\n\n        public int getSerializedSize() {\n          int size = memoizedSize;\n          if (size != -1) return size;\n\n          size = 0;\n          size += extensionsSerializedSize();\n          size += unknownFields.getSerializedSize();\n          memoizedSize = size;\n          return size;\n        }\n\n        private static final long serialVersionUID = 0L;\n        @java.lang.Override\n        public boolean equals(final java.lang.Object obj) {\n          if (obj == this) {\n           return true;\n          }\n          if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension)) {\n            return super.equals(obj);\n          }\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) obj;\n\n          boolean result = true;\n          result = result && unknownFields.equals(other.unknownFields);\n          result = result &&\n              getExtensionFields().equals(other.getExtensionFields());\n          return result;\n        }\n\n        @java.lang.Override\n        public int hashCode() {\n          if (memoizedHashCode != 0) {\n            return memoizedHashCode;\n          }\n          int hash = 41;\n          hash = (19 * hash) + getDescriptorForType().hashCode();\n          hash = hashFields(hash, getExtensionFields());\n          hash = (29 * hash) + unknownFields.hashCode();\n          memoizedHashCode = hash;\n          return hash;\n        }\n\n        
public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension parseFrom(\n            com.google.protobuf.ByteString data)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension parseFrom(\n            com.google.protobuf.ByteString data,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension parseFrom(byte[] data)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension parseFrom(\n            byte[] data,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension parseFrom(java.io.InputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension parseFrom(\n            java.io.InputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input, 
extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension parseDelimitedFrom(java.io.InputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseDelimitedWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension parseDelimitedFrom(\n            java.io.InputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension parseFrom(\n            com.google.protobuf.CodedInputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension parseFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input, extensionRegistry);\n        }\n\n        public Builder newBuilderForType() { return newBuilder(); }\n        public static Builder newBuilder() {\n          return DEFAULT_INSTANCE.toBuilder();\n        }\n        public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension prototype) {\n          return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n        }\n        public Builder 
toBuilder() {\n          return this == DEFAULT_INSTANCE\n              ? new Builder() : new Builder().mergeFrom(this);\n        }\n\n        @java.lang.Override\n        protected Builder newBuilderForType(\n            com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n          Builder builder = new Builder(parent);\n          return builder;\n        }\n        /**\n         * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension}\n         */\n        public static final class Builder extends\n            com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension, Builder> implements\n            // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension)\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtensionOrBuilder {\n          public static final com.google.protobuf.Descriptors.Descriptor\n              getDescriptor() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_descriptor;\n          }\n\n          protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n              internalGetFieldAccessorTable() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_fieldAccessorTable\n                .ensureFieldAccessorsInitialized(\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.Builder.class);\n          }\n\n          // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.newBuilder()\n          
private Builder() {\n            maybeForceBuilderInitialization();\n          }\n\n          private Builder(\n              com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n            super(parent);\n            maybeForceBuilderInitialization();\n          }\n          private void maybeForceBuilderInitialization() {\n            if (com.google.protobuf.GeneratedMessageV3\n                    .alwaysUseFieldBuilders) {\n            }\n          }\n          public Builder clear() {\n            super.clear();\n            return this;\n          }\n\n          public com.google.protobuf.Descriptors.Descriptor\n              getDescriptorForType() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_descriptor;\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension getDefaultInstanceForType() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.getDefaultInstance();\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension build() {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension result = buildPartial();\n            if (!result.isInitialized()) {\n              throw newUninitializedMessageException(result);\n            }\n            return result;\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension buildPartial() {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension(this);\n            onBuilt();\n            return result;\n          }\n\n          public Builder clone() {\n            return (Builder) 
super.clone();\n          }\n          public Builder setField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              Object value) {\n            return (Builder) super.setField(field, value);\n          }\n          public Builder clearField(\n              com.google.protobuf.Descriptors.FieldDescriptor field) {\n            return (Builder) super.clearField(field);\n          }\n          public Builder clearOneof(\n              com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n            return (Builder) super.clearOneof(oneof);\n          }\n          public Builder setRepeatedField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              int index, Object value) {\n            return (Builder) super.setRepeatedField(field, index, value);\n          }\n          public Builder addRepeatedField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              Object value) {\n            return (Builder) super.addRepeatedField(field, value);\n          }\n          public <Type> Builder setExtension(\n              com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension, Type> extension,\n              Type value) {\n            return (Builder) super.setExtension(extension, value);\n          }\n          public <Type> Builder setExtension(\n              com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension, java.util.List<Type>> extension,\n              int index, Type value) {\n            return (Builder) super.setExtension(extension, index, value);\n          }\n          public <Type> Builder addExtension(\n              com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                  
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension, java.util.List<Type>> extension,\n              Type value) {\n            return (Builder) super.addExtension(extension, value);\n          }\n          public <Type> Builder clearExtension(\n              com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension, ?> extension) {\n            return (Builder) super.clearExtension(extension);\n          }\n          public Builder mergeFrom(com.google.protobuf.Message other) {\n            if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) {\n              return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension)other);\n            } else {\n              super.mergeFrom(other);\n              return this;\n            }\n          }\n\n          public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension other) {\n            if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.getDefaultInstance()) return this;\n            this.mergeExtensionFields(other);\n            this.mergeUnknownFields(other.unknownFields);\n            onChanged();\n            return this;\n          }\n\n          public final boolean isInitialized() {\n            if (!extensionsAreInitialized()) {\n              return false;\n            }\n            return true;\n          }\n\n          public Builder mergeFrom(\n              com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws java.io.IOException {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension parsedMessage = null;\n            try {\n           
   parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n            } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n              parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) e.getUnfinishedMessage();\n              throw e.unwrapIOException();\n            } finally {\n              if (parsedMessage != null) {\n                mergeFrom(parsedMessage);\n              }\n            }\n            return this;\n          }\n          public final Builder setUnknownFields(\n              final com.google.protobuf.UnknownFieldSet unknownFields) {\n            return super.setUnknownFields(unknownFields);\n          }\n\n          public final Builder mergeUnknownFields(\n              final com.google.protobuf.UnknownFieldSet unknownFields) {\n            return super.mergeUnknownFields(unknownFields);\n          }\n\n\n          // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension)\n        }\n\n        // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension)\n        private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension DEFAULT_INSTANCE;\n        static {\n          DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension();\n        }\n\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension getDefaultInstance() {\n          return DEFAULT_INSTANCE;\n        }\n\n        @java.lang.Deprecated public static final com.google.protobuf.Parser<PropertyValueExtension>\n            PARSER = new com.google.protobuf.AbstractParser<PropertyValueExtension>() {\n          public PropertyValueExtension parsePartialFrom(\n              com.google.protobuf.CodedInputStream input,\n              
com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n              return new PropertyValueExtension(input, extensionRegistry);\n          }\n        };\n\n        public static com.google.protobuf.Parser<PropertyValueExtension> parser() {\n          return PARSER;\n        }\n\n        @java.lang.Override\n        public com.google.protobuf.Parser<PropertyValueExtension> getParserForType() {\n          return PARSER;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension getDefaultInstanceForType() {\n          return DEFAULT_INSTANCE;\n        }\n\n      }\n\n      private int bitField0_;\n      private int valueCase_ = 0;\n      private java.lang.Object value_;\n      public enum ValueCase\n          implements com.google.protobuf.Internal.EnumLite {\n        INT_VALUE(3),\n        LONG_VALUE(4),\n        FLOAT_VALUE(5),\n        DOUBLE_VALUE(6),\n        BOOLEAN_VALUE(7),\n        STRING_VALUE(8),\n        PROPERTYSET_VALUE(9),\n        PROPERTYSETS_VALUE(10),\n        EXTENSION_VALUE(11),\n        VALUE_NOT_SET(0);\n        private final int value;\n        private ValueCase(int value) {\n          this.value = value;\n        }\n        /**\n         * @deprecated Use {@link #forNumber(int)} instead.\n         */\n        @java.lang.Deprecated\n        public static ValueCase valueOf(int value) {\n          return forNumber(value);\n        }\n\n        public static ValueCase forNumber(int value) {\n          switch (value) {\n            case 3: return INT_VALUE;\n            case 4: return LONG_VALUE;\n            case 5: return FLOAT_VALUE;\n            case 6: return DOUBLE_VALUE;\n            case 7: return BOOLEAN_VALUE;\n            case 8: return STRING_VALUE;\n            case 9: return PROPERTYSET_VALUE;\n            case 10: return PROPERTYSETS_VALUE;\n            case 11: return EXTENSION_VALUE;\n        
    case 0: return VALUE_NOT_SET;\n            default: return null;\n          }\n        }\n        public int getNumber() {\n          return this.value;\n        }\n      };\n\n      public ValueCase\n      getValueCase() {\n        return ValueCase.forNumber(\n            valueCase_);\n      }\n\n      public static final int TYPE_FIELD_NUMBER = 1;\n      private int type_;\n      /**\n       * <code>optional uint32 type = 1;</code>\n       */\n      public boolean hasType() {\n        return ((bitField0_ & 0x00000001) == 0x00000001);\n      }\n      /**\n       * <code>optional uint32 type = 1;</code>\n       */\n      public int getType() {\n        return type_;\n      }\n\n      public static final int IS_NULL_FIELD_NUMBER = 2;\n      private boolean isNull_;\n      /**\n       * <code>optional bool is_null = 2;</code>\n       */\n      public boolean hasIsNull() {\n        return ((bitField0_ & 0x00000002) == 0x00000002);\n      }\n      /**\n       * <code>optional bool is_null = 2;</code>\n       */\n      public boolean getIsNull() {\n        return isNull_;\n      }\n\n      public static final int INT_VALUE_FIELD_NUMBER = 3;\n      /**\n       * <code>optional uint32 int_value = 3;</code>\n       */\n      public boolean hasIntValue() {\n        return valueCase_ == 3;\n      }\n      /**\n       * <code>optional uint32 int_value = 3;</code>\n       */\n      public int getIntValue() {\n        if (valueCase_ == 3) {\n          return (java.lang.Integer) value_;\n        }\n        return 0;\n      }\n\n      public static final int LONG_VALUE_FIELD_NUMBER = 4;\n      /**\n       * <code>optional uint64 long_value = 4;</code>\n       */\n      public boolean hasLongValue() {\n        return valueCase_ == 4;\n      }\n      /**\n       * <code>optional uint64 long_value = 4;</code>\n       */\n      public long getLongValue() {\n        if (valueCase_ == 4) {\n          return (java.lang.Long) value_;\n        }\n        return 0L;\n      }\n\n      
public static final int FLOAT_VALUE_FIELD_NUMBER = 5;\n      /**\n       * <code>optional float float_value = 5;</code>\n       */\n      public boolean hasFloatValue() {\n        return valueCase_ == 5;\n      }\n      /**\n       * <code>optional float float_value = 5;</code>\n       */\n      public float getFloatValue() {\n        if (valueCase_ == 5) {\n          return (java.lang.Float) value_;\n        }\n        return 0F;\n      }\n\n      public static final int DOUBLE_VALUE_FIELD_NUMBER = 6;\n      /**\n       * <code>optional double double_value = 6;</code>\n       */\n      public boolean hasDoubleValue() {\n        return valueCase_ == 6;\n      }\n      /**\n       * <code>optional double double_value = 6;</code>\n       */\n      public double getDoubleValue() {\n        if (valueCase_ == 6) {\n          return (java.lang.Double) value_;\n        }\n        return 0D;\n      }\n\n      public static final int BOOLEAN_VALUE_FIELD_NUMBER = 7;\n      /**\n       * <code>optional bool boolean_value = 7;</code>\n       */\n      public boolean hasBooleanValue() {\n        return valueCase_ == 7;\n      }\n      /**\n       * <code>optional bool boolean_value = 7;</code>\n       */\n      public boolean getBooleanValue() {\n        if (valueCase_ == 7) {\n          return (java.lang.Boolean) value_;\n        }\n        return false;\n      }\n\n      public static final int STRING_VALUE_FIELD_NUMBER = 8;\n      /**\n       * <code>optional string string_value = 8;</code>\n       */\n      public boolean hasStringValue() {\n        return valueCase_ == 8;\n      }\n      /**\n       * <code>optional string string_value = 8;</code>\n       */\n      public java.lang.String getStringValue() {\n        java.lang.Object ref = \"\";\n        if (valueCase_ == 8) {\n          ref = value_;\n        }\n        if (ref instanceof java.lang.String) {\n          return (java.lang.String) ref;\n        } else {\n          com.google.protobuf.ByteString bs = \n        
      (com.google.protobuf.ByteString) ref;\n          java.lang.String s = bs.toStringUtf8();\n          if (bs.isValidUtf8() && (valueCase_ == 8)) {\n            value_ = s;\n          }\n          return s;\n        }\n      }\n      /**\n       * <code>optional string string_value = 8;</code>\n       */\n      public com.google.protobuf.ByteString\n          getStringValueBytes() {\n        java.lang.Object ref = \"\";\n        if (valueCase_ == 8) {\n          ref = value_;\n        }\n        if (ref instanceof java.lang.String) {\n          com.google.protobuf.ByteString b = \n              com.google.protobuf.ByteString.copyFromUtf8(\n                  (java.lang.String) ref);\n          if (valueCase_ == 8) {\n            value_ = b;\n          }\n          return b;\n        } else {\n          return (com.google.protobuf.ByteString) ref;\n        }\n      }\n\n      public static final int PROPERTYSET_VALUE_FIELD_NUMBER = 9;\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n       */\n      public boolean hasPropertysetValue() {\n        return valueCase_ == 9;\n      }\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet getPropertysetValue() {\n        if (valueCase_ == 9) {\n           return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) value_;\n        }\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance();\n      }\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder getPropertysetValueOrBuilder() {\n        if (valueCase_ == 9) {\n           return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) value_;\n     
   }\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance();\n      }\n\n      public static final int PROPERTYSETS_VALUE_FIELD_NUMBER = 10;\n      /**\n       * <pre>\n       * List of Property Values\n       * </pre>\n       *\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n       */\n      public boolean hasPropertysetsValue() {\n        return valueCase_ == 10;\n      }\n      /**\n       * <pre>\n       * List of Property Values\n       * </pre>\n       *\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList getPropertysetsValue() {\n        if (valueCase_ == 10) {\n           return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) value_;\n        }\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.getDefaultInstance();\n      }\n      /**\n       * <pre>\n       * List of Property Values\n       * </pre>\n       *\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetListOrBuilder getPropertysetsValueOrBuilder() {\n        if (valueCase_ == 10) {\n           return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) value_;\n        }\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.getDefaultInstance();\n      }\n\n      public static final int EXTENSION_VALUE_FIELD_NUMBER = 11;\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n       */\n      public boolean hasExtensionValue() {\n        return valueCase_ == 11;\n      }\n      /**\n       * <code>optional 
.org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension getExtensionValue() {\n        if (valueCase_ == 11) {\n           return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) value_;\n        }\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.getDefaultInstance();\n      }\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtensionOrBuilder getExtensionValueOrBuilder() {\n        if (valueCase_ == 11) {\n           return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) value_;\n        }\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.getDefaultInstance();\n      }\n\n      private byte memoizedIsInitialized = -1;\n      public final boolean isInitialized() {\n        byte isInitialized = memoizedIsInitialized;\n        if (isInitialized == 1) return true;\n        if (isInitialized == 0) return false;\n\n        if (hasPropertysetValue()) {\n          if (!getPropertysetValue().isInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        if (hasPropertysetsValue()) {\n          if (!getPropertysetsValue().isInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        if (hasExtensionValue()) {\n          if (!getExtensionValue().isInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        memoizedIsInitialized = 1;\n        return true;\n      }\n\n      public void 
writeTo(com.google.protobuf.CodedOutputStream output)\n                          throws java.io.IOException {\n        if (((bitField0_ & 0x00000001) == 0x00000001)) {\n          output.writeUInt32(1, type_);\n        }\n        if (((bitField0_ & 0x00000002) == 0x00000002)) {\n          output.writeBool(2, isNull_);\n        }\n        if (valueCase_ == 3) {\n          output.writeUInt32(\n              3, (int)((java.lang.Integer) value_));\n        }\n        if (valueCase_ == 4) {\n          output.writeUInt64(\n              4, (long)((java.lang.Long) value_));\n        }\n        if (valueCase_ == 5) {\n          output.writeFloat(\n              5, (float)((java.lang.Float) value_));\n        }\n        if (valueCase_ == 6) {\n          output.writeDouble(\n              6, (double)((java.lang.Double) value_));\n        }\n        if (valueCase_ == 7) {\n          output.writeBool(\n              7, (boolean)((java.lang.Boolean) value_));\n        }\n        if (valueCase_ == 8) {\n          com.google.protobuf.GeneratedMessageV3.writeString(output, 8, value_);\n        }\n        if (valueCase_ == 9) {\n          output.writeMessage(9, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) value_);\n        }\n        if (valueCase_ == 10) {\n          output.writeMessage(10, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) value_);\n        }\n        if (valueCase_ == 11) {\n          output.writeMessage(11, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) value_);\n        }\n        unknownFields.writeTo(output);\n      }\n\n      public int getSerializedSize() {\n        int size = memoizedSize;\n        if (size != -1) return size;\n\n        size = 0;\n        if (((bitField0_ & 0x00000001) == 0x00000001)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeUInt32Size(1, type_);\n        }\n        if (((bitField0_ & 0x00000002) == 0x00000002)) {\n         
 size += com.google.protobuf.CodedOutputStream\n            .computeBoolSize(2, isNull_);\n        }\n        if (valueCase_ == 3) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeUInt32Size(\n                3, (int)((java.lang.Integer) value_));\n        }\n        if (valueCase_ == 4) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeUInt64Size(\n                4, (long)((java.lang.Long) value_));\n        }\n        if (valueCase_ == 5) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeFloatSize(\n                5, (float)((java.lang.Float) value_));\n        }\n        if (valueCase_ == 6) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeDoubleSize(\n                6, (double)((java.lang.Double) value_));\n        }\n        if (valueCase_ == 7) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeBoolSize(\n                7, (boolean)((java.lang.Boolean) value_));\n        }\n        if (valueCase_ == 8) {\n          size += com.google.protobuf.GeneratedMessageV3.computeStringSize(8, value_);\n        }\n        if (valueCase_ == 9) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(9, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) value_);\n        }\n        if (valueCase_ == 10) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(10, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) value_);\n        }\n        if (valueCase_ == 11) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(11, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) value_);\n        }\n        size += unknownFields.getSerializedSize();\n        memoizedSize = size;\n        return size;\n      }\n\n      private static final long 
serialVersionUID = 0L;\n      @java.lang.Override\n      public boolean equals(final java.lang.Object obj) {\n        if (obj == this) {\n         return true;\n        }\n        if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue)) {\n          return super.equals(obj);\n        }\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue) obj;\n\n        boolean result = true;\n        result = result && (hasType() == other.hasType());\n        if (hasType()) {\n          result = result && (getType()\n              == other.getType());\n        }\n        result = result && (hasIsNull() == other.hasIsNull());\n        if (hasIsNull()) {\n          result = result && (getIsNull()\n              == other.getIsNull());\n        }\n        result = result && getValueCase().equals(\n            other.getValueCase());\n        if (!result) return false;\n        switch (valueCase_) {\n          case 3:\n            result = result && (getIntValue()\n                == other.getIntValue());\n            break;\n          case 4:\n            result = result && (getLongValue()\n                == other.getLongValue());\n            break;\n          case 5:\n            result = result && (\n                java.lang.Float.floatToIntBits(getFloatValue())\n                == java.lang.Float.floatToIntBits(\n                    other.getFloatValue()));\n            break;\n          case 6:\n            result = result && (\n                java.lang.Double.doubleToLongBits(getDoubleValue())\n                == java.lang.Double.doubleToLongBits(\n                    other.getDoubleValue()));\n            break;\n          case 7:\n            result = result && (getBooleanValue()\n                == other.getBooleanValue());\n            break;\n          case 8:\n            result = result && getStringValue()\n                
.equals(other.getStringValue());\n            break;\n          case 9:\n            result = result && getPropertysetValue()\n                .equals(other.getPropertysetValue());\n            break;\n          case 10:\n            result = result && getPropertysetsValue()\n                .equals(other.getPropertysetsValue());\n            break;\n          case 11:\n            result = result && getExtensionValue()\n                .equals(other.getExtensionValue());\n            break;\n          case 0:\n          default:\n        }\n        result = result && unknownFields.equals(other.unknownFields);\n        return result;\n      }\n\n      @java.lang.Override\n      public int hashCode() {\n        if (memoizedHashCode != 0) {\n          return memoizedHashCode;\n        }\n        int hash = 41;\n        hash = (19 * hash) + getDescriptorForType().hashCode();\n        if (hasType()) {\n          hash = (37 * hash) + TYPE_FIELD_NUMBER;\n          hash = (53 * hash) + getType();\n        }\n        if (hasIsNull()) {\n          hash = (37 * hash) + IS_NULL_FIELD_NUMBER;\n          hash = (53 * hash) + com.google.protobuf.Internal.hashBoolean(\n              getIsNull());\n        }\n        switch (valueCase_) {\n          case 3:\n            hash = (37 * hash) + INT_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + getIntValue();\n            break;\n          case 4:\n            hash = (37 * hash) + LONG_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n                getLongValue());\n            break;\n          case 5:\n            hash = (37 * hash) + FLOAT_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + java.lang.Float.floatToIntBits(\n                getFloatValue());\n            break;\n          case 6:\n            hash = (37 * hash) + DOUBLE_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n                
java.lang.Double.doubleToLongBits(getDoubleValue()));\n            break;\n          case 7:\n            hash = (37 * hash) + BOOLEAN_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + com.google.protobuf.Internal.hashBoolean(\n                getBooleanValue());\n            break;\n          case 8:\n            hash = (37 * hash) + STRING_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + getStringValue().hashCode();\n            break;\n          case 9:\n            hash = (37 * hash) + PROPERTYSET_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + getPropertysetValue().hashCode();\n            break;\n          case 10:\n            hash = (37 * hash) + PROPERTYSETS_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + getPropertysetsValue().hashCode();\n            break;\n          case 11:\n            hash = (37 * hash) + EXTENSION_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + getExtensionValue().hashCode();\n            break;\n          case 0:\n          default:\n        }\n        hash = (29 * hash) + unknownFields.hashCode();\n        memoizedHashCode = hash;\n        return hash;\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue parseFrom(\n          com.google.protobuf.ByteString data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue parseFrom(\n          com.google.protobuf.ByteString data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue parseFrom(byte[] data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      
public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue parseFrom(\n          byte[] data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue parseFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue parseFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue parseDelimitedFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue parseDelimitedFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue parseFrom(\n          com.google.protobuf.CodedInputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public 
static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue parseFrom(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n\n      public Builder newBuilderForType() { return newBuilder(); }\n      public static Builder newBuilder() {\n        return DEFAULT_INSTANCE.toBuilder();\n      }\n      public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue prototype) {\n        return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n      }\n      public Builder toBuilder() {\n        return this == DEFAULT_INSTANCE\n            ? new Builder() : new Builder().mergeFrom(this);\n      }\n\n      @java.lang.Override\n      protected Builder newBuilderForType(\n          com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n        Builder builder = new Builder(parent);\n        return builder;\n      }\n      /**\n       * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.PropertyValue}\n       */\n      public static final class Builder extends\n          com.google.protobuf.GeneratedMessageV3.Builder<Builder> implements\n          // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.PropertyValue)\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValueOrBuilder {\n        public static final com.google.protobuf.Descriptors.Descriptor\n            getDescriptor() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_descriptor;\n        }\n\n        protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n            internalGetFieldAccessorTable() {\n          return 
org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_fieldAccessorTable\n              .ensureFieldAccessorsInitialized(\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.Builder.class);\n        }\n\n        // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.newBuilder()\n        private Builder() {\n          maybeForceBuilderInitialization();\n        }\n\n        private Builder(\n            com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n          super(parent);\n          maybeForceBuilderInitialization();\n        }\n        private void maybeForceBuilderInitialization() {\n          if (com.google.protobuf.GeneratedMessageV3\n                  .alwaysUseFieldBuilders) {\n          }\n        }\n        public Builder clear() {\n          super.clear();\n          type_ = 0;\n          bitField0_ = (bitField0_ & ~0x00000001);\n          isNull_ = false;\n          bitField0_ = (bitField0_ & ~0x00000002);\n          valueCase_ = 0;\n          value_ = null;\n          return this;\n        }\n\n        public com.google.protobuf.Descriptors.Descriptor\n            getDescriptorForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_descriptor;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue getDefaultInstanceForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.getDefaultInstance();\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue build() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue result = buildPartial();\n          if (!result.isInitialized()) {\n            throw newUninitializedMessageException(result);\n          
}\n          return result;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue buildPartial() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue(this);\n          int from_bitField0_ = bitField0_;\n          int to_bitField0_ = 0;\n          if (((from_bitField0_ & 0x00000001) == 0x00000001)) {\n            to_bitField0_ |= 0x00000001;\n          }\n          result.type_ = type_;\n          if (((from_bitField0_ & 0x00000002) == 0x00000002)) {\n            to_bitField0_ |= 0x00000002;\n          }\n          result.isNull_ = isNull_;\n          if (valueCase_ == 3) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 4) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 5) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 6) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 7) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 8) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 9) {\n            if (propertysetValueBuilder_ == null) {\n              result.value_ = value_;\n            } else {\n              result.value_ = propertysetValueBuilder_.build();\n            }\n          }\n          if (valueCase_ == 10) {\n            if (propertysetsValueBuilder_ == null) {\n              result.value_ = value_;\n            } else {\n              result.value_ = propertysetsValueBuilder_.build();\n            }\n          }\n          if (valueCase_ == 11) {\n            if (extensionValueBuilder_ == null) {\n              result.value_ = value_;\n            } else {\n              result.value_ = extensionValueBuilder_.build();\n            }\n          }\n          result.bitField0_ = to_bitField0_;\n          result.valueCase_ = 
valueCase_;\n          onBuilt();\n          return result;\n        }\n\n        public Builder clone() {\n          return (Builder) super.clone();\n        }\n        public Builder setField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.setField(field, value);\n        }\n        public Builder clearField(\n            com.google.protobuf.Descriptors.FieldDescriptor field) {\n          return (Builder) super.clearField(field);\n        }\n        public Builder clearOneof(\n            com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n          return (Builder) super.clearOneof(oneof);\n        }\n        public Builder setRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            int index, Object value) {\n          return (Builder) super.setRepeatedField(field, index, value);\n        }\n        public Builder addRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.addRepeatedField(field, value);\n        }\n        public Builder mergeFrom(com.google.protobuf.Message other) {\n          if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue) {\n            return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue)other);\n          } else {\n            super.mergeFrom(other);\n            return this;\n          }\n        }\n\n        public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue other) {\n          if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.getDefaultInstance()) return this;\n          if (other.hasType()) {\n            setType(other.getType());\n          }\n          if (other.hasIsNull()) {\n            setIsNull(other.getIsNull());\n          }\n          switch (other.getValueCase()) {\n            
case INT_VALUE: {\n              setIntValue(other.getIntValue());\n              break;\n            }\n            case LONG_VALUE: {\n              setLongValue(other.getLongValue());\n              break;\n            }\n            case FLOAT_VALUE: {\n              setFloatValue(other.getFloatValue());\n              break;\n            }\n            case DOUBLE_VALUE: {\n              setDoubleValue(other.getDoubleValue());\n              break;\n            }\n            case BOOLEAN_VALUE: {\n              setBooleanValue(other.getBooleanValue());\n              break;\n            }\n            case STRING_VALUE: {\n              valueCase_ = 8;\n              value_ = other.value_;\n              onChanged();\n              break;\n            }\n            case PROPERTYSET_VALUE: {\n              mergePropertysetValue(other.getPropertysetValue());\n              break;\n            }\n            case PROPERTYSETS_VALUE: {\n              mergePropertysetsValue(other.getPropertysetsValue());\n              break;\n            }\n            case EXTENSION_VALUE: {\n              mergeExtensionValue(other.getExtensionValue());\n              break;\n            }\n            case VALUE_NOT_SET: {\n              break;\n            }\n          }\n          this.mergeUnknownFields(other.unknownFields);\n          onChanged();\n          return this;\n        }\n\n        public final boolean isInitialized() {\n          if (hasPropertysetValue()) {\n            if (!getPropertysetValue().isInitialized()) {\n              return false;\n            }\n          }\n          if (hasPropertysetsValue()) {\n            if (!getPropertysetsValue().isInitialized()) {\n              return false;\n            }\n          }\n          if (hasExtensionValue()) {\n            if (!getExtensionValue().isInitialized()) {\n              return false;\n            }\n          }\n          return true;\n        }\n\n        public Builder mergeFrom(\n            
com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue parsedMessage = null;\n          try {\n            parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n          } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n            parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue) e.getUnfinishedMessage();\n            throw e.unwrapIOException();\n          } finally {\n            if (parsedMessage != null) {\n              mergeFrom(parsedMessage);\n            }\n          }\n          return this;\n        }\n        private int valueCase_ = 0;\n        private java.lang.Object value_;\n        public ValueCase\n            getValueCase() {\n          return ValueCase.forNumber(\n              valueCase_);\n        }\n\n        public Builder clearValue() {\n          valueCase_ = 0;\n          value_ = null;\n          onChanged();\n          return this;\n        }\n\n        private int bitField0_;\n\n        private int type_ ;\n        /**\n         * <code>optional uint32 type = 1;</code>\n         */\n        public boolean hasType() {\n          return ((bitField0_ & 0x00000001) == 0x00000001);\n        }\n        /**\n         * <code>optional uint32 type = 1;</code>\n         */\n        public int getType() {\n          return type_;\n        }\n        /**\n         * <code>optional uint32 type = 1;</code>\n         */\n        public Builder setType(int value) {\n          bitField0_ |= 0x00000001;\n          type_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional uint32 type = 1;</code>\n         */\n        public Builder clearType() {\n          bitField0_ = (bitField0_ & ~0x00000001);\n          type_ = 0;\n          onChanged();\n          return 
this;\n        }\n\n        private boolean isNull_ ;\n        /**\n         * <code>optional bool is_null = 2;</code>\n         */\n        public boolean hasIsNull() {\n          return ((bitField0_ & 0x00000002) == 0x00000002);\n        }\n        /**\n         * <code>optional bool is_null = 2;</code>\n         */\n        public boolean getIsNull() {\n          return isNull_;\n        }\n        /**\n         * <code>optional bool is_null = 2;</code>\n         */\n        public Builder setIsNull(boolean value) {\n          bitField0_ |= 0x00000002;\n          isNull_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional bool is_null = 2;</code>\n         */\n        public Builder clearIsNull() {\n          bitField0_ = (bitField0_ & ~0x00000002);\n          isNull_ = false;\n          onChanged();\n          return this;\n        }\n\n        /**\n         * <code>optional uint32 int_value = 3;</code>\n         */\n        public boolean hasIntValue() {\n          return valueCase_ == 3;\n        }\n        /**\n         * <code>optional uint32 int_value = 3;</code>\n         */\n        public int getIntValue() {\n          if (valueCase_ == 3) {\n            return (java.lang.Integer) value_;\n          }\n          return 0;\n        }\n        /**\n         * <code>optional uint32 int_value = 3;</code>\n         */\n        public Builder setIntValue(int value) {\n          valueCase_ = 3;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional uint32 int_value = 3;</code>\n         */\n        public Builder clearIntValue() {\n          if (valueCase_ == 3) {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n\n        /**\n         * <code>optional uint64 long_value = 4;</code>\n         */\n        public boolean hasLongValue() {\n          return 
valueCase_ == 4;\n        }\n        /**\n         * <code>optional uint64 long_value = 4;</code>\n         */\n        public long getLongValue() {\n          if (valueCase_ == 4) {\n            return (java.lang.Long) value_;\n          }\n          return 0L;\n        }\n        /**\n         * <code>optional uint64 long_value = 4;</code>\n         */\n        public Builder setLongValue(long value) {\n          valueCase_ = 4;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional uint64 long_value = 4;</code>\n         */\n        public Builder clearLongValue() {\n          if (valueCase_ == 4) {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n\n        /**\n         * <code>optional float float_value = 5;</code>\n         */\n        public boolean hasFloatValue() {\n          return valueCase_ == 5;\n        }\n        /**\n         * <code>optional float float_value = 5;</code>\n         */\n        public float getFloatValue() {\n          if (valueCase_ == 5) {\n            return (java.lang.Float) value_;\n          }\n          return 0F;\n        }\n        /**\n         * <code>optional float float_value = 5;</code>\n         */\n        public Builder setFloatValue(float value) {\n          valueCase_ = 5;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional float float_value = 5;</code>\n         */\n        public Builder clearFloatValue() {\n          if (valueCase_ == 5) {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n\n        /**\n         * <code>optional double double_value = 6;</code>\n         */\n        public boolean hasDoubleValue() {\n          return valueCase_ == 6;\n        }\n        /**\n         * <code>optional double 
double_value = 6;</code>\n         */\n        public double getDoubleValue() {\n          if (valueCase_ == 6) {\n            return (java.lang.Double) value_;\n          }\n          return 0D;\n        }\n        /**\n         * <code>optional double double_value = 6;</code>\n         */\n        public Builder setDoubleValue(double value) {\n          valueCase_ = 6;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional double double_value = 6;</code>\n         */\n        public Builder clearDoubleValue() {\n          if (valueCase_ == 6) {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n\n        /**\n         * <code>optional bool boolean_value = 7;</code>\n         */\n        public boolean hasBooleanValue() {\n          return valueCase_ == 7;\n        }\n        /**\n         * <code>optional bool boolean_value = 7;</code>\n         */\n        public boolean getBooleanValue() {\n          if (valueCase_ == 7) {\n            return (java.lang.Boolean) value_;\n          }\n          return false;\n        }\n        /**\n         * <code>optional bool boolean_value = 7;</code>\n         */\n        public Builder setBooleanValue(boolean value) {\n          valueCase_ = 7;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional bool boolean_value = 7;</code>\n         */\n        public Builder clearBooleanValue() {\n          if (valueCase_ == 7) {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n\n        /**\n         * <code>optional string string_value = 8;</code>\n         */\n        public boolean hasStringValue() {\n          return valueCase_ == 8;\n        }\n        /**\n         * <code>optional string string_value = 8;</code>\n         
*/\n        public java.lang.String getStringValue() {\n          java.lang.Object ref = \"\";\n          if (valueCase_ == 8) {\n            ref = value_;\n          }\n          if (!(ref instanceof java.lang.String)) {\n            com.google.protobuf.ByteString bs =\n                (com.google.protobuf.ByteString) ref;\n            java.lang.String s = bs.toStringUtf8();\n            if (valueCase_ == 8) {\n              if (bs.isValidUtf8()) {\n                value_ = s;\n              }\n            }\n            return s;\n          } else {\n            return (java.lang.String) ref;\n          }\n        }\n        /**\n         * <code>optional string string_value = 8;</code>\n         */\n        public com.google.protobuf.ByteString\n            getStringValueBytes() {\n          java.lang.Object ref = \"\";\n          if (valueCase_ == 8) {\n            ref = value_;\n          }\n          if (ref instanceof String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n                    (java.lang.String) ref);\n            if (valueCase_ == 8) {\n              value_ = b;\n            }\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n        /**\n         * <code>optional string string_value = 8;</code>\n         */\n        public Builder setStringValue(\n            java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  valueCase_ = 8;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional string string_value = 8;</code>\n         */\n        public Builder clearStringValue() {\n          if (valueCase_ == 8) {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n        /**\n         * <code>optional string 
string_value = 8;</code>\n         */\n        public Builder setStringValueBytes(\n            com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  valueCase_ = 8;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder> propertysetValueBuilder_;\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n         */\n        public boolean hasPropertysetValue() {\n          return valueCase_ == 9;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet getPropertysetValue() {\n          if (propertysetValueBuilder_ == null) {\n            if (valueCase_ == 9) {\n              return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) value_;\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance();\n          } else {\n            if (valueCase_ == 9) {\n              return propertysetValueBuilder_.getMessage();\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance();\n          }\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n         */\n        public Builder setPropertysetValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet value) {\n          if (propertysetValueBuilder_ == null) {\n            if (value == null) {\n              throw new 
NullPointerException();\n            }\n            value_ = value;\n            onChanged();\n          } else {\n            propertysetValueBuilder_.setMessage(value);\n          }\n          valueCase_ = 9;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n         */\n        public Builder setPropertysetValue(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder builderForValue) {\n          if (propertysetValueBuilder_ == null) {\n            value_ = builderForValue.build();\n            onChanged();\n          } else {\n            propertysetValueBuilder_.setMessage(builderForValue.build());\n          }\n          valueCase_ = 9;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n         */\n        public Builder mergePropertysetValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet value) {\n          if (propertysetValueBuilder_ == null) {\n            if (valueCase_ == 9 &&\n                value_ != org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance()) {\n              value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.newBuilder((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) value_)\n                  .mergeFrom(value).buildPartial();\n            } else {\n              value_ = value;\n            }\n            onChanged();\n          } else {\n            if (valueCase_ == 9) {\n              propertysetValueBuilder_.mergeFrom(value);\n            }\n            propertysetValueBuilder_.setMessage(value);\n          }\n          valueCase_ = 9;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n         */\n        public Builder 
clearPropertysetValue() {\n          if (propertysetValueBuilder_ == null) {\n            if (valueCase_ == 9) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n          } else {\n            if (valueCase_ == 9) {\n              valueCase_ = 0;\n              value_ = null;\n            }\n            propertysetValueBuilder_.clear();\n          }\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder getPropertysetValueBuilder() {\n          return getPropertysetValueFieldBuilder().getBuilder();\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder getPropertysetValueOrBuilder() {\n          if ((valueCase_ == 9) && (propertysetValueBuilder_ != null)) {\n            return propertysetValueBuilder_.getMessageOrBuilder();\n          } else {\n            if (valueCase_ == 9) {\n              return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) value_;\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance();\n          }\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset_value = 9;</code>\n         */\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder> \n            getPropertysetValueFieldBuilder() {\n          if (propertysetValueBuilder_ == null) {\n            if (!(valueCase_ == 9)) 
{\n              value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance();\n            }\n            propertysetValueBuilder_ = new com.google.protobuf.SingleFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder>(\n                    (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) value_,\n                    getParentForChildren(),\n                    isClean());\n            value_ = null;\n          }\n          valueCase_ = 9;\n          onChanged();;\n          return propertysetValueBuilder_;\n        }\n\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetListOrBuilder> propertysetsValueBuilder_;\n        /**\n         * <pre>\n         * List of Property Values\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n         */\n        public boolean hasPropertysetsValue() {\n          return valueCase_ == 10;\n        }\n        /**\n         * <pre>\n         * List of Property Values\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList getPropertysetsValue() {\n          if (propertysetsValueBuilder_ == null) {\n            if (valueCase_ == 10) {\n              return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) value_;\n            }\n            return 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.getDefaultInstance();\n          } else {\n            if (valueCase_ == 10) {\n              return propertysetsValueBuilder_.getMessage();\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.getDefaultInstance();\n          }\n        }\n        /**\n         * <pre>\n         * List of Property Values\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n         */\n        public Builder setPropertysetsValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList value) {\n          if (propertysetsValueBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            value_ = value;\n            onChanged();\n          } else {\n            propertysetsValueBuilder_.setMessage(value);\n          }\n          valueCase_ = 10;\n          return this;\n        }\n        /**\n         * <pre>\n         * List of Property Values\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n         */\n        public Builder setPropertysetsValue(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.Builder builderForValue) {\n          if (propertysetsValueBuilder_ == null) {\n            value_ = builderForValue.build();\n            onChanged();\n          } else {\n            propertysetsValueBuilder_.setMessage(builderForValue.build());\n          }\n          valueCase_ = 10;\n          return this;\n        }\n        /**\n         * <pre>\n         * List of Property Values\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n         */\n        public Builder 
mergePropertysetsValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList value) {\n          if (propertysetsValueBuilder_ == null) {\n            if (valueCase_ == 10 &&\n                value_ != org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.getDefaultInstance()) {\n              value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.newBuilder((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) value_)\n                  .mergeFrom(value).buildPartial();\n            } else {\n              value_ = value;\n            }\n            onChanged();\n          } else {\n            if (valueCase_ == 10) {\n              propertysetsValueBuilder_.mergeFrom(value);\n            }\n            propertysetsValueBuilder_.setMessage(value);\n          }\n          valueCase_ = 10;\n          return this;\n        }\n        /**\n         * <pre>\n         * List of Property Values\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n         */\n        public Builder clearPropertysetsValue() {\n          if (propertysetsValueBuilder_ == null) {\n            if (valueCase_ == 10) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n          } else {\n            if (valueCase_ == 10) {\n              valueCase_ = 0;\n              value_ = null;\n            }\n            propertysetsValueBuilder_.clear();\n          }\n          return this;\n        }\n        /**\n         * <pre>\n         * List of Property Values\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.Builder getPropertysetsValueBuilder() {\n          return getPropertysetsValueFieldBuilder().getBuilder();\n  
      }\n        /**\n         * <pre>\n         * List of Property Values\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetListOrBuilder getPropertysetsValueOrBuilder() {\n          if ((valueCase_ == 10) && (propertysetsValueBuilder_ != null)) {\n            return propertysetsValueBuilder_.getMessageOrBuilder();\n          } else {\n            if (valueCase_ == 10) {\n              return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) value_;\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.getDefaultInstance();\n          }\n        }\n        /**\n         * <pre>\n         * List of Property Values\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySetList propertysets_value = 10;</code>\n         */\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetListOrBuilder> \n            getPropertysetsValueFieldBuilder() {\n          if (propertysetsValueBuilder_ == null) {\n            if (!(valueCase_ == 10)) {\n              value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.getDefaultInstance();\n            }\n            propertysetsValueBuilder_ = new com.google.protobuf.SingleFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetListOrBuilder>(\n                    (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) 
value_,\n                    getParentForChildren(),\n                    isClean());\n            value_ = null;\n          }\n          valueCase_ = 10;\n          onChanged();;\n          return propertysetsValueBuilder_;\n        }\n\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtensionOrBuilder> extensionValueBuilder_;\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n         */\n        public boolean hasExtensionValue() {\n          return valueCase_ == 11;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension getExtensionValue() {\n          if (extensionValueBuilder_ == null) {\n            if (valueCase_ == 11) {\n              return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) value_;\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.getDefaultInstance();\n          } else {\n            if (valueCase_ == 11) {\n              return extensionValueBuilder_.getMessage();\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.getDefaultInstance();\n          }\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n         */\n        public Builder 
setExtensionValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension value) {\n          if (extensionValueBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            value_ = value;\n            onChanged();\n          } else {\n            extensionValueBuilder_.setMessage(value);\n          }\n          valueCase_ = 11;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n         */\n        public Builder setExtensionValue(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.Builder builderForValue) {\n          if (extensionValueBuilder_ == null) {\n            value_ = builderForValue.build();\n            onChanged();\n          } else {\n            extensionValueBuilder_.setMessage(builderForValue.build());\n          }\n          valueCase_ = 11;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n         */\n        public Builder mergeExtensionValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension value) {\n          if (extensionValueBuilder_ == null) {\n            if (valueCase_ == 11 &&\n                value_ != org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.getDefaultInstance()) {\n              value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.newBuilder((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) value_)\n                  .mergeFrom(value).buildPartial();\n            } else {\n              value_ = value;\n            }\n            onChanged();\n          } else {\n            
if (valueCase_ == 11) {\n              extensionValueBuilder_.mergeFrom(value);\n            }\n            extensionValueBuilder_.setMessage(value);\n          }\n          valueCase_ = 11;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n         */\n        public Builder clearExtensionValue() {\n          if (extensionValueBuilder_ == null) {\n            if (valueCase_ == 11) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n          } else {\n            if (valueCase_ == 11) {\n              valueCase_ = 0;\n              value_ = null;\n            }\n            extensionValueBuilder_.clear();\n          }\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.Builder getExtensionValueBuilder() {\n          return getExtensionValueFieldBuilder().getBuilder();\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtensionOrBuilder getExtensionValueOrBuilder() {\n          if ((valueCase_ == 11) && (extensionValueBuilder_ != null)) {\n            return extensionValueBuilder_.getMessageOrBuilder();\n          } else {\n            if (valueCase_ == 11) {\n              return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) value_;\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.getDefaultInstance();\n          }\n        }\n      
  /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension extension_value = 11;</code>\n         */\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtensionOrBuilder> \n            getExtensionValueFieldBuilder() {\n          if (extensionValueBuilder_ == null) {\n            if (!(valueCase_ == 11)) {\n              value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.getDefaultInstance();\n            }\n            extensionValueBuilder_ = new com.google.protobuf.SingleFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtensionOrBuilder>(\n                    (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PropertyValueExtension) value_,\n                    getParentForChildren(),\n                    isClean());\n            value_ = null;\n          }\n          valueCase_ = 11;\n          onChanged();\n          return extensionValueBuilder_;\n        }\n        public final Builder setUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.setUnknownFields(unknownFields);\n        }\n\n        public final Builder mergeUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.mergeUnknownFields(unknownFields);\n        }\n\n\n        // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.PropertyValue)\n  
    }\n\n      // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.PropertyValue)\n      private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue DEFAULT_INSTANCE;\n      static {\n        DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue();\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue getDefaultInstance() {\n        return DEFAULT_INSTANCE;\n      }\n\n      @java.lang.Deprecated public static final com.google.protobuf.Parser<PropertyValue>\n          PARSER = new com.google.protobuf.AbstractParser<PropertyValue>() {\n        public PropertyValue parsePartialFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n            return new PropertyValue(input, extensionRegistry);\n        }\n      };\n\n      public static com.google.protobuf.Parser<PropertyValue> parser() {\n        return PARSER;\n      }\n\n      @java.lang.Override\n      public com.google.protobuf.Parser<PropertyValue> getParserForType() {\n        return PARSER;\n      }\n\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue getDefaultInstanceForType() {\n        return DEFAULT_INSTANCE;\n      }\n\n    }\n\n    public interface PropertySetOrBuilder extends\n        // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.PropertySet)\n        com.google.protobuf.GeneratedMessageV3.\n            ExtendableMessageOrBuilder<PropertySet> {\n\n      /**\n       * <pre>\n       * Names of the properties\n       * </pre>\n       *\n       * <code>repeated string keys = 1;</code>\n       */\n      java.util.List<java.lang.String>\n          getKeysList();\n      /**\n       * <pre>\n       * Names of the properties\n       * </pre>\n       *\n       * <code>repeated 
string keys = 1;</code>\n       */\n      int getKeysCount();\n      /**\n       * <pre>\n       * Names of the properties\n       * </pre>\n       *\n       * <code>repeated string keys = 1;</code>\n       */\n      java.lang.String getKeys(int index);\n      /**\n       * <pre>\n       * Names of the properties\n       * </pre>\n       *\n       * <code>repeated string keys = 1;</code>\n       */\n      com.google.protobuf.ByteString\n          getKeysBytes(int index);\n\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n       */\n      java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue> \n          getValuesList();\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue getValues(int index);\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n       */\n      int getValuesCount();\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n       */\n      java.util.List<? 
extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValueOrBuilder> \n          getValuesOrBuilderList();\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValueOrBuilder getValuesOrBuilder(\n          int index);\n    }\n    /**\n     * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.PropertySet}\n     */\n    public  static final class PropertySet extends\n        com.google.protobuf.GeneratedMessageV3.ExtendableMessage<\n          PropertySet> implements\n        // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.PropertySet)\n        PropertySetOrBuilder {\n      // Use PropertySet.newBuilder() to construct.\n      private PropertySet(com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, ?> builder) {\n        super(builder);\n      }\n      private PropertySet() {\n        keys_ = com.google.protobuf.LazyStringArrayList.EMPTY;\n        values_ = java.util.Collections.emptyList();\n      }\n\n      @java.lang.Override\n      public final com.google.protobuf.UnknownFieldSet\n      getUnknownFields() {\n        return this.unknownFields;\n      }\n      private PropertySet(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        this();\n        int mutable_bitField0_ = 0;\n        com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n            com.google.protobuf.UnknownFieldSet.newBuilder();\n        try {\n          boolean done = false;\n          while (!done) {\n            int tag = input.readTag();\n            switch (tag) {\n              case 0:\n                done = true;\n                break;\n              default: {\n                if 
(!parseUnknownField(input, unknownFields,\n                                       extensionRegistry, tag)) {\n                  done = true;\n                }\n                break;\n              }\n              case 10: {\n                com.google.protobuf.ByteString bs = input.readBytes();\n                if (!((mutable_bitField0_ & 0x00000001) == 0x00000001)) {\n                  keys_ = new com.google.protobuf.LazyStringArrayList();\n                  mutable_bitField0_ |= 0x00000001;\n                }\n                keys_.add(bs);\n                break;\n              }\n              case 18: {\n                if (!((mutable_bitField0_ & 0x00000002) == 0x00000002)) {\n                  values_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue>();\n                  mutable_bitField0_ |= 0x00000002;\n                }\n                values_.add(\n                    input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.PARSER, extensionRegistry));\n                break;\n              }\n            }\n          }\n        } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n          throw e.setUnfinishedMessage(this);\n        } catch (java.io.IOException e) {\n          throw new com.google.protobuf.InvalidProtocolBufferException(\n              e).setUnfinishedMessage(this);\n        } finally {\n          if (((mutable_bitField0_ & 0x00000001) == 0x00000001)) {\n            keys_ = keys_.getUnmodifiableView();\n          }\n          if (((mutable_bitField0_ & 0x00000002) == 0x00000002)) {\n            values_ = java.util.Collections.unmodifiableList(values_);\n          }\n          this.unknownFields = unknownFields.build();\n          makeExtensionsImmutable();\n        }\n      }\n      public static final com.google.protobuf.Descriptors.Descriptor\n          getDescriptor() {\n        return 
org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertySet_descriptor;\n      }\n\n      protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n          internalGetFieldAccessorTable() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertySet_fieldAccessorTable\n            .ensureFieldAccessorsInitialized(\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder.class);\n      }\n\n      public static final int KEYS_FIELD_NUMBER = 1;\n      private com.google.protobuf.LazyStringList keys_;\n      /**\n       * <pre>\n       * Names of the properties\n       * </pre>\n       *\n       * <code>repeated string keys = 1;</code>\n       */\n      public com.google.protobuf.ProtocolStringList\n          getKeysList() {\n        return keys_;\n      }\n      /**\n       * <pre>\n       * Names of the properties\n       * </pre>\n       *\n       * <code>repeated string keys = 1;</code>\n       */\n      public int getKeysCount() {\n        return keys_.size();\n      }\n      /**\n       * <pre>\n       * Names of the properties\n       * </pre>\n       *\n       * <code>repeated string keys = 1;</code>\n       */\n      public java.lang.String getKeys(int index) {\n        return keys_.get(index);\n      }\n      /**\n       * <pre>\n       * Names of the properties\n       * </pre>\n       *\n       * <code>repeated string keys = 1;</code>\n       */\n      public com.google.protobuf.ByteString\n          getKeysBytes(int index) {\n        return keys_.getByteString(index);\n      }\n\n      public static final int VALUES_FIELD_NUMBER = 2;\n      private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue> values_;\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n      
 */\n      public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue> getValuesList() {\n        return values_;\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n       */\n      public java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValueOrBuilder> \n          getValuesOrBuilderList() {\n        return values_;\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n       */\n      public int getValuesCount() {\n        return values_.size();\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue getValues(int index) {\n        return values_.get(index);\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValueOrBuilder getValuesOrBuilder(\n          int index) {\n        return values_.get(index);\n      }\n\n      private byte memoizedIsInitialized = -1;\n      public final boolean isInitialized() {\n        byte isInitialized = memoizedIsInitialized;\n        if (isInitialized == 1) return true;\n        if (isInitialized == 0) return false;\n\n        for (int i = 0; i < getValuesCount(); i++) {\n          if (!getValues(i).isInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        if (!extensionsAreInitialized()) {\n          memoizedIsInitialized = 0;\n          return false;\n        }\n        memoizedIsInitialized = 1;\n        return true;\n      }\n\n      public void writeTo(com.google.protobuf.CodedOutputStream output)\n                          throws java.io.IOException {\n        com.google.protobuf.GeneratedMessageV3\n  
        .ExtendableMessage<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet>.ExtensionWriter\n            extensionWriter = newExtensionWriter();\n        for (int i = 0; i < keys_.size(); i++) {\n          com.google.protobuf.GeneratedMessageV3.writeString(output, 1, keys_.getRaw(i));\n        }\n        for (int i = 0; i < values_.size(); i++) {\n          output.writeMessage(2, values_.get(i));\n        }\n        extensionWriter.writeUntil(536870912, output);\n        unknownFields.writeTo(output);\n      }\n\n      public int getSerializedSize() {\n        int size = memoizedSize;\n        if (size != -1) return size;\n\n        size = 0;\n        {\n          int dataSize = 0;\n          for (int i = 0; i < keys_.size(); i++) {\n            dataSize += computeStringSizeNoTag(keys_.getRaw(i));\n          }\n          size += dataSize;\n          size += 1 * getKeysList().size();\n        }\n        for (int i = 0; i < values_.size(); i++) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(2, values_.get(i));\n        }\n        size += extensionsSerializedSize();\n        size += unknownFields.getSerializedSize();\n        memoizedSize = size;\n        return size;\n      }\n\n      private static final long serialVersionUID = 0L;\n      @java.lang.Override\n      public boolean equals(final java.lang.Object obj) {\n        if (obj == this) {\n         return true;\n        }\n        if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet)) {\n          return super.equals(obj);\n        }\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) obj;\n\n        boolean result = true;\n        result = result && getKeysList()\n            .equals(other.getKeysList());\n        result = result && getValuesList()\n            .equals(other.getValuesList());\n        result = result && 
unknownFields.equals(other.unknownFields);\n        result = result &&\n            getExtensionFields().equals(other.getExtensionFields());\n        return result;\n      }\n\n      @java.lang.Override\n      public int hashCode() {\n        if (memoizedHashCode != 0) {\n          return memoizedHashCode;\n        }\n        int hash = 41;\n        hash = (19 * hash) + getDescriptorForType().hashCode();\n        if (getKeysCount() > 0) {\n          hash = (37 * hash) + KEYS_FIELD_NUMBER;\n          hash = (53 * hash) + getKeysList().hashCode();\n        }\n        if (getValuesCount() > 0) {\n          hash = (37 * hash) + VALUES_FIELD_NUMBER;\n          hash = (53 * hash) + getValuesList().hashCode();\n        }\n        hash = hashFields(hash, getExtensionFields());\n        hash = (29 * hash) + unknownFields.hashCode();\n        memoizedHashCode = hash;\n        return hash;\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet parseFrom(\n          com.google.protobuf.ByteString data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet parseFrom(\n          com.google.protobuf.ByteString data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet parseFrom(byte[] data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet parseFrom(\n          byte[] data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws 
com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet parseFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet parseFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet parseDelimitedFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet parseDelimitedFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet parseFrom(\n          com.google.protobuf.CodedInputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet parseFrom(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          
throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n\n      public Builder newBuilderForType() { return newBuilder(); }\n      public static Builder newBuilder() {\n        return DEFAULT_INSTANCE.toBuilder();\n      }\n      public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet prototype) {\n        return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n      }\n      public Builder toBuilder() {\n        return this == DEFAULT_INSTANCE\n            ? new Builder() : new Builder().mergeFrom(this);\n      }\n\n      @java.lang.Override\n      protected Builder newBuilderForType(\n          com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n        Builder builder = new Builder(parent);\n        return builder;\n      }\n      /**\n       * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.PropertySet}\n       */\n      public static final class Builder extends\n          com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, Builder> implements\n          // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.PropertySet)\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder {\n        public static final com.google.protobuf.Descriptors.Descriptor\n            getDescriptor() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertySet_descriptor;\n        }\n\n        protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n            internalGetFieldAccessorTable() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertySet_fieldAccessorTable\n              .ensureFieldAccessorsInitialized(\n                  
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder.class);\n        }\n\n        // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.newBuilder()\n        private Builder() {\n          maybeForceBuilderInitialization();\n        }\n\n        private Builder(\n            com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n          super(parent);\n          maybeForceBuilderInitialization();\n        }\n        private void maybeForceBuilderInitialization() {\n          if (com.google.protobuf.GeneratedMessageV3\n                  .alwaysUseFieldBuilders) {\n            getValuesFieldBuilder();\n          }\n        }\n        public Builder clear() {\n          super.clear();\n          keys_ = com.google.protobuf.LazyStringArrayList.EMPTY;\n          bitField0_ = (bitField0_ & ~0x00000001);\n          if (valuesBuilder_ == null) {\n            values_ = java.util.Collections.emptyList();\n            bitField0_ = (bitField0_ & ~0x00000002);\n          } else {\n            valuesBuilder_.clear();\n          }\n          return this;\n        }\n\n        public com.google.protobuf.Descriptors.Descriptor\n            getDescriptorForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertySet_descriptor;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet getDefaultInstanceForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance();\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet build() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet result = buildPartial();\n          if (!result.isInitialized()) {\n            throw newUninitializedMessageException(result);\n          }\n          return result;\n        
}\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet buildPartial() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet(this);\n          int from_bitField0_ = bitField0_;\n          if (((bitField0_ & 0x00000001) == 0x00000001)) {\n            keys_ = keys_.getUnmodifiableView();\n            bitField0_ = (bitField0_ & ~0x00000001);\n          }\n          result.keys_ = keys_;\n          if (valuesBuilder_ == null) {\n            if (((bitField0_ & 0x00000002) == 0x00000002)) {\n              values_ = java.util.Collections.unmodifiableList(values_);\n              bitField0_ = (bitField0_ & ~0x00000002);\n            }\n            result.values_ = values_;\n          } else {\n            result.values_ = valuesBuilder_.build();\n          }\n          onBuilt();\n          return result;\n        }\n\n        public Builder clone() {\n          return (Builder) super.clone();\n        }\n        public Builder setField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.setField(field, value);\n        }\n        public Builder clearField(\n            com.google.protobuf.Descriptors.FieldDescriptor field) {\n          return (Builder) super.clearField(field);\n        }\n        public Builder clearOneof(\n            com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n          return (Builder) super.clearOneof(oneof);\n        }\n        public Builder setRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            int index, Object value) {\n          return (Builder) super.setRepeatedField(field, index, value);\n        }\n        public Builder addRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) 
super.addRepeatedField(field, value);\n        }\n        public <Type> Builder setExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, Type> extension,\n            Type value) {\n          return (Builder) super.setExtension(extension, value);\n        }\n        public <Type> Builder setExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, java.util.List<Type>> extension,\n            int index, Type value) {\n          return (Builder) super.setExtension(extension, index, value);\n        }\n        public <Type> Builder addExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, java.util.List<Type>> extension,\n            Type value) {\n          return (Builder) super.addExtension(extension, value);\n        }\n        public <Type> Builder clearExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, ?> extension) {\n          return (Builder) super.clearExtension(extension);\n        }\n        public Builder mergeFrom(com.google.protobuf.Message other) {\n          if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) {\n            return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet)other);\n          } else {\n            super.mergeFrom(other);\n            return this;\n          }\n        }\n\n        public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet other) {\n          if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance()) return this;\n          if (!other.keys_.isEmpty()) {\n            if 
(keys_.isEmpty()) {\n              keys_ = other.keys_;\n              bitField0_ = (bitField0_ & ~0x00000001);\n            } else {\n              ensureKeysIsMutable();\n              keys_.addAll(other.keys_);\n            }\n            onChanged();\n          }\n          if (valuesBuilder_ == null) {\n            if (!other.values_.isEmpty()) {\n              if (values_.isEmpty()) {\n                values_ = other.values_;\n                bitField0_ = (bitField0_ & ~0x00000002);\n              } else {\n                ensureValuesIsMutable();\n                values_.addAll(other.values_);\n              }\n              onChanged();\n            }\n          } else {\n            if (!other.values_.isEmpty()) {\n              if (valuesBuilder_.isEmpty()) {\n                valuesBuilder_.dispose();\n                valuesBuilder_ = null;\n                values_ = other.values_;\n                bitField0_ = (bitField0_ & ~0x00000002);\n                valuesBuilder_ = \n                  com.google.protobuf.GeneratedMessageV3.alwaysUseFieldBuilders ?\n                     getValuesFieldBuilder() : null;\n              } else {\n                valuesBuilder_.addAllMessages(other.values_);\n              }\n            }\n          }\n          this.mergeExtensionFields(other);\n          this.mergeUnknownFields(other.unknownFields);\n          onChanged();\n          return this;\n        }\n\n        public final boolean isInitialized() {\n          for (int i = 0; i < getValuesCount(); i++) {\n            if (!getValues(i).isInitialized()) {\n              return false;\n            }\n          }\n          if (!extensionsAreInitialized()) {\n            return false;\n          }\n          return true;\n        }\n\n        public Builder mergeFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet parsedMessage = null;\n          try {\n            parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n          } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n            parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet) e.getUnfinishedMessage();\n            throw e.unwrapIOException();\n          } finally {\n            if (parsedMessage != null) {\n              mergeFrom(parsedMessage);\n            }\n          }\n          return this;\n        }\n        private int bitField0_;\n\n        private com.google.protobuf.LazyStringList keys_ = com.google.protobuf.LazyStringArrayList.EMPTY;\n        private void ensureKeysIsMutable() {\n          if (!((bitField0_ & 0x00000001) == 0x00000001)) {\n            keys_ = new com.google.protobuf.LazyStringArrayList(keys_);\n            bitField0_ |= 0x00000001;\n           }\n        }\n        /**\n         * <pre>\n         * Names of the properties\n         * </pre>\n         *\n         * <code>repeated string keys = 1;</code>\n         */\n        public com.google.protobuf.ProtocolStringList\n            getKeysList() {\n          return keys_.getUnmodifiableView();\n        }\n        /**\n         * <pre>\n         * Names of the properties\n         * </pre>\n         *\n         * <code>repeated string keys = 1;</code>\n         */\n        public int getKeysCount() {\n          return keys_.size();\n        }\n        /**\n         * <pre>\n         * Names of the properties\n         * </pre>\n         *\n         * <code>repeated string keys = 1;</code>\n         */\n        public java.lang.String getKeys(int index) {\n          return keys_.get(index);\n        }\n        /**\n         * <pre>\n         * Names of the properties\n         * </pre>\n         *\n         * <code>repeated string keys = 1;</code>\n         */\n        public com.google.protobuf.ByteString\n        
    getKeysBytes(int index) {\n          return keys_.getByteString(index);\n        }\n        /**\n         * <pre>\n         * Names of the properties\n         * </pre>\n         *\n         * <code>repeated string keys = 1;</code>\n         */\n        public Builder setKeys(\n            int index, java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  ensureKeysIsMutable();\n          keys_.set(index, value);\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Names of the properties\n         * </pre>\n         *\n         * <code>repeated string keys = 1;</code>\n         */\n        public Builder addKeys(\n            java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  ensureKeysIsMutable();\n          keys_.add(value);\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Names of the properties\n         * </pre>\n         *\n         * <code>repeated string keys = 1;</code>\n         */\n        public Builder addAllKeys(\n            java.lang.Iterable<java.lang.String> values) {\n          ensureKeysIsMutable();\n          com.google.protobuf.AbstractMessageLite.Builder.addAll(\n              values, keys_);\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Names of the properties\n         * </pre>\n         *\n         * <code>repeated string keys = 1;</code>\n         */\n        public Builder clearKeys() {\n          keys_ = com.google.protobuf.LazyStringArrayList.EMPTY;\n          bitField0_ = (bitField0_ & ~0x00000001);\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Names of the properties\n         * </pre>\n         *\n         * <code>repeated string keys = 1;</code>\n         */\n        public Builder addKeysBytes(\n            
com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  ensureKeysIsMutable();\n          keys_.add(value);\n          onChanged();\n          return this;\n        }\n\n        private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue> values_ =\n          java.util.Collections.emptyList();\n        private void ensureValuesIsMutable() {\n          if (!((bitField0_ & 0x00000002) == 0x00000002)) {\n            values_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue>(values_);\n            bitField0_ |= 0x00000002;\n           }\n        }\n\n        private com.google.protobuf.RepeatedFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValueOrBuilder> valuesBuilder_;\n\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue> getValuesList() {\n          if (valuesBuilder_ == null) {\n            return java.util.Collections.unmodifiableList(values_);\n          } else {\n            return valuesBuilder_.getMessageList();\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public int getValuesCount() {\n          if (valuesBuilder_ == null) {\n            return values_.size();\n          } else {\n            return valuesBuilder_.getCount();\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue getValues(int index) {\n          if 
(valuesBuilder_ == null) {\n            return values_.get(index);\n          } else {\n            return valuesBuilder_.getMessage(index);\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public Builder setValues(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue value) {\n          if (valuesBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensureValuesIsMutable();\n            values_.set(index, value);\n            onChanged();\n          } else {\n            valuesBuilder_.setMessage(index, value);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public Builder setValues(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.Builder builderForValue) {\n          if (valuesBuilder_ == null) {\n            ensureValuesIsMutable();\n            values_.set(index, builderForValue.build());\n            onChanged();\n          } else {\n            valuesBuilder_.setMessage(index, builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public Builder addValues(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue value) {\n          if (valuesBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensureValuesIsMutable();\n            values_.add(value);\n            onChanged();\n          } else {\n            valuesBuilder_.addMessage(value);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated 
.org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public Builder addValues(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue value) {\n          if (valuesBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensureValuesIsMutable();\n            values_.add(index, value);\n            onChanged();\n          } else {\n            valuesBuilder_.addMessage(index, value);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public Builder addValues(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.Builder builderForValue) {\n          if (valuesBuilder_ == null) {\n            ensureValuesIsMutable();\n            values_.add(builderForValue.build());\n            onChanged();\n          } else {\n            valuesBuilder_.addMessage(builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public Builder addValues(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.Builder builderForValue) {\n          if (valuesBuilder_ == null) {\n            ensureValuesIsMutable();\n            values_.add(index, builderForValue.build());\n            onChanged();\n          } else {\n            valuesBuilder_.addMessage(index, builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public Builder addAllValues(\n            java.lang.Iterable<? 
extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue> values) {\n          if (valuesBuilder_ == null) {\n            ensureValuesIsMutable();\n            com.google.protobuf.AbstractMessageLite.Builder.addAll(\n                values, values_);\n            onChanged();\n          } else {\n            valuesBuilder_.addAllMessages(values);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public Builder clearValues() {\n          if (valuesBuilder_ == null) {\n            values_ = java.util.Collections.emptyList();\n            bitField0_ = (bitField0_ & ~0x00000002);\n            onChanged();\n          } else {\n            valuesBuilder_.clear();\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public Builder removeValues(int index) {\n          if (valuesBuilder_ == null) {\n            ensureValuesIsMutable();\n            values_.remove(index);\n            onChanged();\n          } else {\n            valuesBuilder_.remove(index);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.Builder getValuesBuilder(\n            int index) {\n          return getValuesFieldBuilder().getBuilder(index);\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValueOrBuilder getValuesOrBuilder(\n            int index) {\n          if (valuesBuilder_ == null) {\n            return values_.get(index);  } else {\n            return 
valuesBuilder_.getMessageOrBuilder(index);\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValueOrBuilder> \n             getValuesOrBuilderList() {\n          if (valuesBuilder_ != null) {\n            return valuesBuilder_.getMessageOrBuilderList();\n          } else {\n            return java.util.Collections.unmodifiableList(values_);\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.Builder addValuesBuilder() {\n          return getValuesFieldBuilder().addBuilder(\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.getDefaultInstance());\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.Builder addValuesBuilder(\n            int index) {\n          return getValuesFieldBuilder().addBuilder(\n              index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.getDefaultInstance());\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertyValue values = 2;</code>\n         */\n        public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.Builder> \n             getValuesBuilderList() {\n          return getValuesFieldBuilder().getBuilderList();\n        }\n        private com.google.protobuf.RepeatedFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.Builder, 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValueOrBuilder> \n            getValuesFieldBuilder() {\n          if (valuesBuilder_ == null) {\n            valuesBuilder_ = new com.google.protobuf.RepeatedFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValue.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertyValueOrBuilder>(\n                    values_,\n                    ((bitField0_ & 0x00000002) == 0x00000002),\n                    getParentForChildren(),\n                    isClean());\n            values_ = null;\n          }\n          return valuesBuilder_;\n        }\n        public final Builder setUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.setUnknownFields(unknownFields);\n        }\n\n        public final Builder mergeUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.mergeUnknownFields(unknownFields);\n        }\n\n\n        // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.PropertySet)\n      }\n\n      // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.PropertySet)\n      private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet DEFAULT_INSTANCE;\n      static {\n        DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet();\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet getDefaultInstance() {\n        return DEFAULT_INSTANCE;\n      }\n\n      @java.lang.Deprecated public static final com.google.protobuf.Parser<PropertySet>\n          PARSER = new com.google.protobuf.AbstractParser<PropertySet>() {\n        public PropertySet parsePartialFrom(\n            com.google.protobuf.CodedInputStream input,\n            
com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n            return new PropertySet(input, extensionRegistry);\n        }\n      };\n\n      public static com.google.protobuf.Parser<PropertySet> parser() {\n        return PARSER;\n      }\n\n      @java.lang.Override\n      public com.google.protobuf.Parser<PropertySet> getParserForType() {\n        return PARSER;\n      }\n\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet getDefaultInstanceForType() {\n        return DEFAULT_INSTANCE;\n      }\n\n    }\n\n    public interface PropertySetListOrBuilder extends\n        // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.PropertySetList)\n        com.google.protobuf.GeneratedMessageV3.\n            ExtendableMessageOrBuilder<PropertySetList> {\n\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n       */\n      java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet> \n          getPropertysetList();\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet getPropertyset(int index);\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n       */\n      int getPropertysetCount();\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n       */\n      java.util.List<? 
extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder> \n          getPropertysetOrBuilderList();\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder getPropertysetOrBuilder(\n          int index);\n    }\n    /**\n     * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.PropertySetList}\n     */\n    public  static final class PropertySetList extends\n        com.google.protobuf.GeneratedMessageV3.ExtendableMessage<\n          PropertySetList> implements\n        // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.PropertySetList)\n        PropertySetListOrBuilder {\n      // Use PropertySetList.newBuilder() to construct.\n      private PropertySetList(com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList, ?> builder) {\n        super(builder);\n      }\n      private PropertySetList() {\n        propertyset_ = java.util.Collections.emptyList();\n      }\n\n      @java.lang.Override\n      public final com.google.protobuf.UnknownFieldSet\n      getUnknownFields() {\n        return this.unknownFields;\n      }\n      private PropertySetList(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        this();\n        int mutable_bitField0_ = 0;\n        com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n            com.google.protobuf.UnknownFieldSet.newBuilder();\n        try {\n          boolean done = false;\n          while (!done) {\n            int tag = input.readTag();\n            switch (tag) {\n              case 0:\n                done = true;\n                break;\n              default: {\n                if 
(!parseUnknownField(input, unknownFields,\n                                       extensionRegistry, tag)) {\n                  done = true;\n                }\n                break;\n              }\n              case 10: {\n                if (!((mutable_bitField0_ & 0x00000001) == 0x00000001)) {\n                  propertyset_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet>();\n                  mutable_bitField0_ |= 0x00000001;\n                }\n                propertyset_.add(\n                    input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.PARSER, extensionRegistry));\n                break;\n              }\n            }\n          }\n        } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n          throw e.setUnfinishedMessage(this);\n        } catch (java.io.IOException e) {\n          throw new com.google.protobuf.InvalidProtocolBufferException(\n              e).setUnfinishedMessage(this);\n        } finally {\n          if (((mutable_bitField0_ & 0x00000001) == 0x00000001)) {\n            propertyset_ = java.util.Collections.unmodifiableList(propertyset_);\n          }\n          this.unknownFields = unknownFields.build();\n          makeExtensionsImmutable();\n        }\n      }\n      public static final com.google.protobuf.Descriptors.Descriptor\n          getDescriptor() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertySetList_descriptor;\n      }\n\n      protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n          internalGetFieldAccessorTable() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertySetList_fieldAccessorTable\n            .ensureFieldAccessorsInitialized(\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.class, 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.Builder.class);\n      }\n\n      public static final int PROPERTYSET_FIELD_NUMBER = 1;\n      private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet> propertyset_;\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n       */\n      public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet> getPropertysetList() {\n        return propertyset_;\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n       */\n      public java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder> \n          getPropertysetOrBuilderList() {\n        return propertyset_;\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n       */\n      public int getPropertysetCount() {\n        return propertyset_.size();\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet getPropertyset(int index) {\n        return propertyset_.get(index);\n      }\n      /**\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder getPropertysetOrBuilder(\n          int index) {\n        return propertyset_.get(index);\n      }\n\n      private byte memoizedIsInitialized = -1;\n      public final boolean isInitialized() {\n        byte isInitialized = memoizedIsInitialized;\n        if (isInitialized == 1) return true;\n        if (isInitialized == 0) return false;\n\n        for (int i = 0; i < getPropertysetCount(); i++) {\n          if (!getPropertyset(i).isInitialized()) {\n            
memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        if (!extensionsAreInitialized()) {\n          memoizedIsInitialized = 0;\n          return false;\n        }\n        memoizedIsInitialized = 1;\n        return true;\n      }\n\n      public void writeTo(com.google.protobuf.CodedOutputStream output)\n                          throws java.io.IOException {\n        com.google.protobuf.GeneratedMessageV3\n          .ExtendableMessage<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList>.ExtensionWriter\n            extensionWriter = newExtensionWriter();\n        for (int i = 0; i < propertyset_.size(); i++) {\n          output.writeMessage(1, propertyset_.get(i));\n        }\n        extensionWriter.writeUntil(536870912, output);\n        unknownFields.writeTo(output);\n      }\n\n      public int getSerializedSize() {\n        int size = memoizedSize;\n        if (size != -1) return size;\n\n        size = 0;\n        for (int i = 0; i < propertyset_.size(); i++) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(1, propertyset_.get(i));\n        }\n        size += extensionsSerializedSize();\n        size += unknownFields.getSerializedSize();\n        memoizedSize = size;\n        return size;\n      }\n\n      private static final long serialVersionUID = 0L;\n      @java.lang.Override\n      public boolean equals(final java.lang.Object obj) {\n        if (obj == this) {\n         return true;\n        }\n        if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList)) {\n          return super.equals(obj);\n        }\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) obj;\n\n        boolean result = true;\n        result = result && getPropertysetList()\n            .equals(other.getPropertysetList());\n        result = result && 
unknownFields.equals(other.unknownFields);\n        result = result &&\n            getExtensionFields().equals(other.getExtensionFields());\n        return result;\n      }\n\n      @java.lang.Override\n      public int hashCode() {\n        if (memoizedHashCode != 0) {\n          return memoizedHashCode;\n        }\n        int hash = 41;\n        hash = (19 * hash) + getDescriptorForType().hashCode();\n        if (getPropertysetCount() > 0) {\n          hash = (37 * hash) + PROPERTYSET_FIELD_NUMBER;\n          hash = (53 * hash) + getPropertysetList().hashCode();\n        }\n        hash = hashFields(hash, getExtensionFields());\n        hash = (29 * hash) + unknownFields.hashCode();\n        memoizedHashCode = hash;\n        return hash;\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList parseFrom(\n          com.google.protobuf.ByteString data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList parseFrom(\n          com.google.protobuf.ByteString data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList parseFrom(byte[] data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList parseFrom(\n          byte[] data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList parseFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList parseFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList parseDelimitedFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList parseDelimitedFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList parseFrom(\n          com.google.protobuf.CodedInputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList parseFrom(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            
.parseWithIOException(PARSER, input, extensionRegistry);\n      }\n\n      public Builder newBuilderForType() { return newBuilder(); }\n      public static Builder newBuilder() {\n        return DEFAULT_INSTANCE.toBuilder();\n      }\n      public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList prototype) {\n        return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n      }\n      public Builder toBuilder() {\n        return this == DEFAULT_INSTANCE\n            ? new Builder() : new Builder().mergeFrom(this);\n      }\n\n      @java.lang.Override\n      protected Builder newBuilderForType(\n          com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n        Builder builder = new Builder(parent);\n        return builder;\n      }\n      /**\n       * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.PropertySetList}\n       */\n      public static final class Builder extends\n          com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList, Builder> implements\n          // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.PropertySetList)\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetListOrBuilder {\n        public static final com.google.protobuf.Descriptors.Descriptor\n            getDescriptor() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertySetList_descriptor;\n        }\n\n        protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n            internalGetFieldAccessorTable() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertySetList_fieldAccessorTable\n              .ensureFieldAccessorsInitialized(\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.class, 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.Builder.class);\n        }\n\n        // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.newBuilder()\n        private Builder() {\n          maybeForceBuilderInitialization();\n        }\n\n        private Builder(\n            com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n          super(parent);\n          maybeForceBuilderInitialization();\n        }\n        private void maybeForceBuilderInitialization() {\n          if (com.google.protobuf.GeneratedMessageV3\n                  .alwaysUseFieldBuilders) {\n            getPropertysetFieldBuilder();\n          }\n        }\n        public Builder clear() {\n          super.clear();\n          if (propertysetBuilder_ == null) {\n            propertyset_ = java.util.Collections.emptyList();\n            bitField0_ = (bitField0_ & ~0x00000001);\n          } else {\n            propertysetBuilder_.clear();\n          }\n          return this;\n        }\n\n        public com.google.protobuf.Descriptors.Descriptor\n            getDescriptorForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_PropertySetList_descriptor;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList getDefaultInstanceForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.getDefaultInstance();\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList build() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList result = buildPartial();\n          if (!result.isInitialized()) {\n            throw newUninitializedMessageException(result);\n          }\n          return result;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList buildPartial() {\n          
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList(this);\n          int from_bitField0_ = bitField0_;\n          if (propertysetBuilder_ == null) {\n            if (((bitField0_ & 0x00000001) == 0x00000001)) {\n              propertyset_ = java.util.Collections.unmodifiableList(propertyset_);\n              bitField0_ = (bitField0_ & ~0x00000001);\n            }\n            result.propertyset_ = propertyset_;\n          } else {\n            result.propertyset_ = propertysetBuilder_.build();\n          }\n          onBuilt();\n          return result;\n        }\n\n        public Builder clone() {\n          return (Builder) super.clone();\n        }\n        public Builder setField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.setField(field, value);\n        }\n        public Builder clearField(\n            com.google.protobuf.Descriptors.FieldDescriptor field) {\n          return (Builder) super.clearField(field);\n        }\n        public Builder clearOneof(\n            com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n          return (Builder) super.clearOneof(oneof);\n        }\n        public Builder setRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            int index, Object value) {\n          return (Builder) super.setRepeatedField(field, index, value);\n        }\n        public Builder addRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.addRepeatedField(field, value);\n        }\n        public <Type> Builder setExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList, Type> extension,\n            Type value) {\n          
return (Builder) super.setExtension(extension, value);\n        }\n        public <Type> Builder setExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList, java.util.List<Type>> extension,\n            int index, Type value) {\n          return (Builder) super.setExtension(extension, index, value);\n        }\n        public <Type> Builder addExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList, java.util.List<Type>> extension,\n            Type value) {\n          return (Builder) super.addExtension(extension, value);\n        }\n        public <Type> Builder clearExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList, ?> extension) {\n          return (Builder) super.clearExtension(extension);\n        }\n        public Builder mergeFrom(com.google.protobuf.Message other) {\n          if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) {\n            return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList)other);\n          } else {\n            super.mergeFrom(other);\n            return this;\n          }\n        }\n\n        public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList other) {\n          if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList.getDefaultInstance()) return this;\n          if (propertysetBuilder_ == null) {\n            if (!other.propertyset_.isEmpty()) {\n              if (propertyset_.isEmpty()) {\n                propertyset_ = other.propertyset_;\n                bitField0_ = (bitField0_ & ~0x00000001);\n              } else {\n                ensurePropertysetIsMutable();\n            
    propertyset_.addAll(other.propertyset_);\n              }\n              onChanged();\n            }\n          } else {\n            if (!other.propertyset_.isEmpty()) {\n              if (propertysetBuilder_.isEmpty()) {\n                propertysetBuilder_.dispose();\n                propertysetBuilder_ = null;\n                propertyset_ = other.propertyset_;\n                bitField0_ = (bitField0_ & ~0x00000001);\n                propertysetBuilder_ = \n                  com.google.protobuf.GeneratedMessageV3.alwaysUseFieldBuilders ?\n                     getPropertysetFieldBuilder() : null;\n              } else {\n                propertysetBuilder_.addAllMessages(other.propertyset_);\n              }\n            }\n          }\n          this.mergeExtensionFields(other);\n          this.mergeUnknownFields(other.unknownFields);\n          onChanged();\n          return this;\n        }\n\n        public final boolean isInitialized() {\n          for (int i = 0; i < getPropertysetCount(); i++) {\n            if (!getPropertyset(i).isInitialized()) {\n              return false;\n            }\n          }\n          if (!extensionsAreInitialized()) {\n            return false;\n          }\n          return true;\n        }\n\n        public Builder mergeFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList parsedMessage = null;\n          try {\n            parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n          } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n            parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList) e.getUnfinishedMessage();\n            throw e.unwrapIOException();\n          } finally {\n            if (parsedMessage != null) {\n              
mergeFrom(parsedMessage);\n            }\n          }\n          return this;\n        }\n        private int bitField0_;\n\n        private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet> propertyset_ =\n          java.util.Collections.emptyList();\n        private void ensurePropertysetIsMutable() {\n          if (!((bitField0_ & 0x00000001) == 0x00000001)) {\n            propertyset_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet>(propertyset_);\n            bitField0_ |= 0x00000001;\n           }\n        }\n\n        private com.google.protobuf.RepeatedFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder> propertysetBuilder_;\n\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet> getPropertysetList() {\n          if (propertysetBuilder_ == null) {\n            return java.util.Collections.unmodifiableList(propertyset_);\n          } else {\n            return propertysetBuilder_.getMessageList();\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public int getPropertysetCount() {\n          if (propertysetBuilder_ == null) {\n            return propertyset_.size();\n          } else {\n            return propertysetBuilder_.getCount();\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet getPropertyset(int index) {\n          if (propertysetBuilder_ == null) {\n            
return propertyset_.get(index);\n          } else {\n            return propertysetBuilder_.getMessage(index);\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public Builder setPropertyset(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet value) {\n          if (propertysetBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensurePropertysetIsMutable();\n            propertyset_.set(index, value);\n            onChanged();\n          } else {\n            propertysetBuilder_.setMessage(index, value);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public Builder setPropertyset(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder builderForValue) {\n          if (propertysetBuilder_ == null) {\n            ensurePropertysetIsMutable();\n            propertyset_.set(index, builderForValue.build());\n            onChanged();\n          } else {\n            propertysetBuilder_.setMessage(index, builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public Builder addPropertyset(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet value) {\n          if (propertysetBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensurePropertysetIsMutable();\n            propertyset_.add(value);\n            onChanged();\n          } else {\n            propertysetBuilder_.addMessage(value);\n          }\n          return this;\n        }\n        /**\n  
       * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public Builder addPropertyset(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet value) {\n          if (propertysetBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            ensurePropertysetIsMutable();\n            propertyset_.add(index, value);\n            onChanged();\n          } else {\n            propertysetBuilder_.addMessage(index, value);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public Builder addPropertyset(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder builderForValue) {\n          if (propertysetBuilder_ == null) {\n            ensurePropertysetIsMutable();\n            propertyset_.add(builderForValue.build());\n            onChanged();\n          } else {\n            propertysetBuilder_.addMessage(builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public Builder addPropertyset(\n            int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder builderForValue) {\n          if (propertysetBuilder_ == null) {\n            ensurePropertysetIsMutable();\n            propertyset_.add(index, builderForValue.build());\n            onChanged();\n          } else {\n            propertysetBuilder_.addMessage(index, builderForValue.build());\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public Builder addAllPropertyset(\n            
java.lang.Iterable<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet> values) {\n          if (propertysetBuilder_ == null) {\n            ensurePropertysetIsMutable();\n            com.google.protobuf.AbstractMessageLite.Builder.addAll(\n                values, propertyset_);\n            onChanged();\n          } else {\n            propertysetBuilder_.addAllMessages(values);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public Builder clearPropertyset() {\n          if (propertysetBuilder_ == null) {\n            propertyset_ = java.util.Collections.emptyList();\n            bitField0_ = (bitField0_ & ~0x00000001);\n            onChanged();\n          } else {\n            propertysetBuilder_.clear();\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public Builder removePropertyset(int index) {\n          if (propertysetBuilder_ == null) {\n            ensurePropertysetIsMutable();\n            propertyset_.remove(index);\n            onChanged();\n          } else {\n            propertysetBuilder_.remove(index);\n          }\n          return this;\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder getPropertysetBuilder(\n            int index) {\n          return getPropertysetFieldBuilder().getBuilder(index);\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder getPropertysetOrBuilder(\n            int index) {\n          if 
(propertysetBuilder_ == null) {\n            return propertyset_.get(index);  } else {\n            return propertysetBuilder_.getMessageOrBuilder(index);\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder> \n             getPropertysetOrBuilderList() {\n          if (propertysetBuilder_ != null) {\n            return propertysetBuilder_.getMessageOrBuilderList();\n          } else {\n            return java.util.Collections.unmodifiableList(propertyset_);\n          }\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder addPropertysetBuilder() {\n          return getPropertysetFieldBuilder().addBuilder(\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance());\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder addPropertysetBuilder(\n            int index) {\n          return getPropertysetFieldBuilder().addBuilder(\n              index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance());\n        }\n        /**\n         * <code>repeated .org.eclipse.tahu.protobuf.Payload.PropertySet propertyset = 1;</code>\n         */\n        public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder> \n             getPropertysetBuilderList() {\n          return getPropertysetFieldBuilder().getBuilderList();\n        }\n        private com.google.protobuf.RepeatedFieldBuilderV3<\n            
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder> \n            getPropertysetFieldBuilder() {\n          if (propertysetBuilder_ == null) {\n            propertysetBuilder_ = new com.google.protobuf.RepeatedFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder>(\n                    propertyset_,\n                    ((bitField0_ & 0x00000001) == 0x00000001),\n                    getParentForChildren(),\n                    isClean());\n            propertyset_ = null;\n          }\n          return propertysetBuilder_;\n        }\n        public final Builder setUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.setUnknownFields(unknownFields);\n        }\n\n        public final Builder mergeUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.mergeUnknownFields(unknownFields);\n        }\n\n\n        // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.PropertySetList)\n      }\n\n      // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.PropertySetList)\n      private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList DEFAULT_INSTANCE;\n      static {\n        DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList();\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList getDefaultInstance() {\n        return DEFAULT_INSTANCE;\n      }\n\n      @java.lang.Deprecated public static final com.google.protobuf.Parser<PropertySetList>\n          PARSER = new 
com.google.protobuf.AbstractParser<PropertySetList>() {\n        public PropertySetList parsePartialFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n            return new PropertySetList(input, extensionRegistry);\n        }\n      };\n\n      public static com.google.protobuf.Parser<PropertySetList> parser() {\n        return PARSER;\n      }\n\n      @java.lang.Override\n      public com.google.protobuf.Parser<PropertySetList> getParserForType() {\n        return PARSER;\n      }\n\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetList getDefaultInstanceForType() {\n        return DEFAULT_INSTANCE;\n      }\n\n    }\n\n    public interface MetaDataOrBuilder extends\n        // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.MetaData)\n        com.google.protobuf.GeneratedMessageV3.\n            ExtendableMessageOrBuilder<MetaData> {\n\n      /**\n       * <pre>\n       * Bytes specific metadata\n       * </pre>\n       *\n       * <code>optional bool is_multi_part = 1;</code>\n       */\n      boolean hasIsMultiPart();\n      /**\n       * <pre>\n       * Bytes specific metadata\n       * </pre>\n       *\n       * <code>optional bool is_multi_part = 1;</code>\n       */\n      boolean getIsMultiPart();\n\n      /**\n       * <pre>\n       * General metadata\n       * </pre>\n       *\n       * <code>optional string content_type = 2;</code>\n       */\n      boolean hasContentType();\n      /**\n       * <pre>\n       * General metadata\n       * </pre>\n       *\n       * <code>optional string content_type = 2;</code>\n       */\n      java.lang.String getContentType();\n      /**\n       * <pre>\n       * General metadata\n       * </pre>\n       *\n       * <code>optional string content_type = 2;</code>\n       */\n      
com.google.protobuf.ByteString\n          getContentTypeBytes();\n\n      /**\n       * <pre>\n       * File size, String size, Multi-part size, etc\n       * </pre>\n       *\n       * <code>optional uint64 size = 3;</code>\n       */\n      boolean hasSize();\n      /**\n       * <pre>\n       * File size, String size, Multi-part size, etc\n       * </pre>\n       *\n       * <code>optional uint64 size = 3;</code>\n       */\n      long getSize();\n\n      /**\n       * <pre>\n       * Sequence number for multi-part messages\n       * </pre>\n       *\n       * <code>optional uint64 seq = 4;</code>\n       */\n      boolean hasSeq();\n      /**\n       * <pre>\n       * Sequence number for multi-part messages\n       * </pre>\n       *\n       * <code>optional uint64 seq = 4;</code>\n       */\n      long getSeq();\n\n      /**\n       * <pre>\n       * File metadata\n       * </pre>\n       *\n       * <code>optional string file_name = 5;</code>\n       */\n      boolean hasFileName();\n      /**\n       * <pre>\n       * File metadata\n       * </pre>\n       *\n       * <code>optional string file_name = 5;</code>\n       */\n      java.lang.String getFileName();\n      /**\n       * <pre>\n       * File metadata\n       * </pre>\n       *\n       * <code>optional string file_name = 5;</code>\n       */\n      com.google.protobuf.ByteString\n          getFileNameBytes();\n\n      /**\n       * <pre>\n       * File type (i.e. xml, json, txt, cpp, etc)\n       * </pre>\n       *\n       * <code>optional string file_type = 6;</code>\n       */\n      boolean hasFileType();\n      /**\n       * <pre>\n       * File type (i.e. xml, json, txt, cpp, etc)\n       * </pre>\n       *\n       * <code>optional string file_type = 6;</code>\n       */\n      java.lang.String getFileType();\n      /**\n       * <pre>\n       * File type (i.e. 
xml, json, txt, cpp, etc)\n       * </pre>\n       *\n       * <code>optional string file_type = 6;</code>\n       */\n      com.google.protobuf.ByteString\n          getFileTypeBytes();\n\n      /**\n       * <pre>\n       * md5 of data\n       * </pre>\n       *\n       * <code>optional string md5 = 7;</code>\n       */\n      boolean hasMd5();\n      /**\n       * <pre>\n       * md5 of data\n       * </pre>\n       *\n       * <code>optional string md5 = 7;</code>\n       */\n      java.lang.String getMd5();\n      /**\n       * <pre>\n       * md5 of data\n       * </pre>\n       *\n       * <code>optional string md5 = 7;</code>\n       */\n      com.google.protobuf.ByteString\n          getMd5Bytes();\n\n      /**\n       * <pre>\n       * Catchalls and future expansion\n       * </pre>\n       *\n       * <code>optional string description = 8;</code>\n       */\n      boolean hasDescription();\n      /**\n       * <pre>\n       * Catchalls and future expansion\n       * </pre>\n       *\n       * <code>optional string description = 8;</code>\n       */\n      java.lang.String getDescription();\n      /**\n       * <pre>\n       * Catchalls and future expansion\n       * </pre>\n       *\n       * <code>optional string description = 8;</code>\n       */\n      com.google.protobuf.ByteString\n          getDescriptionBytes();\n    }\n    /**\n     * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.MetaData}\n     */\n    public  static final class MetaData extends\n        com.google.protobuf.GeneratedMessageV3.ExtendableMessage<\n          MetaData> implements\n        // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.MetaData)\n        MetaDataOrBuilder {\n      // Use MetaData.newBuilder() to construct.\n      private MetaData(com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData, ?> builder) {\n        super(builder);\n      }\n      private MetaData() {\n     
   isMultiPart_ = false;\n        contentType_ = \"\";\n        size_ = 0L;\n        seq_ = 0L;\n        fileName_ = \"\";\n        fileType_ = \"\";\n        md5_ = \"\";\n        description_ = \"\";\n      }\n\n      @java.lang.Override\n      public final com.google.protobuf.UnknownFieldSet\n      getUnknownFields() {\n        return this.unknownFields;\n      }\n      private MetaData(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        this();\n        int mutable_bitField0_ = 0;\n        com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n            com.google.protobuf.UnknownFieldSet.newBuilder();\n        try {\n          boolean done = false;\n          while (!done) {\n            int tag = input.readTag();\n            switch (tag) {\n              case 0:\n                done = true;\n                break;\n              default: {\n                if (!parseUnknownField(input, unknownFields,\n                                       extensionRegistry, tag)) {\n                  done = true;\n                }\n                break;\n              }\n              case 8: {\n                bitField0_ |= 0x00000001;\n                isMultiPart_ = input.readBool();\n                break;\n              }\n              case 18: {\n                com.google.protobuf.ByteString bs = input.readBytes();\n                bitField0_ |= 0x00000002;\n                contentType_ = bs;\n                break;\n              }\n              case 24: {\n                bitField0_ |= 0x00000004;\n                size_ = input.readUInt64();\n                break;\n              }\n              case 32: {\n                bitField0_ |= 0x00000008;\n                seq_ = input.readUInt64();\n                break;\n              }\n              case 42: {\n                
com.google.protobuf.ByteString bs = input.readBytes();\n                bitField0_ |= 0x00000010;\n                fileName_ = bs;\n                break;\n              }\n              case 50: {\n                com.google.protobuf.ByteString bs = input.readBytes();\n                bitField0_ |= 0x00000020;\n                fileType_ = bs;\n                break;\n              }\n              case 58: {\n                com.google.protobuf.ByteString bs = input.readBytes();\n                bitField0_ |= 0x00000040;\n                md5_ = bs;\n                break;\n              }\n              case 66: {\n                com.google.protobuf.ByteString bs = input.readBytes();\n                bitField0_ |= 0x00000080;\n                description_ = bs;\n                break;\n              }\n            }\n          }\n        } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n          throw e.setUnfinishedMessage(this);\n        } catch (java.io.IOException e) {\n          throw new com.google.protobuf.InvalidProtocolBufferException(\n              e).setUnfinishedMessage(this);\n        } finally {\n          this.unknownFields = unknownFields.build();\n          makeExtensionsImmutable();\n        }\n      }\n      public static final com.google.protobuf.Descriptors.Descriptor\n          getDescriptor() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_MetaData_descriptor;\n      }\n\n      protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n          internalGetFieldAccessorTable() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_MetaData_fieldAccessorTable\n            .ensureFieldAccessorsInitialized(\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.Builder.class);\n      }\n\n      private int 
bitField0_;\n      public static final int IS_MULTI_PART_FIELD_NUMBER = 1;\n      private boolean isMultiPart_;\n      /**\n       * <pre>\n       * Bytes specific metadata\n       * </pre>\n       *\n       * <code>optional bool is_multi_part = 1;</code>\n       */\n      public boolean hasIsMultiPart() {\n        return ((bitField0_ & 0x00000001) == 0x00000001);\n      }\n      /**\n       * <pre>\n       * Bytes specific metadata\n       * </pre>\n       *\n       * <code>optional bool is_multi_part = 1;</code>\n       */\n      public boolean getIsMultiPart() {\n        return isMultiPart_;\n      }\n\n      public static final int CONTENT_TYPE_FIELD_NUMBER = 2;\n      private volatile java.lang.Object contentType_;\n      /**\n       * <pre>\n       * General metadata\n       * </pre>\n       *\n       * <code>optional string content_type = 2;</code>\n       */\n      public boolean hasContentType() {\n        return ((bitField0_ & 0x00000002) == 0x00000002);\n      }\n      /**\n       * <pre>\n       * General metadata\n       * </pre>\n       *\n       * <code>optional string content_type = 2;</code>\n       */\n      public java.lang.String getContentType() {\n        java.lang.Object ref = contentType_;\n        if (ref instanceof java.lang.String) {\n          return (java.lang.String) ref;\n        } else {\n          com.google.protobuf.ByteString bs = \n              (com.google.protobuf.ByteString) ref;\n          java.lang.String s = bs.toStringUtf8();\n          if (bs.isValidUtf8()) {\n            contentType_ = s;\n          }\n          return s;\n        }\n      }\n      /**\n       * <pre>\n       * General metadata\n       * </pre>\n       *\n       * <code>optional string content_type = 2;</code>\n       */\n      public com.google.protobuf.ByteString\n          getContentTypeBytes() {\n        java.lang.Object ref = contentType_;\n        if (ref instanceof java.lang.String) {\n          com.google.protobuf.ByteString b = \n              
com.google.protobuf.ByteString.copyFromUtf8(\n                  (java.lang.String) ref);\n          contentType_ = b;\n          return b;\n        } else {\n          return (com.google.protobuf.ByteString) ref;\n        }\n      }\n\n      public static final int SIZE_FIELD_NUMBER = 3;\n      private long size_;\n      /**\n       * <pre>\n       * File size, String size, Multi-part size, etc\n       * </pre>\n       *\n       * <code>optional uint64 size = 3;</code>\n       */\n      public boolean hasSize() {\n        return ((bitField0_ & 0x00000004) == 0x00000004);\n      }\n      /**\n       * <pre>\n       * File size, String size, Multi-part size, etc\n       * </pre>\n       *\n       * <code>optional uint64 size = 3;</code>\n       */\n      public long getSize() {\n        return size_;\n      }\n\n      public static final int SEQ_FIELD_NUMBER = 4;\n      private long seq_;\n      /**\n       * <pre>\n       * Sequence number for multi-part messages\n       * </pre>\n       *\n       * <code>optional uint64 seq = 4;</code>\n       */\n      public boolean hasSeq() {\n        return ((bitField0_ & 0x00000008) == 0x00000008);\n      }\n      /**\n       * <pre>\n       * Sequence number for multi-part messages\n       * </pre>\n       *\n       * <code>optional uint64 seq = 4;</code>\n       */\n      public long getSeq() {\n        return seq_;\n      }\n\n      public static final int FILE_NAME_FIELD_NUMBER = 5;\n      private volatile java.lang.Object fileName_;\n      /**\n       * <pre>\n       * File metadata\n       * </pre>\n       *\n       * <code>optional string file_name = 5;</code>\n       */\n      public boolean hasFileName() {\n        return ((bitField0_ & 0x00000010) == 0x00000010);\n      }\n      /**\n       * <pre>\n       * File metadata\n       * </pre>\n       *\n       * <code>optional string file_name = 5;</code>\n       */\n      public java.lang.String getFileName() {\n        java.lang.Object ref = fileName_;\n        if (ref 
instanceof java.lang.String) {\n          return (java.lang.String) ref;\n        } else {\n          com.google.protobuf.ByteString bs = \n              (com.google.protobuf.ByteString) ref;\n          java.lang.String s = bs.toStringUtf8();\n          if (bs.isValidUtf8()) {\n            fileName_ = s;\n          }\n          return s;\n        }\n      }\n      /**\n       * <pre>\n       * File metadata\n       * </pre>\n       *\n       * <code>optional string file_name = 5;</code>\n       */\n      public com.google.protobuf.ByteString\n          getFileNameBytes() {\n        java.lang.Object ref = fileName_;\n        if (ref instanceof java.lang.String) {\n          com.google.protobuf.ByteString b = \n              com.google.protobuf.ByteString.copyFromUtf8(\n                  (java.lang.String) ref);\n          fileName_ = b;\n          return b;\n        } else {\n          return (com.google.protobuf.ByteString) ref;\n        }\n      }\n\n      public static final int FILE_TYPE_FIELD_NUMBER = 6;\n      private volatile java.lang.Object fileType_;\n      /**\n       * <pre>\n       * File type (i.e. xml, json, txt, cpp, etc)\n       * </pre>\n       *\n       * <code>optional string file_type = 6;</code>\n       */\n      public boolean hasFileType() {\n        return ((bitField0_ & 0x00000020) == 0x00000020);\n      }\n      /**\n       * <pre>\n       * File type (i.e. 
xml, json, txt, cpp, etc)\n       * </pre>\n       *\n       * <code>optional string file_type = 6;</code>\n       */\n      public java.lang.String getFileType() {\n        java.lang.Object ref = fileType_;\n        if (ref instanceof java.lang.String) {\n          return (java.lang.String) ref;\n        } else {\n          com.google.protobuf.ByteString bs = \n              (com.google.protobuf.ByteString) ref;\n          java.lang.String s = bs.toStringUtf8();\n          if (bs.isValidUtf8()) {\n            fileType_ = s;\n          }\n          return s;\n        }\n      }\n      /**\n       * <pre>\n       * File type (i.e. xml, json, txt, cpp, etc)\n       * </pre>\n       *\n       * <code>optional string file_type = 6;</code>\n       */\n      public com.google.protobuf.ByteString\n          getFileTypeBytes() {\n        java.lang.Object ref = fileType_;\n        if (ref instanceof java.lang.String) {\n          com.google.protobuf.ByteString b = \n              com.google.protobuf.ByteString.copyFromUtf8(\n                  (java.lang.String) ref);\n          fileType_ = b;\n          return b;\n        } else {\n          return (com.google.protobuf.ByteString) ref;\n        }\n      }\n\n      public static final int MD5_FIELD_NUMBER = 7;\n      private volatile java.lang.Object md5_;\n      /**\n       * <pre>\n       * md5 of data\n       * </pre>\n       *\n       * <code>optional string md5 = 7;</code>\n       */\n      public boolean hasMd5() {\n        return ((bitField0_ & 0x00000040) == 0x00000040);\n      }\n      /**\n       * <pre>\n       * md5 of data\n       * </pre>\n       *\n       * <code>optional string md5 = 7;</code>\n       */\n      public java.lang.String getMd5() {\n        java.lang.Object ref = md5_;\n        if (ref instanceof java.lang.String) {\n          return (java.lang.String) ref;\n        } else {\n          com.google.protobuf.ByteString bs = \n              (com.google.protobuf.ByteString) ref;\n          
java.lang.String s = bs.toStringUtf8();\n          if (bs.isValidUtf8()) {\n            md5_ = s;\n          }\n          return s;\n        }\n      }\n      /**\n       * <pre>\n       * md5 of data\n       * </pre>\n       *\n       * <code>optional string md5 = 7;</code>\n       */\n      public com.google.protobuf.ByteString\n          getMd5Bytes() {\n        java.lang.Object ref = md5_;\n        if (ref instanceof java.lang.String) {\n          com.google.protobuf.ByteString b = \n              com.google.protobuf.ByteString.copyFromUtf8(\n                  (java.lang.String) ref);\n          md5_ = b;\n          return b;\n        } else {\n          return (com.google.protobuf.ByteString) ref;\n        }\n      }\n\n      public static final int DESCRIPTION_FIELD_NUMBER = 8;\n      private volatile java.lang.Object description_;\n      /**\n       * <pre>\n       * Catchalls and future expansion\n       * </pre>\n       *\n       * <code>optional string description = 8;</code>\n       */\n      public boolean hasDescription() {\n        return ((bitField0_ & 0x00000080) == 0x00000080);\n      }\n      /**\n       * <pre>\n       * Catchalls and future expansion\n       * </pre>\n       *\n       * <code>optional string description = 8;</code>\n       */\n      public java.lang.String getDescription() {\n        java.lang.Object ref = description_;\n        if (ref instanceof java.lang.String) {\n          return (java.lang.String) ref;\n        } else {\n          com.google.protobuf.ByteString bs = \n              (com.google.protobuf.ByteString) ref;\n          java.lang.String s = bs.toStringUtf8();\n          if (bs.isValidUtf8()) {\n            description_ = s;\n          }\n          return s;\n        }\n      }\n      /**\n       * <pre>\n       * Catchalls and future expansion\n       * </pre>\n       *\n       * <code>optional string description = 8;</code>\n       */\n      public com.google.protobuf.ByteString\n          getDescriptionBytes() 
{\n        java.lang.Object ref = description_;\n        if (ref instanceof java.lang.String) {\n          com.google.protobuf.ByteString b = \n              com.google.protobuf.ByteString.copyFromUtf8(\n                  (java.lang.String) ref);\n          description_ = b;\n          return b;\n        } else {\n          return (com.google.protobuf.ByteString) ref;\n        }\n      }\n\n      private byte memoizedIsInitialized = -1;\n      public final boolean isInitialized() {\n        byte isInitialized = memoizedIsInitialized;\n        if (isInitialized == 1) return true;\n        if (isInitialized == 0) return false;\n\n        if (!extensionsAreInitialized()) {\n          memoizedIsInitialized = 0;\n          return false;\n        }\n        memoizedIsInitialized = 1;\n        return true;\n      }\n\n      public void writeTo(com.google.protobuf.CodedOutputStream output)\n                          throws java.io.IOException {\n        com.google.protobuf.GeneratedMessageV3\n          .ExtendableMessage<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData>.ExtensionWriter\n            extensionWriter = newExtensionWriter();\n        if (((bitField0_ & 0x00000001) == 0x00000001)) {\n          output.writeBool(1, isMultiPart_);\n        }\n        if (((bitField0_ & 0x00000002) == 0x00000002)) {\n          com.google.protobuf.GeneratedMessageV3.writeString(output, 2, contentType_);\n        }\n        if (((bitField0_ & 0x00000004) == 0x00000004)) {\n          output.writeUInt64(3, size_);\n        }\n        if (((bitField0_ & 0x00000008) == 0x00000008)) {\n          output.writeUInt64(4, seq_);\n        }\n        if (((bitField0_ & 0x00000010) == 0x00000010)) {\n          com.google.protobuf.GeneratedMessageV3.writeString(output, 5, fileName_);\n        }\n        if (((bitField0_ & 0x00000020) == 0x00000020)) {\n          com.google.protobuf.GeneratedMessageV3.writeString(output, 6, fileType_);\n        }\n        if (((bitField0_ & 0x00000040) == 
0x00000040)) {\n          com.google.protobuf.GeneratedMessageV3.writeString(output, 7, md5_);\n        }\n        if (((bitField0_ & 0x00000080) == 0x00000080)) {\n          com.google.protobuf.GeneratedMessageV3.writeString(output, 8, description_);\n        }\n        extensionWriter.writeUntil(536870912, output);\n        unknownFields.writeTo(output);\n      }\n\n      public int getSerializedSize() {\n        int size = memoizedSize;\n        if (size != -1) return size;\n\n        size = 0;\n        if (((bitField0_ & 0x00000001) == 0x00000001)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeBoolSize(1, isMultiPart_);\n        }\n        if (((bitField0_ & 0x00000002) == 0x00000002)) {\n          size += com.google.protobuf.GeneratedMessageV3.computeStringSize(2, contentType_);\n        }\n        if (((bitField0_ & 0x00000004) == 0x00000004)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeUInt64Size(3, size_);\n        }\n        if (((bitField0_ & 0x00000008) == 0x00000008)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeUInt64Size(4, seq_);\n        }\n        if (((bitField0_ & 0x00000010) == 0x00000010)) {\n          size += com.google.protobuf.GeneratedMessageV3.computeStringSize(5, fileName_);\n        }\n        if (((bitField0_ & 0x00000020) == 0x00000020)) {\n          size += com.google.protobuf.GeneratedMessageV3.computeStringSize(6, fileType_);\n        }\n        if (((bitField0_ & 0x00000040) == 0x00000040)) {\n          size += com.google.protobuf.GeneratedMessageV3.computeStringSize(7, md5_);\n        }\n        if (((bitField0_ & 0x00000080) == 0x00000080)) {\n          size += com.google.protobuf.GeneratedMessageV3.computeStringSize(8, description_);\n        }\n        size += extensionsSerializedSize();\n        size += unknownFields.getSerializedSize();\n        memoizedSize = size;\n        return size;\n      }\n\n      private static 
final long serialVersionUID = 0L;\n      @java.lang.Override\n      public boolean equals(final java.lang.Object obj) {\n        if (obj == this) {\n         return true;\n        }\n        if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData)) {\n          return super.equals(obj);\n        }\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData) obj;\n\n        boolean result = true;\n        result = result && (hasIsMultiPart() == other.hasIsMultiPart());\n        if (hasIsMultiPart()) {\n          result = result && (getIsMultiPart()\n              == other.getIsMultiPart());\n        }\n        result = result && (hasContentType() == other.hasContentType());\n        if (hasContentType()) {\n          result = result && getContentType()\n              .equals(other.getContentType());\n        }\n        result = result && (hasSize() == other.hasSize());\n        if (hasSize()) {\n          result = result && (getSize()\n              == other.getSize());\n        }\n        result = result && (hasSeq() == other.hasSeq());\n        if (hasSeq()) {\n          result = result && (getSeq()\n              == other.getSeq());\n        }\n        result = result && (hasFileName() == other.hasFileName());\n        if (hasFileName()) {\n          result = result && getFileName()\n              .equals(other.getFileName());\n        }\n        result = result && (hasFileType() == other.hasFileType());\n        if (hasFileType()) {\n          result = result && getFileType()\n              .equals(other.getFileType());\n        }\n        result = result && (hasMd5() == other.hasMd5());\n        if (hasMd5()) {\n          result = result && getMd5()\n              .equals(other.getMd5());\n        }\n        result = result && (hasDescription() == other.hasDescription());\n        if (hasDescription()) {\n          result = result && getDescription()\n            
  .equals(other.getDescription());\n        }\n        result = result && unknownFields.equals(other.unknownFields);\n        result = result &&\n            getExtensionFields().equals(other.getExtensionFields());\n        return result;\n      }\n\n      @java.lang.Override\n      public int hashCode() {\n        if (memoizedHashCode != 0) {\n          return memoizedHashCode;\n        }\n        int hash = 41;\n        hash = (19 * hash) + getDescriptorForType().hashCode();\n        if (hasIsMultiPart()) {\n          hash = (37 * hash) + IS_MULTI_PART_FIELD_NUMBER;\n          hash = (53 * hash) + com.google.protobuf.Internal.hashBoolean(\n              getIsMultiPart());\n        }\n        if (hasContentType()) {\n          hash = (37 * hash) + CONTENT_TYPE_FIELD_NUMBER;\n          hash = (53 * hash) + getContentType().hashCode();\n        }\n        if (hasSize()) {\n          hash = (37 * hash) + SIZE_FIELD_NUMBER;\n          hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n              getSize());\n        }\n        if (hasSeq()) {\n          hash = (37 * hash) + SEQ_FIELD_NUMBER;\n          hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n              getSeq());\n        }\n        if (hasFileName()) {\n          hash = (37 * hash) + FILE_NAME_FIELD_NUMBER;\n          hash = (53 * hash) + getFileName().hashCode();\n        }\n        if (hasFileType()) {\n          hash = (37 * hash) + FILE_TYPE_FIELD_NUMBER;\n          hash = (53 * hash) + getFileType().hashCode();\n        }\n        if (hasMd5()) {\n          hash = (37 * hash) + MD5_FIELD_NUMBER;\n          hash = (53 * hash) + getMd5().hashCode();\n        }\n        if (hasDescription()) {\n          hash = (37 * hash) + DESCRIPTION_FIELD_NUMBER;\n          hash = (53 * hash) + getDescription().hashCode();\n        }\n        hash = hashFields(hash, getExtensionFields());\n        hash = (29 * hash) + unknownFields.hashCode();\n        memoizedHashCode = hash;\n        
return hash;\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData parseFrom(\n          com.google.protobuf.ByteString data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData parseFrom(\n          com.google.protobuf.ByteString data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData parseFrom(byte[] data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData parseFrom(\n          byte[] data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData parseFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData parseFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData parseDelimitedFrom(java.io.InputStream input)\n          throws java.io.IOException {\n    
    return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData parseDelimitedFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData parseFrom(\n          com.google.protobuf.CodedInputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData parseFrom(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n\n      public Builder newBuilderForType() { return newBuilder(); }\n      public static Builder newBuilder() {\n        return DEFAULT_INSTANCE.toBuilder();\n      }\n      public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData prototype) {\n        return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n      }\n      public Builder toBuilder() {\n        return this == DEFAULT_INSTANCE\n            ? 
new Builder() : new Builder().mergeFrom(this);\n      }\n\n      @java.lang.Override\n      protected Builder newBuilderForType(\n          com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n        Builder builder = new Builder(parent);\n        return builder;\n      }\n      /**\n       * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.MetaData}\n       */\n      public static final class Builder extends\n          com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData, Builder> implements\n          // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.MetaData)\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaDataOrBuilder {\n        public static final com.google.protobuf.Descriptors.Descriptor\n            getDescriptor() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_MetaData_descriptor;\n        }\n\n        protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n            internalGetFieldAccessorTable() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_MetaData_fieldAccessorTable\n              .ensureFieldAccessorsInitialized(\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.Builder.class);\n        }\n\n        // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.newBuilder()\n        private Builder() {\n          maybeForceBuilderInitialization();\n        }\n\n        private Builder(\n            com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n          super(parent);\n          maybeForceBuilderInitialization();\n        }\n        private void maybeForceBuilderInitialization() {\n          if (com.google.protobuf.GeneratedMessageV3\n       
           .alwaysUseFieldBuilders) {\n          }\n        }\n        public Builder clear() {\n          super.clear();\n          isMultiPart_ = false;\n          bitField0_ = (bitField0_ & ~0x00000001);\n          contentType_ = \"\";\n          bitField0_ = (bitField0_ & ~0x00000002);\n          size_ = 0L;\n          bitField0_ = (bitField0_ & ~0x00000004);\n          seq_ = 0L;\n          bitField0_ = (bitField0_ & ~0x00000008);\n          fileName_ = \"\";\n          bitField0_ = (bitField0_ & ~0x00000010);\n          fileType_ = \"\";\n          bitField0_ = (bitField0_ & ~0x00000020);\n          md5_ = \"\";\n          bitField0_ = (bitField0_ & ~0x00000040);\n          description_ = \"\";\n          bitField0_ = (bitField0_ & ~0x00000080);\n          return this;\n        }\n\n        public com.google.protobuf.Descriptors.Descriptor\n            getDescriptorForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_MetaData_descriptor;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData getDefaultInstanceForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.getDefaultInstance();\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData build() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData result = buildPartial();\n          if (!result.isInitialized()) {\n            throw newUninitializedMessageException(result);\n          }\n          return result;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData buildPartial() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData(this);\n          int from_bitField0_ = bitField0_;\n          int to_bitField0_ = 0;\n          if (((from_bitField0_ & 0x00000001) == 0x00000001)) {\n            
to_bitField0_ |= 0x00000001;\n          }\n          result.isMultiPart_ = isMultiPart_;\n          if (((from_bitField0_ & 0x00000002) == 0x00000002)) {\n            to_bitField0_ |= 0x00000002;\n          }\n          result.contentType_ = contentType_;\n          if (((from_bitField0_ & 0x00000004) == 0x00000004)) {\n            to_bitField0_ |= 0x00000004;\n          }\n          result.size_ = size_;\n          if (((from_bitField0_ & 0x00000008) == 0x00000008)) {\n            to_bitField0_ |= 0x00000008;\n          }\n          result.seq_ = seq_;\n          if (((from_bitField0_ & 0x00000010) == 0x00000010)) {\n            to_bitField0_ |= 0x00000010;\n          }\n          result.fileName_ = fileName_;\n          if (((from_bitField0_ & 0x00000020) == 0x00000020)) {\n            to_bitField0_ |= 0x00000020;\n          }\n          result.fileType_ = fileType_;\n          if (((from_bitField0_ & 0x00000040) == 0x00000040)) {\n            to_bitField0_ |= 0x00000040;\n          }\n          result.md5_ = md5_;\n          if (((from_bitField0_ & 0x00000080) == 0x00000080)) {\n            to_bitField0_ |= 0x00000080;\n          }\n          result.description_ = description_;\n          result.bitField0_ = to_bitField0_;\n          onBuilt();\n          return result;\n        }\n\n        public Builder clone() {\n          return (Builder) super.clone();\n        }\n        public Builder setField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.setField(field, value);\n        }\n        public Builder clearField(\n            com.google.protobuf.Descriptors.FieldDescriptor field) {\n          return (Builder) super.clearField(field);\n        }\n        public Builder clearOneof(\n            com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n          return (Builder) super.clearOneof(oneof);\n        }\n        public Builder setRepeatedField(\n            
com.google.protobuf.Descriptors.FieldDescriptor field,\n            int index, Object value) {\n          return (Builder) super.setRepeatedField(field, index, value);\n        }\n        public Builder addRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.addRepeatedField(field, value);\n        }\n        public <Type> Builder setExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData, Type> extension,\n            Type value) {\n          return (Builder) super.setExtension(extension, value);\n        }\n        public <Type> Builder setExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData, java.util.List<Type>> extension,\n            int index, Type value) {\n          return (Builder) super.setExtension(extension, index, value);\n        }\n        public <Type> Builder addExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData, java.util.List<Type>> extension,\n            Type value) {\n          return (Builder) super.addExtension(extension, value);\n        }\n        public <Type> Builder clearExtension(\n            com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData, ?> extension) {\n          return (Builder) super.clearExtension(extension);\n        }\n        public Builder mergeFrom(com.google.protobuf.Message other) {\n          if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData) {\n            return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData)other);\n          } else {\n            super.mergeFrom(other);\n            
return this;\n          }\n        }\n\n        public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData other) {\n          if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.getDefaultInstance()) return this;\n          if (other.hasIsMultiPart()) {\n            setIsMultiPart(other.getIsMultiPart());\n          }\n          if (other.hasContentType()) {\n            bitField0_ |= 0x00000002;\n            contentType_ = other.contentType_;\n            onChanged();\n          }\n          if (other.hasSize()) {\n            setSize(other.getSize());\n          }\n          if (other.hasSeq()) {\n            setSeq(other.getSeq());\n          }\n          if (other.hasFileName()) {\n            bitField0_ |= 0x00000010;\n            fileName_ = other.fileName_;\n            onChanged();\n          }\n          if (other.hasFileType()) {\n            bitField0_ |= 0x00000020;\n            fileType_ = other.fileType_;\n            onChanged();\n          }\n          if (other.hasMd5()) {\n            bitField0_ |= 0x00000040;\n            md5_ = other.md5_;\n            onChanged();\n          }\n          if (other.hasDescription()) {\n            bitField0_ |= 0x00000080;\n            description_ = other.description_;\n            onChanged();\n          }\n          this.mergeExtensionFields(other);\n          this.mergeUnknownFields(other.unknownFields);\n          onChanged();\n          return this;\n        }\n\n        public final boolean isInitialized() {\n          if (!extensionsAreInitialized()) {\n            return false;\n          }\n          return true;\n        }\n\n        public Builder mergeFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData parsedMessage = null;\n          try {\n            
parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n          } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n            parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData) e.getUnfinishedMessage();\n            throw e.unwrapIOException();\n          } finally {\n            if (parsedMessage != null) {\n              mergeFrom(parsedMessage);\n            }\n          }\n          return this;\n        }\n        private int bitField0_;\n\n        private boolean isMultiPart_ ;\n        /**\n         * <pre>\n         * Bytes specific metadata\n         * </pre>\n         *\n         * <code>optional bool is_multi_part = 1;</code>\n         */\n        public boolean hasIsMultiPart() {\n          return ((bitField0_ & 0x00000001) == 0x00000001);\n        }\n        /**\n         * <pre>\n         * Bytes specific metadata\n         * </pre>\n         *\n         * <code>optional bool is_multi_part = 1;</code>\n         */\n        public boolean getIsMultiPart() {\n          return isMultiPart_;\n        }\n        /**\n         * <pre>\n         * Bytes specific metadata\n         * </pre>\n         *\n         * <code>optional bool is_multi_part = 1;</code>\n         */\n        public Builder setIsMultiPart(boolean value) {\n          bitField0_ |= 0x00000001;\n          isMultiPart_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Bytes specific metadata\n         * </pre>\n         *\n         * <code>optional bool is_multi_part = 1;</code>\n         */\n        public Builder clearIsMultiPart() {\n          bitField0_ = (bitField0_ & ~0x00000001);\n          isMultiPart_ = false;\n          onChanged();\n          return this;\n        }\n\n        private java.lang.Object contentType_ = \"\";\n        /**\n         * <pre>\n         * General metadata\n         * </pre>\n         *\n         * <code>optional string content_type = 
2;</code>\n         */\n        public boolean hasContentType() {\n          return ((bitField0_ & 0x00000002) == 0x00000002);\n        }\n        /**\n         * <pre>\n         * General metadata\n         * </pre>\n         *\n         * <code>optional string content_type = 2;</code>\n         */\n        public java.lang.String getContentType() {\n          java.lang.Object ref = contentType_;\n          if (!(ref instanceof java.lang.String)) {\n            com.google.protobuf.ByteString bs =\n                (com.google.protobuf.ByteString) ref;\n            java.lang.String s = bs.toStringUtf8();\n            if (bs.isValidUtf8()) {\n              contentType_ = s;\n            }\n            return s;\n          } else {\n            return (java.lang.String) ref;\n          }\n        }\n        /**\n         * <pre>\n         * General metadata\n         * </pre>\n         *\n         * <code>optional string content_type = 2;</code>\n         */\n        public com.google.protobuf.ByteString\n            getContentTypeBytes() {\n          java.lang.Object ref = contentType_;\n          if (ref instanceof String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n                    (java.lang.String) ref);\n            contentType_ = b;\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n        /**\n         * <pre>\n         * General metadata\n         * </pre>\n         *\n         * <code>optional string content_type = 2;</code>\n         */\n        public Builder setContentType(\n            java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000002;\n          contentType_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * General metadata\n         * </pre>\n         *\n         * 
<code>optional string content_type = 2;</code>\n         */\n        public Builder clearContentType() {\n          bitField0_ = (bitField0_ & ~0x00000002);\n          contentType_ = getDefaultInstance().getContentType();\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * General metadata\n         * </pre>\n         *\n         * <code>optional string content_type = 2;</code>\n         */\n        public Builder setContentTypeBytes(\n            com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000002;\n          contentType_ = value;\n          onChanged();\n          return this;\n        }\n\n        private long size_ ;\n        /**\n         * <pre>\n         * File size, String size, Multi-part size, etc\n         * </pre>\n         *\n         * <code>optional uint64 size = 3;</code>\n         */\n        public boolean hasSize() {\n          return ((bitField0_ & 0x00000004) == 0x00000004);\n        }\n        /**\n         * <pre>\n         * File size, String size, Multi-part size, etc\n         * </pre>\n         *\n         * <code>optional uint64 size = 3;</code>\n         */\n        public long getSize() {\n          return size_;\n        }\n        /**\n         * <pre>\n         * File size, String size, Multi-part size, etc\n         * </pre>\n         *\n         * <code>optional uint64 size = 3;</code>\n         */\n        public Builder setSize(long value) {\n          bitField0_ |= 0x00000004;\n          size_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * File size, String size, Multi-part size, etc\n         * </pre>\n         *\n         * <code>optional uint64 size = 3;</code>\n         */\n        public Builder clearSize() {\n          bitField0_ = (bitField0_ & ~0x00000004);\n          size_ = 0L;\n          onChanged();\n          
return this;\n        }\n\n        private long seq_ ;\n        /**\n         * <pre>\n         * Sequence number for multi-part messages\n         * </pre>\n         *\n         * <code>optional uint64 seq = 4;</code>\n         */\n        public boolean hasSeq() {\n          return ((bitField0_ & 0x00000008) == 0x00000008);\n        }\n        /**\n         * <pre>\n         * Sequence number for multi-part messages\n         * </pre>\n         *\n         * <code>optional uint64 seq = 4;</code>\n         */\n        public long getSeq() {\n          return seq_;\n        }\n        /**\n         * <pre>\n         * Sequence number for multi-part messages\n         * </pre>\n         *\n         * <code>optional uint64 seq = 4;</code>\n         */\n        public Builder setSeq(long value) {\n          bitField0_ |= 0x00000008;\n          seq_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Sequence number for multi-part messages\n         * </pre>\n         *\n         * <code>optional uint64 seq = 4;</code>\n         */\n        public Builder clearSeq() {\n          bitField0_ = (bitField0_ & ~0x00000008);\n          seq_ = 0L;\n          onChanged();\n          return this;\n        }\n\n        private java.lang.Object fileName_ = \"\";\n        /**\n         * <pre>\n         * File metadata\n         * </pre>\n         *\n         * <code>optional string file_name = 5;</code>\n         */\n        public boolean hasFileName() {\n          return ((bitField0_ & 0x00000010) == 0x00000010);\n        }\n        /**\n         * <pre>\n         * File metadata\n         * </pre>\n         *\n         * <code>optional string file_name = 5;</code>\n         */\n        public java.lang.String getFileName() {\n          java.lang.Object ref = fileName_;\n          if (!(ref instanceof java.lang.String)) {\n            com.google.protobuf.ByteString bs =\n                (com.google.protobuf.ByteString) 
ref;\n            java.lang.String s = bs.toStringUtf8();\n            if (bs.isValidUtf8()) {\n              fileName_ = s;\n            }\n            return s;\n          } else {\n            return (java.lang.String) ref;\n          }\n        }\n        /**\n         * <pre>\n         * File metadata\n         * </pre>\n         *\n         * <code>optional string file_name = 5;</code>\n         */\n        public com.google.protobuf.ByteString\n            getFileNameBytes() {\n          java.lang.Object ref = fileName_;\n          if (ref instanceof String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n                    (java.lang.String) ref);\n            fileName_ = b;\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n        /**\n         * <pre>\n         * File metadata\n         * </pre>\n         *\n         * <code>optional string file_name = 5;</code>\n         */\n        public Builder setFileName(\n            java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000010;\n          fileName_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * File metadata\n         * </pre>\n         *\n         * <code>optional string file_name = 5;</code>\n         */\n        public Builder clearFileName() {\n          bitField0_ = (bitField0_ & ~0x00000010);\n          fileName_ = getDefaultInstance().getFileName();\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * File metadata\n         * </pre>\n         *\n         * <code>optional string file_name = 5;</code>\n         */\n        public Builder setFileNameBytes(\n            com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new 
NullPointerException();\n  }\n  bitField0_ |= 0x00000010;\n          fileName_ = value;\n          onChanged();\n          return this;\n        }\n\n        private java.lang.Object fileType_ = \"\";\n        /**\n         * <pre>\n         * File type (i.e. xml, json, txt, cpp, etc)\n         * </pre>\n         *\n         * <code>optional string file_type = 6;</code>\n         */\n        public boolean hasFileType() {\n          return ((bitField0_ & 0x00000020) == 0x00000020);\n        }\n        /**\n         * <pre>\n         * File type (i.e. xml, json, txt, cpp, etc)\n         * </pre>\n         *\n         * <code>optional string file_type = 6;</code>\n         */\n        public java.lang.String getFileType() {\n          java.lang.Object ref = fileType_;\n          if (!(ref instanceof java.lang.String)) {\n            com.google.protobuf.ByteString bs =\n                (com.google.protobuf.ByteString) ref;\n            java.lang.String s = bs.toStringUtf8();\n            if (bs.isValidUtf8()) {\n              fileType_ = s;\n            }\n            return s;\n          } else {\n            return (java.lang.String) ref;\n          }\n        }\n        /**\n         * <pre>\n         * File type (i.e. xml, json, txt, cpp, etc)\n         * </pre>\n         *\n         * <code>optional string file_type = 6;</code>\n         */\n        public com.google.protobuf.ByteString\n            getFileTypeBytes() {\n          java.lang.Object ref = fileType_;\n          if (ref instanceof String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n                    (java.lang.String) ref);\n            fileType_ = b;\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n        /**\n         * <pre>\n         * File type (i.e. 
xml, json, txt, cpp, etc)\n         * </pre>\n         *\n         * <code>optional string file_type = 6;</code>\n         */\n        public Builder setFileType(\n            java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000020;\n          fileType_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * File type (i.e. xml, json, txt, cpp, etc)\n         * </pre>\n         *\n         * <code>optional string file_type = 6;</code>\n         */\n        public Builder clearFileType() {\n          bitField0_ = (bitField0_ & ~0x00000020);\n          fileType_ = getDefaultInstance().getFileType();\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * File type (i.e. xml, json, txt, cpp, etc)\n         * </pre>\n         *\n         * <code>optional string file_type = 6;</code>\n         */\n        public Builder setFileTypeBytes(\n            com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000020;\n          fileType_ = value;\n          onChanged();\n          return this;\n        }\n\n        private java.lang.Object md5_ = \"\";\n        /**\n         * <pre>\n         * md5 of data\n         * </pre>\n         *\n         * <code>optional string md5 = 7;</code>\n         */\n        public boolean hasMd5() {\n          return ((bitField0_ & 0x00000040) == 0x00000040);\n        }\n        /**\n         * <pre>\n         * md5 of data\n         * </pre>\n         *\n         * <code>optional string md5 = 7;</code>\n         */\n        public java.lang.String getMd5() {\n          java.lang.Object ref = md5_;\n          if (!(ref instanceof java.lang.String)) {\n            com.google.protobuf.ByteString bs =\n                (com.google.protobuf.ByteString) ref;\n            java.lang.String s = 
bs.toStringUtf8();\n            if (bs.isValidUtf8()) {\n              md5_ = s;\n            }\n            return s;\n          } else {\n            return (java.lang.String) ref;\n          }\n        }\n        /**\n         * <pre>\n         * md5 of data\n         * </pre>\n         *\n         * <code>optional string md5 = 7;</code>\n         */\n        public com.google.protobuf.ByteString\n            getMd5Bytes() {\n          java.lang.Object ref = md5_;\n          if (ref instanceof String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n                    (java.lang.String) ref);\n            md5_ = b;\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n        /**\n         * <pre>\n         * md5 of data\n         * </pre>\n         *\n         * <code>optional string md5 = 7;</code>\n         */\n        public Builder setMd5(\n            java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000040;\n          md5_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * md5 of data\n         * </pre>\n         *\n         * <code>optional string md5 = 7;</code>\n         */\n        public Builder clearMd5() {\n          bitField0_ = (bitField0_ & ~0x00000040);\n          md5_ = getDefaultInstance().getMd5();\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * md5 of data\n         * </pre>\n         *\n         * <code>optional string md5 = 7;</code>\n         */\n        public Builder setMd5Bytes(\n            com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000040;\n          md5_ = value;\n          onChanged();\n          return this;\n        
}\n\n        private java.lang.Object description_ = \"\";\n        /**\n         * <pre>\n         * Catchalls and future expansion\n         * </pre>\n         *\n         * <code>optional string description = 8;</code>\n         */\n        public boolean hasDescription() {\n          return ((bitField0_ & 0x00000080) == 0x00000080);\n        }\n        /**\n         * <pre>\n         * Catchalls and future expansion\n         * </pre>\n         *\n         * <code>optional string description = 8;</code>\n         */\n        public java.lang.String getDescription() {\n          java.lang.Object ref = description_;\n          if (!(ref instanceof java.lang.String)) {\n            com.google.protobuf.ByteString bs =\n                (com.google.protobuf.ByteString) ref;\n            java.lang.String s = bs.toStringUtf8();\n            if (bs.isValidUtf8()) {\n              description_ = s;\n            }\n            return s;\n          } else {\n            return (java.lang.String) ref;\n          }\n        }\n        /**\n         * <pre>\n         * Catchalls and future expansion\n         * </pre>\n         *\n         * <code>optional string description = 8;</code>\n         */\n        public com.google.protobuf.ByteString\n            getDescriptionBytes() {\n          java.lang.Object ref = description_;\n          if (ref instanceof String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n                    (java.lang.String) ref);\n            description_ = b;\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n        /**\n         * <pre>\n         * Catchalls and future expansion\n         * </pre>\n         *\n         * <code>optional string description = 8;</code>\n         */\n        public Builder setDescription(\n            java.lang.String value) {\n          if (value == null) {\n    throw new 
NullPointerException();\n  }\n  bitField0_ |= 0x00000080;\n          description_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Catchalls and future expansion\n         * </pre>\n         *\n         * <code>optional string description = 8;</code>\n         */\n        public Builder clearDescription() {\n          bitField0_ = (bitField0_ & ~0x00000080);\n          description_ = getDefaultInstance().getDescription();\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Catchalls and future expansion\n         * </pre>\n         *\n         * <code>optional string description = 8;</code>\n         */\n        public Builder setDescriptionBytes(\n            com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000080;\n          description_ = value;\n          onChanged();\n          return this;\n        }\n        public final Builder setUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.setUnknownFields(unknownFields);\n        }\n\n        public final Builder mergeUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.mergeUnknownFields(unknownFields);\n        }\n\n\n        // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.MetaData)\n      }\n\n      // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.MetaData)\n      private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData DEFAULT_INSTANCE;\n      static {\n        DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData();\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData getDefaultInstance() {\n        return DEFAULT_INSTANCE;\n      }\n\n      
@java.lang.Deprecated public static final com.google.protobuf.Parser<MetaData>\n          PARSER = new com.google.protobuf.AbstractParser<MetaData>() {\n        public MetaData parsePartialFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n            return new MetaData(input, extensionRegistry);\n        }\n      };\n\n      public static com.google.protobuf.Parser<MetaData> parser() {\n        return PARSER;\n      }\n\n      @java.lang.Override\n      public com.google.protobuf.Parser<MetaData> getParserForType() {\n        return PARSER;\n      }\n\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData getDefaultInstanceForType() {\n        return DEFAULT_INSTANCE;\n      }\n\n    }\n\n    public interface MetricOrBuilder extends\n        // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.Metric)\n        com.google.protobuf.MessageOrBuilder {\n\n      /**\n       * <pre>\n       * Metric name - should only be included on birth\n       * </pre>\n       *\n       * <code>optional string name = 1;</code>\n       */\n      boolean hasName();\n      /**\n       * <pre>\n       * Metric name - should only be included on birth\n       * </pre>\n       *\n       * <code>optional string name = 1;</code>\n       */\n      java.lang.String getName();\n      /**\n       * <pre>\n       * Metric name - should only be included on birth\n       * </pre>\n       *\n       * <code>optional string name = 1;</code>\n       */\n      com.google.protobuf.ByteString\n          getNameBytes();\n\n      /**\n       * <pre>\n       * Metric alias - tied to name on birth and included in all later DATA messages\n       * </pre>\n       *\n       * <code>optional uint64 alias = 2;</code>\n       */\n      boolean hasAlias();\n      /**\n       * <pre>\n       * Metric alias - tied to 
name on birth and included in all later DATA messages\n       * </pre>\n       *\n       * <code>optional uint64 alias = 2;</code>\n       */\n      long getAlias();\n\n      /**\n       * <pre>\n       * Timestamp associated with data acquisition time\n       * </pre>\n       *\n       * <code>optional uint64 timestamp = 3;</code>\n       */\n      boolean hasTimestamp();\n      /**\n       * <pre>\n       * Timestamp associated with data acquisition time\n       * </pre>\n       *\n       * <code>optional uint64 timestamp = 3;</code>\n       */\n      long getTimestamp();\n\n      /**\n       * <pre>\n       * DataType of the metric/tag value\n       * </pre>\n       *\n       * <code>optional uint32 datatype = 4;</code>\n       */\n      boolean hasDatatype();\n      /**\n       * <pre>\n       * DataType of the metric/tag value\n       * </pre>\n       *\n       * <code>optional uint32 datatype = 4;</code>\n       */\n      int getDatatype();\n\n      /**\n       * <pre>\n       * If this is historical data and should not update real time tag\n       * </pre>\n       *\n       * <code>optional bool is_historical = 5;</code>\n       */\n      boolean hasIsHistorical();\n      /**\n       * <pre>\n       * If this is historical data and should not update real time tag\n       * </pre>\n       *\n       * <code>optional bool is_historical = 5;</code>\n       */\n      boolean getIsHistorical();\n\n      /**\n       * <pre>\n       * Tells consuming clients such as MQTT Engine to not store this as a tag\n       * </pre>\n       *\n       * <code>optional bool is_transient = 6;</code>\n       */\n      boolean hasIsTransient();\n      /**\n       * <pre>\n       * Tells consuming clients such as MQTT Engine to not store this as a tag\n       * </pre>\n       *\n       * <code>optional bool is_transient = 6;</code>\n       */\n      boolean getIsTransient();\n\n      /**\n       * <pre>\n       * If this is null - explicitly say so rather than using -1, false, etc 
for some datatypes.\n       * </pre>\n       *\n       * <code>optional bool is_null = 7;</code>\n       */\n      boolean hasIsNull();\n      /**\n       * <pre>\n       * If this is null - explicitly say so rather than using -1, false, etc for some datatypes.\n       * </pre>\n       *\n       * <code>optional bool is_null = 7;</code>\n       */\n      boolean getIsNull();\n\n      /**\n       * <pre>\n       * Metadata for the payload\n       * </pre>\n       *\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n       */\n      boolean hasMetadata();\n      /**\n       * <pre>\n       * Metadata for the payload\n       * </pre>\n       *\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData getMetadata();\n      /**\n       * <pre>\n       * Metadata for the payload\n       * </pre>\n       *\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaDataOrBuilder getMetadataOrBuilder();\n\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n       */\n      boolean hasProperties();\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet getProperties();\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder getPropertiesOrBuilder();\n\n      /**\n       * <code>optional uint32 int_value = 10;</code>\n       */\n      boolean hasIntValue();\n      /**\n       * <code>optional uint32 int_value = 10;</code>\n       */\n      int getIntValue();\n\n      /**\n       * <code>optional uint64 
long_value = 11;</code>\n       */\n      boolean hasLongValue();\n      /**\n       * <code>optional uint64 long_value = 11;</code>\n       */\n      long getLongValue();\n\n      /**\n       * <code>optional float float_value = 12;</code>\n       */\n      boolean hasFloatValue();\n      /**\n       * <code>optional float float_value = 12;</code>\n       */\n      float getFloatValue();\n\n      /**\n       * <code>optional double double_value = 13;</code>\n       */\n      boolean hasDoubleValue();\n      /**\n       * <code>optional double double_value = 13;</code>\n       */\n      double getDoubleValue();\n\n      /**\n       * <code>optional bool boolean_value = 14;</code>\n       */\n      boolean hasBooleanValue();\n      /**\n       * <code>optional bool boolean_value = 14;</code>\n       */\n      boolean getBooleanValue();\n\n      /**\n       * <code>optional string string_value = 15;</code>\n       */\n      boolean hasStringValue();\n      /**\n       * <code>optional string string_value = 15;</code>\n       */\n      java.lang.String getStringValue();\n      /**\n       * <code>optional string string_value = 15;</code>\n       */\n      com.google.protobuf.ByteString\n          getStringValueBytes();\n\n      /**\n       * <pre>\n       * Bytes, File\n       * </pre>\n       *\n       * <code>optional bytes bytes_value = 16;</code>\n       */\n      boolean hasBytesValue();\n      /**\n       * <pre>\n       * Bytes, File\n       * </pre>\n       *\n       * <code>optional bytes bytes_value = 16;</code>\n       */\n      com.google.protobuf.ByteString getBytesValue();\n\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n       */\n      boolean hasDatasetValue();\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet getDatasetValue();\n      /**\n       * 
<code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSetOrBuilder getDatasetValueOrBuilder();\n\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n       */\n      boolean hasTemplateValue();\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template getTemplateValue();\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.TemplateOrBuilder getTemplateValueOrBuilder();\n\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n       */\n      boolean hasExtensionValue();\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension getExtensionValue();\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n       */\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtensionOrBuilder getExtensionValueOrBuilder();\n\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.ValueCase getValueCase();\n    }\n    /**\n     * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.Metric}\n     */\n    public  static final class Metric extends\n        com.google.protobuf.GeneratedMessageV3 implements\n        // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.Metric)\n        MetricOrBuilder {\n      // Use Metric.newBuilder() to construct.\n      private 
Metric(com.google.protobuf.GeneratedMessageV3.Builder<?> builder) {\n        super(builder);\n      }\n      private Metric() {\n        name_ = \"\";\n        alias_ = 0L;\n        timestamp_ = 0L;\n        datatype_ = 0;\n        isHistorical_ = false;\n        isTransient_ = false;\n        isNull_ = false;\n      }\n\n      @java.lang.Override\n      public final com.google.protobuf.UnknownFieldSet\n      getUnknownFields() {\n        return this.unknownFields;\n      }\n      private Metric(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        this();\n        int mutable_bitField0_ = 0;\n        com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n            com.google.protobuf.UnknownFieldSet.newBuilder();\n        try {\n          boolean done = false;\n          while (!done) {\n            int tag = input.readTag();\n            switch (tag) {\n              case 0:\n                done = true;\n                break;\n              default: {\n                if (!parseUnknownField(input, unknownFields,\n                                       extensionRegistry, tag)) {\n                  done = true;\n                }\n                break;\n              }\n              case 10: {\n                com.google.protobuf.ByteString bs = input.readBytes();\n                bitField0_ |= 0x00000001;\n                name_ = bs;\n                break;\n              }\n              case 16: {\n                bitField0_ |= 0x00000002;\n                alias_ = input.readUInt64();\n                break;\n              }\n              case 24: {\n                bitField0_ |= 0x00000004;\n                timestamp_ = input.readUInt64();\n                break;\n              }\n              case 32: {\n                bitField0_ |= 0x00000008;\n                datatype_ = input.readUInt32();\n 
               break;\n              }\n              case 40: {\n                bitField0_ |= 0x00000010;\n                isHistorical_ = input.readBool();\n                break;\n              }\n              case 48: {\n                bitField0_ |= 0x00000020;\n                isTransient_ = input.readBool();\n                break;\n              }\n              case 56: {\n                bitField0_ |= 0x00000040;\n                isNull_ = input.readBool();\n                break;\n              }\n              case 66: {\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.Builder subBuilder = null;\n                if (((bitField0_ & 0x00000080) == 0x00000080)) {\n                  subBuilder = metadata_.toBuilder();\n                }\n                metadata_ = input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.PARSER, extensionRegistry);\n                if (subBuilder != null) {\n                  subBuilder.mergeFrom(metadata_);\n                  metadata_ = subBuilder.buildPartial();\n                }\n                bitField0_ |= 0x00000080;\n                break;\n              }\n              case 74: {\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder subBuilder = null;\n                if (((bitField0_ & 0x00000100) == 0x00000100)) {\n                  subBuilder = properties_.toBuilder();\n                }\n                properties_ = input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.PARSER, extensionRegistry);\n                if (subBuilder != null) {\n                  subBuilder.mergeFrom(properties_);\n                  properties_ = subBuilder.buildPartial();\n                }\n                bitField0_ |= 0x00000100;\n                break;\n              }\n              case 80: {\n                valueCase_ = 10;\n                value_ = input.readUInt32();\n                break;\n              }\n     
         case 88: {\n                valueCase_ = 11;\n                value_ = input.readUInt64();\n                break;\n              }\n              case 101: {\n                valueCase_ = 12;\n                value_ = input.readFloat();\n                break;\n              }\n              case 105: {\n                valueCase_ = 13;\n                value_ = input.readDouble();\n                break;\n              }\n              case 112: {\n                valueCase_ = 14;\n                value_ = input.readBool();\n                break;\n              }\n              case 122: {\n                com.google.protobuf.ByteString bs = input.readBytes();\n                valueCase_ = 15;\n                value_ = bs;\n                break;\n              }\n              case 130: {\n                valueCase_ = 16;\n                value_ = input.readBytes();\n                break;\n              }\n              case 138: {\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Builder subBuilder = null;\n                if (valueCase_ == 17) {\n                  subBuilder = ((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) value_).toBuilder();\n                }\n                value_ =\n                    input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.PARSER, extensionRegistry);\n                if (subBuilder != null) {\n                  subBuilder.mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) value_);\n                  value_ = subBuilder.buildPartial();\n                }\n                valueCase_ = 17;\n                break;\n              }\n              case 146: {\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Builder subBuilder = null;\n                if (valueCase_ == 18) {\n                  subBuilder = ((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) value_).toBuilder();\n                }\n      
          value_ =\n                    input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.PARSER, extensionRegistry);\n                if (subBuilder != null) {\n                  subBuilder.mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) value_);\n                  value_ = subBuilder.buildPartial();\n                }\n                valueCase_ = 18;\n                break;\n              }\n              case 154: {\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.Builder subBuilder = null;\n                if (valueCase_ == 19) {\n                  subBuilder = ((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) value_).toBuilder();\n                }\n                value_ =\n                    input.readMessage(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.PARSER, extensionRegistry);\n                if (subBuilder != null) {\n                  subBuilder.mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) value_);\n                  value_ = subBuilder.buildPartial();\n                }\n                valueCase_ = 19;\n                break;\n              }\n            }\n          }\n        } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n          throw e.setUnfinishedMessage(this);\n        } catch (java.io.IOException e) {\n          throw new com.google.protobuf.InvalidProtocolBufferException(\n              e).setUnfinishedMessage(this);\n        } finally {\n          this.unknownFields = unknownFields.build();\n          makeExtensionsImmutable();\n        }\n      }\n      public static final com.google.protobuf.Descriptors.Descriptor\n          getDescriptor() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Metric_descriptor;\n      }\n\n      protected 
com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n          internalGetFieldAccessorTable() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Metric_fieldAccessorTable\n            .ensureFieldAccessorsInitialized(\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder.class);\n      }\n\n      public interface MetricValueExtensionOrBuilder extends\n          // @@protoc_insertion_point(interface_extends:org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension)\n          com.google.protobuf.GeneratedMessageV3.\n              ExtendableMessageOrBuilder<MetricValueExtension> {\n      }\n      /**\n       * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension}\n       */\n      public  static final class MetricValueExtension extends\n          com.google.protobuf.GeneratedMessageV3.ExtendableMessage<\n            MetricValueExtension> implements\n          // @@protoc_insertion_point(message_implements:org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension)\n          MetricValueExtensionOrBuilder {\n        // Use MetricValueExtension.newBuilder() to construct.\n        private MetricValueExtension(com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension, ?> builder) {\n          super(builder);\n        }\n        private MetricValueExtension() {\n        }\n\n        @java.lang.Override\n        public final com.google.protobuf.UnknownFieldSet\n        getUnknownFields() {\n          return this.unknownFields;\n        }\n        private MetricValueExtension(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          this();\n          
com.google.protobuf.UnknownFieldSet.Builder unknownFields =\n              com.google.protobuf.UnknownFieldSet.newBuilder();\n          try {\n            boolean done = false;\n            while (!done) {\n              int tag = input.readTag();\n              switch (tag) {\n                case 0:\n                  done = true;\n                  break;\n                default: {\n                  if (!parseUnknownField(input, unknownFields,\n                                         extensionRegistry, tag)) {\n                    done = true;\n                  }\n                  break;\n                }\n              }\n            }\n          } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n            throw e.setUnfinishedMessage(this);\n          } catch (java.io.IOException e) {\n            throw new com.google.protobuf.InvalidProtocolBufferException(\n                e).setUnfinishedMessage(this);\n          } finally {\n            this.unknownFields = unknownFields.build();\n            makeExtensionsImmutable();\n          }\n        }\n        public static final com.google.protobuf.Descriptors.Descriptor\n            getDescriptor() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_descriptor;\n        }\n\n        protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n            internalGetFieldAccessorTable() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_fieldAccessorTable\n              .ensureFieldAccessorsInitialized(\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.Builder.class);\n        }\n\n        private byte memoizedIsInitialized = -1;\n        public final boolean isInitialized() {\n   
       byte isInitialized = memoizedIsInitialized;\n          if (isInitialized == 1) return true;\n          if (isInitialized == 0) return false;\n\n          if (!extensionsAreInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n          memoizedIsInitialized = 1;\n          return true;\n        }\n\n        public void writeTo(com.google.protobuf.CodedOutputStream output)\n                            throws java.io.IOException {\n          com.google.protobuf.GeneratedMessageV3\n            .ExtendableMessage<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension>.ExtensionWriter\n              extensionWriter = newExtensionWriter();\n          extensionWriter.writeUntil(536870912, output);\n          unknownFields.writeTo(output);\n        }\n\n        public int getSerializedSize() {\n          int size = memoizedSize;\n          if (size != -1) return size;\n\n          size = 0;\n          size += extensionsSerializedSize();\n          size += unknownFields.getSerializedSize();\n          memoizedSize = size;\n          return size;\n        }\n\n        private static final long serialVersionUID = 0L;\n        @java.lang.Override\n        public boolean equals(final java.lang.Object obj) {\n          if (obj == this) {\n           return true;\n          }\n          if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension)) {\n            return super.equals(obj);\n          }\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) obj;\n\n          boolean result = true;\n          result = result && unknownFields.equals(other.unknownFields);\n          result = result &&\n              getExtensionFields().equals(other.getExtensionFields());\n          return result;\n        }\n\n        @java.lang.Override\n        public 
int hashCode() {\n          if (memoizedHashCode != 0) {\n            return memoizedHashCode;\n          }\n          int hash = 41;\n          hash = (19 * hash) + getDescriptorForType().hashCode();\n          hash = hashFields(hash, getExtensionFields());\n          hash = (29 * hash) + unknownFields.hashCode();\n          memoizedHashCode = hash;\n          return hash;\n        }\n\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension parseFrom(\n            com.google.protobuf.ByteString data)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension parseFrom(\n            com.google.protobuf.ByteString data,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension parseFrom(byte[] data)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension parseFrom(\n            byte[] data,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n          return PARSER.parseFrom(data, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension parseFrom(java.io.InputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input);\n        }\n        public static 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension parseFrom(\n            java.io.InputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension parseDelimitedFrom(java.io.InputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseDelimitedWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension parseDelimitedFrom(\n            java.io.InputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension parseFrom(\n            com.google.protobuf.CodedInputStream input)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input);\n        }\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension parseFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          return com.google.protobuf.GeneratedMessageV3\n              .parseWithIOException(PARSER, input, extensionRegistry);\n        }\n\n        public Builder newBuilderForType() { return newBuilder(); }\n        public static Builder 
newBuilder() {\n          return DEFAULT_INSTANCE.toBuilder();\n        }\n        public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension prototype) {\n          return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n        }\n        public Builder toBuilder() {\n          return this == DEFAULT_INSTANCE\n              ? new Builder() : new Builder().mergeFrom(this);\n        }\n\n        @java.lang.Override\n        protected Builder newBuilderForType(\n            com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n          Builder builder = new Builder(parent);\n          return builder;\n        }\n        /**\n         * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension}\n         */\n        public static final class Builder extends\n            com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension, Builder> implements\n            // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension)\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtensionOrBuilder {\n          public static final com.google.protobuf.Descriptors.Descriptor\n              getDescriptor() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_descriptor;\n          }\n\n          protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n              internalGetFieldAccessorTable() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_fieldAccessorTable\n                .ensureFieldAccessorsInitialized(\n                    org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.class, 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.Builder.class);\n          }\n\n          // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.newBuilder()\n          private Builder() {\n            maybeForceBuilderInitialization();\n          }\n\n          private Builder(\n              com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n            super(parent);\n            maybeForceBuilderInitialization();\n          }\n          private void maybeForceBuilderInitialization() {\n            if (com.google.protobuf.GeneratedMessageV3\n                    .alwaysUseFieldBuilders) {\n            }\n          }\n          public Builder clear() {\n            super.clear();\n            return this;\n          }\n\n          public com.google.protobuf.Descriptors.Descriptor\n              getDescriptorForType() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_descriptor;\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension getDefaultInstanceForType() {\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.getDefaultInstance();\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension build() {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension result = buildPartial();\n            if (!result.isInitialized()) {\n              throw newUninitializedMessageException(result);\n            }\n            return result;\n          }\n\n          public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension buildPartial() {\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension result = new 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension(this);\n            onBuilt();\n            return result;\n          }\n\n          public Builder clone() {\n            return (Builder) super.clone();\n          }\n          public Builder setField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              Object value) {\n            return (Builder) super.setField(field, value);\n          }\n          public Builder clearField(\n              com.google.protobuf.Descriptors.FieldDescriptor field) {\n            return (Builder) super.clearField(field);\n          }\n          public Builder clearOneof(\n              com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n            return (Builder) super.clearOneof(oneof);\n          }\n          public Builder setRepeatedField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              int index, Object value) {\n            return (Builder) super.setRepeatedField(field, index, value);\n          }\n          public Builder addRepeatedField(\n              com.google.protobuf.Descriptors.FieldDescriptor field,\n              Object value) {\n            return (Builder) super.addRepeatedField(field, value);\n          }\n          public <Type> Builder setExtension(\n              com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension, Type> extension,\n              Type value) {\n            return (Builder) super.setExtension(extension, value);\n          }\n          public <Type> Builder setExtension(\n              com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension, java.util.List<Type>> extension,\n              int index, Type value) {\n            return (Builder) super.setExtension(extension, index, value);\n        
  }\n          public <Type> Builder addExtension(\n              com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension, java.util.List<Type>> extension,\n              Type value) {\n            return (Builder) super.addExtension(extension, value);\n          }\n          public <Type> Builder clearExtension(\n              com.google.protobuf.GeneratedMessage.GeneratedExtension<\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension, ?> extension) {\n            return (Builder) super.clearExtension(extension);\n          }\n          public Builder mergeFrom(com.google.protobuf.Message other) {\n            if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) {\n              return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension)other);\n            } else {\n              super.mergeFrom(other);\n              return this;\n            }\n          }\n\n          public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension other) {\n            if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.getDefaultInstance()) return this;\n            this.mergeExtensionFields(other);\n            this.mergeUnknownFields(other.unknownFields);\n            onChanged();\n            return this;\n          }\n\n          public final boolean isInitialized() {\n            if (!extensionsAreInitialized()) {\n              return false;\n            }\n            return true;\n          }\n\n          public Builder mergeFrom(\n              com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws java.io.IOException {\n            
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension parsedMessage = null;\n            try {\n              parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n            } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n              parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) e.getUnfinishedMessage();\n              throw e.unwrapIOException();\n            } finally {\n              if (parsedMessage != null) {\n                mergeFrom(parsedMessage);\n              }\n            }\n            return this;\n          }\n          public final Builder setUnknownFields(\n              final com.google.protobuf.UnknownFieldSet unknownFields) {\n            return super.setUnknownFields(unknownFields);\n          }\n\n          public final Builder mergeUnknownFields(\n              final com.google.protobuf.UnknownFieldSet unknownFields) {\n            return super.mergeUnknownFields(unknownFields);\n          }\n\n\n          // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension)\n        }\n\n        // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension)\n        private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension DEFAULT_INSTANCE;\n        static {\n          DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension();\n        }\n\n        public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension getDefaultInstance() {\n          return DEFAULT_INSTANCE;\n        }\n\n        @java.lang.Deprecated public static final com.google.protobuf.Parser<MetricValueExtension>\n            PARSER = new com.google.protobuf.AbstractParser<MetricValueExtension>() {\n          public MetricValueExtension parsePartialFrom(\n              
com.google.protobuf.CodedInputStream input,\n              com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n              throws com.google.protobuf.InvalidProtocolBufferException {\n              return new MetricValueExtension(input, extensionRegistry);\n          }\n        };\n\n        public static com.google.protobuf.Parser<MetricValueExtension> parser() {\n          return PARSER;\n        }\n\n        @java.lang.Override\n        public com.google.protobuf.Parser<MetricValueExtension> getParserForType() {\n          return PARSER;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension getDefaultInstanceForType() {\n          return DEFAULT_INSTANCE;\n        }\n\n      }\n\n      private int bitField0_;\n      private int valueCase_ = 0;\n      private java.lang.Object value_;\n      public enum ValueCase\n          implements com.google.protobuf.Internal.EnumLite {\n        INT_VALUE(10),\n        LONG_VALUE(11),\n        FLOAT_VALUE(12),\n        DOUBLE_VALUE(13),\n        BOOLEAN_VALUE(14),\n        STRING_VALUE(15),\n        BYTES_VALUE(16),\n        DATASET_VALUE(17),\n        TEMPLATE_VALUE(18),\n        EXTENSION_VALUE(19),\n        VALUE_NOT_SET(0);\n        private final int value;\n        private ValueCase(int value) {\n          this.value = value;\n        }\n        /**\n         * @deprecated Use {@link #forNumber(int)} instead.\n         */\n        @java.lang.Deprecated\n        public static ValueCase valueOf(int value) {\n          return forNumber(value);\n        }\n\n        public static ValueCase forNumber(int value) {\n          switch (value) {\n            case 10: return INT_VALUE;\n            case 11: return LONG_VALUE;\n            case 12: return FLOAT_VALUE;\n            case 13: return DOUBLE_VALUE;\n            case 14: return BOOLEAN_VALUE;\n            case 15: return STRING_VALUE;\n            case 16: return BYTES_VALUE;\n            case 17: return 
DATASET_VALUE;\n            case 18: return TEMPLATE_VALUE;\n            case 19: return EXTENSION_VALUE;\n            case 0: return VALUE_NOT_SET;\n            default: return null;\n          }\n        }\n        public int getNumber() {\n          return this.value;\n        }\n      };\n\n      public ValueCase\n      getValueCase() {\n        return ValueCase.forNumber(\n            valueCase_);\n      }\n\n      public static final int NAME_FIELD_NUMBER = 1;\n      private volatile java.lang.Object name_;\n      /**\n       * <pre>\n       * Metric name - should only be included on birth\n       * </pre>\n       *\n       * <code>optional string name = 1;</code>\n       */\n      public boolean hasName() {\n        return ((bitField0_ & 0x00000001) == 0x00000001);\n      }\n      /**\n       * <pre>\n       * Metric name - should only be included on birth\n       * </pre>\n       *\n       * <code>optional string name = 1;</code>\n       */\n      public java.lang.String getName() {\n        java.lang.Object ref = name_;\n        if (ref instanceof java.lang.String) {\n          return (java.lang.String) ref;\n        } else {\n          com.google.protobuf.ByteString bs = \n              (com.google.protobuf.ByteString) ref;\n          java.lang.String s = bs.toStringUtf8();\n          if (bs.isValidUtf8()) {\n            name_ = s;\n          }\n          return s;\n        }\n      }\n      /**\n       * <pre>\n       * Metric name - should only be included on birth\n       * </pre>\n       *\n       * <code>optional string name = 1;</code>\n       */\n      public com.google.protobuf.ByteString\n          getNameBytes() {\n        java.lang.Object ref = name_;\n        if (ref instanceof java.lang.String) {\n          com.google.protobuf.ByteString b = \n              com.google.protobuf.ByteString.copyFromUtf8(\n                  (java.lang.String) ref);\n          name_ = b;\n          return b;\n        } else {\n          return 
(com.google.protobuf.ByteString) ref;\n        }\n      }\n\n      public static final int ALIAS_FIELD_NUMBER = 2;\n      private long alias_;\n      /**\n       * <pre>\n       * Metric alias - tied to name on birth and included in all later DATA messages\n       * </pre>\n       *\n       * <code>optional uint64 alias = 2;</code>\n       */\n      public boolean hasAlias() {\n        return ((bitField0_ & 0x00000002) == 0x00000002);\n      }\n      /**\n       * <pre>\n       * Metric alias - tied to name on birth and included in all later DATA messages\n       * </pre>\n       *\n       * <code>optional uint64 alias = 2;</code>\n       */\n      public long getAlias() {\n        return alias_;\n      }\n\n      public static final int TIMESTAMP_FIELD_NUMBER = 3;\n      private long timestamp_;\n      /**\n       * <pre>\n       * Timestamp associated with data acquisition time\n       * </pre>\n       *\n       * <code>optional uint64 timestamp = 3;</code>\n       */\n      public boolean hasTimestamp() {\n        return ((bitField0_ & 0x00000004) == 0x00000004);\n      }\n      /**\n       * <pre>\n       * Timestamp associated with data acquisition time\n       * </pre>\n       *\n       * <code>optional uint64 timestamp = 3;</code>\n       */\n      public long getTimestamp() {\n        return timestamp_;\n      }\n\n      public static final int DATATYPE_FIELD_NUMBER = 4;\n      private int datatype_;\n      /**\n       * <pre>\n       * DataType of the metric/tag value\n       * </pre>\n       *\n       * <code>optional uint32 datatype = 4;</code>\n       */\n      public boolean hasDatatype() {\n        return ((bitField0_ & 0x00000008) == 0x00000008);\n      }\n      /**\n       * <pre>\n       * DataType of the metric/tag value\n       * </pre>\n       *\n       * <code>optional uint32 datatype = 4;</code>\n       */\n      public int getDatatype() {\n        return datatype_;\n      }\n\n      public static final int IS_HISTORICAL_FIELD_NUMBER = 5;\n    
  private boolean isHistorical_;\n      /**\n       * <pre>\n       * If this is historical data and should not update real time tag\n       * </pre>\n       *\n       * <code>optional bool is_historical = 5;</code>\n       */\n      public boolean hasIsHistorical() {\n        return ((bitField0_ & 0x00000010) == 0x00000010);\n      }\n      /**\n       * <pre>\n       * If this is historical data and should not update real time tag\n       * </pre>\n       *\n       * <code>optional bool is_historical = 5;</code>\n       */\n      public boolean getIsHistorical() {\n        return isHistorical_;\n      }\n\n      public static final int IS_TRANSIENT_FIELD_NUMBER = 6;\n      private boolean isTransient_;\n      /**\n       * <pre>\n       * Tells consuming clients such as MQTT Engine to not store this as a tag\n       * </pre>\n       *\n       * <code>optional bool is_transient = 6;</code>\n       */\n      public boolean hasIsTransient() {\n        return ((bitField0_ & 0x00000020) == 0x00000020);\n      }\n      /**\n       * <pre>\n       * Tells consuming clients such as MQTT Engine to not store this as a tag\n       * </pre>\n       *\n       * <code>optional bool is_transient = 6;</code>\n       */\n      public boolean getIsTransient() {\n        return isTransient_;\n      }\n\n      public static final int IS_NULL_FIELD_NUMBER = 7;\n      private boolean isNull_;\n      /**\n       * <pre>\n       * If this is null - explicitly say so rather than using -1, false, etc for some datatypes.\n       * </pre>\n       *\n       * <code>optional bool is_null = 7;</code>\n       */\n      public boolean hasIsNull() {\n        return ((bitField0_ & 0x00000040) == 0x00000040);\n      }\n      /**\n       * <pre>\n       * If this is null - explicitly say so rather than using -1, false, etc for some datatypes.\n       * </pre>\n       *\n       * <code>optional bool is_null = 7;</code>\n       */\n      public boolean getIsNull() {\n        return isNull_;\n      
}\n\n      public static final int METADATA_FIELD_NUMBER = 8;\n      private org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData metadata_;\n      /**\n       * <pre>\n       * Metadata for the payload\n       * </pre>\n       *\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n       */\n      public boolean hasMetadata() {\n        return ((bitField0_ & 0x00000080) == 0x00000080);\n      }\n      /**\n       * <pre>\n       * Metadata for the payload\n       * </pre>\n       *\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData getMetadata() {\n        return metadata_ == null ? org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.getDefaultInstance() : metadata_;\n      }\n      /**\n       * <pre>\n       * Metadata for the payload\n       * </pre>\n       *\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaDataOrBuilder getMetadataOrBuilder() {\n        return metadata_ == null ? org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.getDefaultInstance() : metadata_;\n      }\n\n      public static final int PROPERTIES_FIELD_NUMBER = 9;\n      private org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet properties_;\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n       */\n      public boolean hasProperties() {\n        return ((bitField0_ & 0x00000100) == 0x00000100);\n      }\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet getProperties() {\n        return properties_ == null ? 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance() : properties_;\n      }\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder getPropertiesOrBuilder() {\n        return properties_ == null ? org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance() : properties_;\n      }\n\n      public static final int INT_VALUE_FIELD_NUMBER = 10;\n      /**\n       * <code>optional uint32 int_value = 10;</code>\n       */\n      public boolean hasIntValue() {\n        return valueCase_ == 10;\n      }\n      /**\n       * <code>optional uint32 int_value = 10;</code>\n       */\n      public int getIntValue() {\n        if (valueCase_ == 10) {\n          return (java.lang.Integer) value_;\n        }\n        return 0;\n      }\n\n      public static final int LONG_VALUE_FIELD_NUMBER = 11;\n      /**\n       * <code>optional uint64 long_value = 11;</code>\n       */\n      public boolean hasLongValue() {\n        return valueCase_ == 11;\n      }\n      /**\n       * <code>optional uint64 long_value = 11;</code>\n       */\n      public long getLongValue() {\n        if (valueCase_ == 11) {\n          return (java.lang.Long) value_;\n        }\n        return 0L;\n      }\n\n      public static final int FLOAT_VALUE_FIELD_NUMBER = 12;\n      /**\n       * <code>optional float float_value = 12;</code>\n       */\n      public boolean hasFloatValue() {\n        return valueCase_ == 12;\n      }\n      /**\n       * <code>optional float float_value = 12;</code>\n       */\n      public float getFloatValue() {\n        if (valueCase_ == 12) {\n          return (java.lang.Float) value_;\n        }\n        return 0F;\n      }\n\n      public static final int DOUBLE_VALUE_FIELD_NUMBER = 13;\n      /**\n       * <code>optional double double_value = 13;</code>\n       */\n      public 
boolean hasDoubleValue() {\n        return valueCase_ == 13;\n      }\n      /**\n       * <code>optional double double_value = 13;</code>\n       */\n      public double getDoubleValue() {\n        if (valueCase_ == 13) {\n          return (java.lang.Double) value_;\n        }\n        return 0D;\n      }\n\n      public static final int BOOLEAN_VALUE_FIELD_NUMBER = 14;\n      /**\n       * <code>optional bool boolean_value = 14;</code>\n       */\n      public boolean hasBooleanValue() {\n        return valueCase_ == 14;\n      }\n      /**\n       * <code>optional bool boolean_value = 14;</code>\n       */\n      public boolean getBooleanValue() {\n        if (valueCase_ == 14) {\n          return (java.lang.Boolean) value_;\n        }\n        return false;\n      }\n\n      public static final int STRING_VALUE_FIELD_NUMBER = 15;\n      /**\n       * <code>optional string string_value = 15;</code>\n       */\n      public boolean hasStringValue() {\n        return valueCase_ == 15;\n      }\n      /**\n       * <code>optional string string_value = 15;</code>\n       */\n      public java.lang.String getStringValue() {\n        java.lang.Object ref = \"\";\n        if (valueCase_ == 15) {\n          ref = value_;\n        }\n        if (ref instanceof java.lang.String) {\n          return (java.lang.String) ref;\n        } else {\n          com.google.protobuf.ByteString bs = \n              (com.google.protobuf.ByteString) ref;\n          java.lang.String s = bs.toStringUtf8();\n          if (bs.isValidUtf8() && (valueCase_ == 15)) {\n            value_ = s;\n          }\n          return s;\n        }\n      }\n      /**\n       * <code>optional string string_value = 15;</code>\n       */\n      public com.google.protobuf.ByteString\n          getStringValueBytes() {\n        java.lang.Object ref = \"\";\n        if (valueCase_ == 15) {\n          ref = value_;\n        }\n        if (ref instanceof java.lang.String) {\n          com.google.protobuf.ByteString 
b = \n              com.google.protobuf.ByteString.copyFromUtf8(\n                  (java.lang.String) ref);\n          if (valueCase_ == 15) {\n            value_ = b;\n          }\n          return b;\n        } else {\n          return (com.google.protobuf.ByteString) ref;\n        }\n      }\n\n      public static final int BYTES_VALUE_FIELD_NUMBER = 16;\n      /**\n       * <pre>\n       * Bytes, File\n       * </pre>\n       *\n       * <code>optional bytes bytes_value = 16;</code>\n       */\n      public boolean hasBytesValue() {\n        return valueCase_ == 16;\n      }\n      /**\n       * <pre>\n       * Bytes, File\n       * </pre>\n       *\n       * <code>optional bytes bytes_value = 16;</code>\n       */\n      public com.google.protobuf.ByteString getBytesValue() {\n        if (valueCase_ == 16) {\n          return (com.google.protobuf.ByteString) value_;\n        }\n        return com.google.protobuf.ByteString.EMPTY;\n      }\n\n      public static final int DATASET_VALUE_FIELD_NUMBER = 17;\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n       */\n      public boolean hasDatasetValue() {\n        return valueCase_ == 17;\n      }\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet getDatasetValue() {\n        if (valueCase_ == 17) {\n           return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) value_;\n        }\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.getDefaultInstance();\n      }\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSetOrBuilder getDatasetValueOrBuilder() {\n        if (valueCase_ == 17) {\n           return 
(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) value_;\n        }\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.getDefaultInstance();\n      }\n\n      public static final int TEMPLATE_VALUE_FIELD_NUMBER = 18;\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n       */\n      public boolean hasTemplateValue() {\n        return valueCase_ == 18;\n      }\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template getTemplateValue() {\n        if (valueCase_ == 18) {\n           return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) value_;\n        }\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.getDefaultInstance();\n      }\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.TemplateOrBuilder getTemplateValueOrBuilder() {\n        if (valueCase_ == 18) {\n           return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) value_;\n        }\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.getDefaultInstance();\n      }\n\n      public static final int EXTENSION_VALUE_FIELD_NUMBER = 19;\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n       */\n      public boolean hasExtensionValue() {\n        return valueCase_ == 19;\n      }\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension getExtensionValue() {\n        if (valueCase_ == 19) {\n           return 
(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) value_;\n        }\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.getDefaultInstance();\n      }\n      /**\n       * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtensionOrBuilder getExtensionValueOrBuilder() {\n        if (valueCase_ == 19) {\n           return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) value_;\n        }\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.getDefaultInstance();\n      }\n\n      private byte memoizedIsInitialized = -1;\n      public final boolean isInitialized() {\n        byte isInitialized = memoizedIsInitialized;\n        if (isInitialized == 1) return true;\n        if (isInitialized == 0) return false;\n\n        if (hasMetadata()) {\n          if (!getMetadata().isInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        if (hasProperties()) {\n          if (!getProperties().isInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        if (hasDatasetValue()) {\n          if (!getDatasetValue().isInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        if (hasTemplateValue()) {\n          if (!getTemplateValue().isInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        if (hasExtensionValue()) {\n          if (!getExtensionValue().isInitialized()) {\n            memoizedIsInitialized = 0;\n            return false;\n          }\n        }\n        memoizedIsInitialized = 1;\n        return true;\n      }\n\n      public void 
writeTo(com.google.protobuf.CodedOutputStream output)\n                          throws java.io.IOException {\n        if (((bitField0_ & 0x00000001) == 0x00000001)) {\n          com.google.protobuf.GeneratedMessageV3.writeString(output, 1, name_);\n        }\n        if (((bitField0_ & 0x00000002) == 0x00000002)) {\n          output.writeUInt64(2, alias_);\n        }\n        if (((bitField0_ & 0x00000004) == 0x00000004)) {\n          output.writeUInt64(3, timestamp_);\n        }\n        if (((bitField0_ & 0x00000008) == 0x00000008)) {\n          output.writeUInt32(4, datatype_);\n        }\n        if (((bitField0_ & 0x00000010) == 0x00000010)) {\n          output.writeBool(5, isHistorical_);\n        }\n        if (((bitField0_ & 0x00000020) == 0x00000020)) {\n          output.writeBool(6, isTransient_);\n        }\n        if (((bitField0_ & 0x00000040) == 0x00000040)) {\n          output.writeBool(7, isNull_);\n        }\n        if (((bitField0_ & 0x00000080) == 0x00000080)) {\n          output.writeMessage(8, getMetadata());\n        }\n        if (((bitField0_ & 0x00000100) == 0x00000100)) {\n          output.writeMessage(9, getProperties());\n        }\n        if (valueCase_ == 10) {\n          output.writeUInt32(\n              10, (int)((java.lang.Integer) value_));\n        }\n        if (valueCase_ == 11) {\n          output.writeUInt64(\n              11, (long)((java.lang.Long) value_));\n        }\n        if (valueCase_ == 12) {\n          output.writeFloat(\n              12, (float)((java.lang.Float) value_));\n        }\n        if (valueCase_ == 13) {\n          output.writeDouble(\n              13, (double)((java.lang.Double) value_));\n        }\n        if (valueCase_ == 14) {\n          output.writeBool(\n              14, (boolean)((java.lang.Boolean) value_));\n        }\n        if (valueCase_ == 15) {\n          com.google.protobuf.GeneratedMessageV3.writeString(output, 15, value_);\n        }\n        if (valueCase_ == 16) {\n       
   output.writeBytes(\n              16, (com.google.protobuf.ByteString)((com.google.protobuf.ByteString) value_));\n        }\n        if (valueCase_ == 17) {\n          output.writeMessage(17, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) value_);\n        }\n        if (valueCase_ == 18) {\n          output.writeMessage(18, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) value_);\n        }\n        if (valueCase_ == 19) {\n          output.writeMessage(19, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) value_);\n        }\n        unknownFields.writeTo(output);\n      }\n\n      public int getSerializedSize() {\n        int size = memoizedSize;\n        if (size != -1) return size;\n\n        size = 0;\n        if (((bitField0_ & 0x00000001) == 0x00000001)) {\n          size += com.google.protobuf.GeneratedMessageV3.computeStringSize(1, name_);\n        }\n        if (((bitField0_ & 0x00000002) == 0x00000002)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeUInt64Size(2, alias_);\n        }\n        if (((bitField0_ & 0x00000004) == 0x00000004)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeUInt64Size(3, timestamp_);\n        }\n        if (((bitField0_ & 0x00000008) == 0x00000008)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeUInt32Size(4, datatype_);\n        }\n        if (((bitField0_ & 0x00000010) == 0x00000010)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeBoolSize(5, isHistorical_);\n        }\n        if (((bitField0_ & 0x00000020) == 0x00000020)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeBoolSize(6, isTransient_);\n        }\n        if (((bitField0_ & 0x00000040) == 0x00000040)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeBoolSize(7, isNull_);\n        }\n        if (((bitField0_ & 
0x00000080) == 0x00000080)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(8, getMetadata());\n        }\n        if (((bitField0_ & 0x00000100) == 0x00000100)) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(9, getProperties());\n        }\n        if (valueCase_ == 10) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeUInt32Size(\n                10, (int)((java.lang.Integer) value_));\n        }\n        if (valueCase_ == 11) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeUInt64Size(\n                11, (long)((java.lang.Long) value_));\n        }\n        if (valueCase_ == 12) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeFloatSize(\n                12, (float)((java.lang.Float) value_));\n        }\n        if (valueCase_ == 13) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeDoubleSize(\n                13, (double)((java.lang.Double) value_));\n        }\n        if (valueCase_ == 14) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeBoolSize(\n                14, (boolean)((java.lang.Boolean) value_));\n        }\n        if (valueCase_ == 15) {\n          size += com.google.protobuf.GeneratedMessageV3.computeStringSize(15, value_);\n        }\n        if (valueCase_ == 16) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeBytesSize(\n                16, (com.google.protobuf.ByteString)((com.google.protobuf.ByteString) value_));\n        }\n        if (valueCase_ == 17) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(17, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) value_);\n        }\n        if (valueCase_ == 18) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(18, 
(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) value_);\n        }\n        if (valueCase_ == 19) {\n          size += com.google.protobuf.CodedOutputStream\n            .computeMessageSize(19, (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) value_);\n        }\n        size += unknownFields.getSerializedSize();\n        memoizedSize = size;\n        return size;\n      }\n\n      private static final long serialVersionUID = 0L;\n      @java.lang.Override\n      public boolean equals(final java.lang.Object obj) {\n        if (obj == this) {\n         return true;\n        }\n        if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric)) {\n          return super.equals(obj);\n        }\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric) obj;\n\n        boolean result = true;\n        result = result && (hasName() == other.hasName());\n        if (hasName()) {\n          result = result && getName()\n              .equals(other.getName());\n        }\n        result = result && (hasAlias() == other.hasAlias());\n        if (hasAlias()) {\n          result = result && (getAlias()\n              == other.getAlias());\n        }\n        result = result && (hasTimestamp() == other.hasTimestamp());\n        if (hasTimestamp()) {\n          result = result && (getTimestamp()\n              == other.getTimestamp());\n        }\n        result = result && (hasDatatype() == other.hasDatatype());\n        if (hasDatatype()) {\n          result = result && (getDatatype()\n              == other.getDatatype());\n        }\n        result = result && (hasIsHistorical() == other.hasIsHistorical());\n        if (hasIsHistorical()) {\n          result = result && (getIsHistorical()\n              == other.getIsHistorical());\n        }\n        result = result && (hasIsTransient() == other.hasIsTransient());\n        if 
(hasIsTransient()) {\n          result = result && (getIsTransient()\n              == other.getIsTransient());\n        }\n        result = result && (hasIsNull() == other.hasIsNull());\n        if (hasIsNull()) {\n          result = result && (getIsNull()\n              == other.getIsNull());\n        }\n        result = result && (hasMetadata() == other.hasMetadata());\n        if (hasMetadata()) {\n          result = result && getMetadata()\n              .equals(other.getMetadata());\n        }\n        result = result && (hasProperties() == other.hasProperties());\n        if (hasProperties()) {\n          result = result && getProperties()\n              .equals(other.getProperties());\n        }\n        result = result && getValueCase().equals(\n            other.getValueCase());\n        if (!result) return false;\n        switch (valueCase_) {\n          case 10:\n            result = result && (getIntValue()\n                == other.getIntValue());\n            break;\n          case 11:\n            result = result && (getLongValue()\n                == other.getLongValue());\n            break;\n          case 12:\n            result = result && (\n                java.lang.Float.floatToIntBits(getFloatValue())\n                == java.lang.Float.floatToIntBits(\n                    other.getFloatValue()));\n            break;\n          case 13:\n            result = result && (\n                java.lang.Double.doubleToLongBits(getDoubleValue())\n                == java.lang.Double.doubleToLongBits(\n                    other.getDoubleValue()));\n            break;\n          case 14:\n            result = result && (getBooleanValue()\n                == other.getBooleanValue());\n            break;\n          case 15:\n            result = result && getStringValue()\n                .equals(other.getStringValue());\n            break;\n          case 16:\n            result = result && getBytesValue()\n                
.equals(other.getBytesValue());\n            break;\n          case 17:\n            result = result && getDatasetValue()\n                .equals(other.getDatasetValue());\n            break;\n          case 18:\n            result = result && getTemplateValue()\n                .equals(other.getTemplateValue());\n            break;\n          case 19:\n            result = result && getExtensionValue()\n                .equals(other.getExtensionValue());\n            break;\n          case 0:\n          default:\n        }\n        result = result && unknownFields.equals(other.unknownFields);\n        return result;\n      }\n\n      @java.lang.Override\n      public int hashCode() {\n        if (memoizedHashCode != 0) {\n          return memoizedHashCode;\n        }\n        int hash = 41;\n        hash = (19 * hash) + getDescriptorForType().hashCode();\n        if (hasName()) {\n          hash = (37 * hash) + NAME_FIELD_NUMBER;\n          hash = (53 * hash) + getName().hashCode();\n        }\n        if (hasAlias()) {\n          hash = (37 * hash) + ALIAS_FIELD_NUMBER;\n          hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n              getAlias());\n        }\n        if (hasTimestamp()) {\n          hash = (37 * hash) + TIMESTAMP_FIELD_NUMBER;\n          hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n              getTimestamp());\n        }\n        if (hasDatatype()) {\n          hash = (37 * hash) + DATATYPE_FIELD_NUMBER;\n          hash = (53 * hash) + getDatatype();\n        }\n        if (hasIsHistorical()) {\n          hash = (37 * hash) + IS_HISTORICAL_FIELD_NUMBER;\n          hash = (53 * hash) + com.google.protobuf.Internal.hashBoolean(\n              getIsHistorical());\n        }\n        if (hasIsTransient()) {\n          hash = (37 * hash) + IS_TRANSIENT_FIELD_NUMBER;\n          hash = (53 * hash) + com.google.protobuf.Internal.hashBoolean(\n              getIsTransient());\n        }\n        if (hasIsNull()) {\n  
        hash = (37 * hash) + IS_NULL_FIELD_NUMBER;\n          hash = (53 * hash) + com.google.protobuf.Internal.hashBoolean(\n              getIsNull());\n        }\n        if (hasMetadata()) {\n          hash = (37 * hash) + METADATA_FIELD_NUMBER;\n          hash = (53 * hash) + getMetadata().hashCode();\n        }\n        if (hasProperties()) {\n          hash = (37 * hash) + PROPERTIES_FIELD_NUMBER;\n          hash = (53 * hash) + getProperties().hashCode();\n        }\n        switch (valueCase_) {\n          case 10:\n            hash = (37 * hash) + INT_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + getIntValue();\n            break;\n          case 11:\n            hash = (37 * hash) + LONG_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n                getLongValue());\n            break;\n          case 12:\n            hash = (37 * hash) + FLOAT_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + java.lang.Float.floatToIntBits(\n                getFloatValue());\n            break;\n          case 13:\n            hash = (37 * hash) + DOUBLE_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n                java.lang.Double.doubleToLongBits(getDoubleValue()));\n            break;\n          case 14:\n            hash = (37 * hash) + BOOLEAN_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + com.google.protobuf.Internal.hashBoolean(\n                getBooleanValue());\n            break;\n          case 15:\n            hash = (37 * hash) + STRING_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + getStringValue().hashCode();\n            break;\n          case 16:\n            hash = (37 * hash) + BYTES_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + getBytesValue().hashCode();\n            break;\n          case 17:\n            hash = (37 * hash) + DATASET_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + getDatasetValue().hashCode();\n   
         break;\n          case 18:\n            hash = (37 * hash) + TEMPLATE_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + getTemplateValue().hashCode();\n            break;\n          case 19:\n            hash = (37 * hash) + EXTENSION_VALUE_FIELD_NUMBER;\n            hash = (53 * hash) + getExtensionValue().hashCode();\n            break;\n          case 0:\n          default:\n        }\n        hash = (29 * hash) + unknownFields.hashCode();\n        memoizedHashCode = hash;\n        return hash;\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric parseFrom(\n          com.google.protobuf.ByteString data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric parseFrom(\n          com.google.protobuf.ByteString data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric parseFrom(byte[] data)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric parseFrom(\n          byte[] data,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n        return PARSER.parseFrom(data, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric parseFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric parseFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric parseDelimitedFrom(java.io.InputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric parseDelimitedFrom(\n          java.io.InputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric parseFrom(\n          com.google.protobuf.CodedInputStream input)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input);\n      }\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric parseFrom(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        return com.google.protobuf.GeneratedMessageV3\n            .parseWithIOException(PARSER, input, extensionRegistry);\n      }\n\n      public Builder newBuilderForType() { return newBuilder(); }\n      public static Builder newBuilder() {\n        return DEFAULT_INSTANCE.toBuilder();\n      }\n      public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric prototype) {\n 
       return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n      }\n      public Builder toBuilder() {\n        return this == DEFAULT_INSTANCE\n            ? new Builder() : new Builder().mergeFrom(this);\n      }\n\n      @java.lang.Override\n      protected Builder newBuilderForType(\n          com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n        Builder builder = new Builder(parent);\n        return builder;\n      }\n      /**\n       * Protobuf type {@code org.eclipse.tahu.protobuf.Payload.Metric}\n       */\n      public static final class Builder extends\n          com.google.protobuf.GeneratedMessageV3.Builder<Builder> implements\n          // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload.Metric)\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder {\n        public static final com.google.protobuf.Descriptors.Descriptor\n            getDescriptor() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Metric_descriptor;\n        }\n\n        protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n            internalGetFieldAccessorTable() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Metric_fieldAccessorTable\n              .ensureFieldAccessorsInitialized(\n                  org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder.class);\n        }\n\n        // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.newBuilder()\n        private Builder() {\n          maybeForceBuilderInitialization();\n        }\n\n        private Builder(\n            com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n          super(parent);\n          maybeForceBuilderInitialization();\n        }\n        private void 
maybeForceBuilderInitialization() {\n          if (com.google.protobuf.GeneratedMessageV3\n                  .alwaysUseFieldBuilders) {\n            getMetadataFieldBuilder();\n            getPropertiesFieldBuilder();\n          }\n        }\n        public Builder clear() {\n          super.clear();\n          name_ = \"\";\n          bitField0_ = (bitField0_ & ~0x00000001);\n          alias_ = 0L;\n          bitField0_ = (bitField0_ & ~0x00000002);\n          timestamp_ = 0L;\n          bitField0_ = (bitField0_ & ~0x00000004);\n          datatype_ = 0;\n          bitField0_ = (bitField0_ & ~0x00000008);\n          isHistorical_ = false;\n          bitField0_ = (bitField0_ & ~0x00000010);\n          isTransient_ = false;\n          bitField0_ = (bitField0_ & ~0x00000020);\n          isNull_ = false;\n          bitField0_ = (bitField0_ & ~0x00000040);\n          if (metadataBuilder_ == null) {\n            metadata_ = null;\n          } else {\n            metadataBuilder_.clear();\n          }\n          bitField0_ = (bitField0_ & ~0x00000080);\n          if (propertiesBuilder_ == null) {\n            properties_ = null;\n          } else {\n            propertiesBuilder_.clear();\n          }\n          bitField0_ = (bitField0_ & ~0x00000100);\n          valueCase_ = 0;\n          value_ = null;\n          return this;\n        }\n\n        public com.google.protobuf.Descriptors.Descriptor\n            getDescriptorForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_Metric_descriptor;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric getDefaultInstanceForType() {\n          return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.getDefaultInstance();\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric build() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric result = buildPartial();\n          if 
(!result.isInitialized()) {\n            throw newUninitializedMessageException(result);\n          }\n          return result;\n        }\n\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric buildPartial() {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric(this);\n          int from_bitField0_ = bitField0_;\n          int to_bitField0_ = 0;\n          if (((from_bitField0_ & 0x00000001) == 0x00000001)) {\n            to_bitField0_ |= 0x00000001;\n          }\n          result.name_ = name_;\n          if (((from_bitField0_ & 0x00000002) == 0x00000002)) {\n            to_bitField0_ |= 0x00000002;\n          }\n          result.alias_ = alias_;\n          if (((from_bitField0_ & 0x00000004) == 0x00000004)) {\n            to_bitField0_ |= 0x00000004;\n          }\n          result.timestamp_ = timestamp_;\n          if (((from_bitField0_ & 0x00000008) == 0x00000008)) {\n            to_bitField0_ |= 0x00000008;\n          }\n          result.datatype_ = datatype_;\n          if (((from_bitField0_ & 0x00000010) == 0x00000010)) {\n            to_bitField0_ |= 0x00000010;\n          }\n          result.isHistorical_ = isHistorical_;\n          if (((from_bitField0_ & 0x00000020) == 0x00000020)) {\n            to_bitField0_ |= 0x00000020;\n          }\n          result.isTransient_ = isTransient_;\n          if (((from_bitField0_ & 0x00000040) == 0x00000040)) {\n            to_bitField0_ |= 0x00000040;\n          }\n          result.isNull_ = isNull_;\n          if (((from_bitField0_ & 0x00000080) == 0x00000080)) {\n            to_bitField0_ |= 0x00000080;\n          }\n          if (metadataBuilder_ == null) {\n            result.metadata_ = metadata_;\n          } else {\n            result.metadata_ = metadataBuilder_.build();\n          }\n          if (((from_bitField0_ & 0x00000100) == 0x00000100)) {\n            to_bitField0_ |= 0x00000100;\n          }\n  
        if (propertiesBuilder_ == null) {\n            result.properties_ = properties_;\n          } else {\n            result.properties_ = propertiesBuilder_.build();\n          }\n          if (valueCase_ == 10) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 11) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 12) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 13) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 14) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 15) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 16) {\n            result.value_ = value_;\n          }\n          if (valueCase_ == 17) {\n            if (datasetValueBuilder_ == null) {\n              result.value_ = value_;\n            } else {\n              result.value_ = datasetValueBuilder_.build();\n            }\n          }\n          if (valueCase_ == 18) {\n            if (templateValueBuilder_ == null) {\n              result.value_ = value_;\n            } else {\n              result.value_ = templateValueBuilder_.build();\n            }\n          }\n          if (valueCase_ == 19) {\n            if (extensionValueBuilder_ == null) {\n              result.value_ = value_;\n            } else {\n              result.value_ = extensionValueBuilder_.build();\n            }\n          }\n          result.bitField0_ = to_bitField0_;\n          result.valueCase_ = valueCase_;\n          onBuilt();\n          return result;\n        }\n\n        public Builder clone() {\n          return (Builder) super.clone();\n        }\n        public Builder setField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.setField(field, value);\n        }\n        public Builder clearField(\n            
com.google.protobuf.Descriptors.FieldDescriptor field) {\n          return (Builder) super.clearField(field);\n        }\n        public Builder clearOneof(\n            com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n          return (Builder) super.clearOneof(oneof);\n        }\n        public Builder setRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            int index, Object value) {\n          return (Builder) super.setRepeatedField(field, index, value);\n        }\n        public Builder addRepeatedField(\n            com.google.protobuf.Descriptors.FieldDescriptor field,\n            Object value) {\n          return (Builder) super.addRepeatedField(field, value);\n        }\n        public Builder mergeFrom(com.google.protobuf.Message other) {\n          if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric) {\n            return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric)other);\n          } else {\n            super.mergeFrom(other);\n            return this;\n          }\n        }\n\n        public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric other) {\n          if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.getDefaultInstance()) return this;\n          if (other.hasName()) {\n            bitField0_ |= 0x00000001;\n            name_ = other.name_;\n            onChanged();\n          }\n          if (other.hasAlias()) {\n            setAlias(other.getAlias());\n          }\n          if (other.hasTimestamp()) {\n            setTimestamp(other.getTimestamp());\n          }\n          if (other.hasDatatype()) {\n            setDatatype(other.getDatatype());\n          }\n          if (other.hasIsHistorical()) {\n            setIsHistorical(other.getIsHistorical());\n          }\n          if (other.hasIsTransient()) {\n            setIsTransient(other.getIsTransient());\n          }\n          if 
(other.hasIsNull()) {\n            setIsNull(other.getIsNull());\n          }\n          if (other.hasMetadata()) {\n            mergeMetadata(other.getMetadata());\n          }\n          if (other.hasProperties()) {\n            mergeProperties(other.getProperties());\n          }\n          switch (other.getValueCase()) {\n            case INT_VALUE: {\n              setIntValue(other.getIntValue());\n              break;\n            }\n            case LONG_VALUE: {\n              setLongValue(other.getLongValue());\n              break;\n            }\n            case FLOAT_VALUE: {\n              setFloatValue(other.getFloatValue());\n              break;\n            }\n            case DOUBLE_VALUE: {\n              setDoubleValue(other.getDoubleValue());\n              break;\n            }\n            case BOOLEAN_VALUE: {\n              setBooleanValue(other.getBooleanValue());\n              break;\n            }\n            case STRING_VALUE: {\n              valueCase_ = 15;\n              value_ = other.value_;\n              onChanged();\n              break;\n            }\n            case BYTES_VALUE: {\n              setBytesValue(other.getBytesValue());\n              break;\n            }\n            case DATASET_VALUE: {\n              mergeDatasetValue(other.getDatasetValue());\n              break;\n            }\n            case TEMPLATE_VALUE: {\n              mergeTemplateValue(other.getTemplateValue());\n              break;\n            }\n            case EXTENSION_VALUE: {\n              mergeExtensionValue(other.getExtensionValue());\n              break;\n            }\n            case VALUE_NOT_SET: {\n              break;\n            }\n          }\n          this.mergeUnknownFields(other.unknownFields);\n          onChanged();\n          return this;\n        }\n\n        public final boolean isInitialized() {\n          if (hasMetadata()) {\n            if (!getMetadata().isInitialized()) {\n              return 
false;\n            }\n          }\n          if (hasProperties()) {\n            if (!getProperties().isInitialized()) {\n              return false;\n            }\n          }\n          if (hasDatasetValue()) {\n            if (!getDatasetValue().isInitialized()) {\n              return false;\n            }\n          }\n          if (hasTemplateValue()) {\n            if (!getTemplateValue().isInitialized()) {\n              return false;\n            }\n          }\n          if (hasExtensionValue()) {\n            if (!getExtensionValue().isInitialized()) {\n              return false;\n            }\n          }\n          return true;\n        }\n\n        public Builder mergeFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws java.io.IOException {\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric parsedMessage = null;\n          try {\n            parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n          } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n            parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric) e.getUnfinishedMessage();\n            throw e.unwrapIOException();\n          } finally {\n            if (parsedMessage != null) {\n              mergeFrom(parsedMessage);\n            }\n          }\n          return this;\n        }\n        private int valueCase_ = 0;\n        private java.lang.Object value_;\n        public ValueCase\n            getValueCase() {\n          return ValueCase.forNumber(\n              valueCase_);\n        }\n\n        public Builder clearValue() {\n          valueCase_ = 0;\n          value_ = null;\n          onChanged();\n          return this;\n        }\n\n        private int bitField0_;\n\n        private java.lang.Object name_ = \"\";\n        /**\n         * <pre>\n         * Metric name - should only be included on 
birth\n         * </pre>\n         *\n         * <code>optional string name = 1;</code>\n         */\n        public boolean hasName() {\n          return ((bitField0_ & 0x00000001) == 0x00000001);\n        }\n        /**\n         * <pre>\n         * Metric name - should only be included on birth\n         * </pre>\n         *\n         * <code>optional string name = 1;</code>\n         */\n        public java.lang.String getName() {\n          java.lang.Object ref = name_;\n          if (!(ref instanceof java.lang.String)) {\n            com.google.protobuf.ByteString bs =\n                (com.google.protobuf.ByteString) ref;\n            java.lang.String s = bs.toStringUtf8();\n            if (bs.isValidUtf8()) {\n              name_ = s;\n            }\n            return s;\n          } else {\n            return (java.lang.String) ref;\n          }\n        }\n        /**\n         * <pre>\n         * Metric name - should only be included on birth\n         * </pre>\n         *\n         * <code>optional string name = 1;</code>\n         */\n        public com.google.protobuf.ByteString\n            getNameBytes() {\n          java.lang.Object ref = name_;\n          if (ref instanceof String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n                    (java.lang.String) ref);\n            name_ = b;\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n        /**\n         * <pre>\n         * Metric name - should only be included on birth\n         * </pre>\n         *\n         * <code>optional string name = 1;</code>\n         */\n        public Builder setName(\n            java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000001;\n          name_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * 
<pre>\n         * Metric name - should only be included on birth\n         * </pre>\n         *\n         * <code>optional string name = 1;</code>\n         */\n        public Builder clearName() {\n          bitField0_ = (bitField0_ & ~0x00000001);\n          name_ = getDefaultInstance().getName();\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Metric name - should only be included on birth\n         * </pre>\n         *\n         * <code>optional string name = 1;</code>\n         */\n        public Builder setNameBytes(\n            com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000001;\n          name_ = value;\n          onChanged();\n          return this;\n        }\n\n        private long alias_ ;\n        /**\n         * <pre>\n         * Metric alias - tied to name on birth and included in all later DATA messages\n         * </pre>\n         *\n         * <code>optional uint64 alias = 2;</code>\n         */\n        public boolean hasAlias() {\n          return ((bitField0_ & 0x00000002) == 0x00000002);\n        }\n        /**\n         * <pre>\n         * Metric alias - tied to name on birth and included in all later DATA messages\n         * </pre>\n         *\n         * <code>optional uint64 alias = 2;</code>\n         */\n        public long getAlias() {\n          return alias_;\n        }\n        /**\n         * <pre>\n         * Metric alias - tied to name on birth and included in all later DATA messages\n         * </pre>\n         *\n         * <code>optional uint64 alias = 2;</code>\n         */\n        public Builder setAlias(long value) {\n          bitField0_ |= 0x00000002;\n          alias_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Metric alias - tied to name on birth and included in all later DATA messages\n         * </pre>\n   
      *\n         * <code>optional uint64 alias = 2;</code>\n         */\n        public Builder clearAlias() {\n          bitField0_ = (bitField0_ & ~0x00000002);\n          alias_ = 0L;\n          onChanged();\n          return this;\n        }\n\n        private long timestamp_ ;\n        /**\n         * <pre>\n         * Timestamp associated with data acquisition time\n         * </pre>\n         *\n         * <code>optional uint64 timestamp = 3;</code>\n         */\n        public boolean hasTimestamp() {\n          return ((bitField0_ & 0x00000004) == 0x00000004);\n        }\n        /**\n         * <pre>\n         * Timestamp associated with data acquisition time\n         * </pre>\n         *\n         * <code>optional uint64 timestamp = 3;</code>\n         */\n        public long getTimestamp() {\n          return timestamp_;\n        }\n        /**\n         * <pre>\n         * Timestamp associated with data acquisition time\n         * </pre>\n         *\n         * <code>optional uint64 timestamp = 3;</code>\n         */\n        public Builder setTimestamp(long value) {\n          bitField0_ |= 0x00000004;\n          timestamp_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Timestamp associated with data acquisition time\n         * </pre>\n         *\n         * <code>optional uint64 timestamp = 3;</code>\n         */\n        public Builder clearTimestamp() {\n          bitField0_ = (bitField0_ & ~0x00000004);\n          timestamp_ = 0L;\n          onChanged();\n          return this;\n        }\n\n        private int datatype_ ;\n        /**\n         * <pre>\n         * DataType of the metric/tag value\n         * </pre>\n         *\n         * <code>optional uint32 datatype = 4;</code>\n         */\n        public boolean hasDatatype() {\n          return ((bitField0_ & 0x00000008) == 0x00000008);\n        }\n        /**\n         * <pre>\n         * DataType of the metric/tag value\n 
        * </pre>\n         *\n         * <code>optional uint32 datatype = 4;</code>\n         */\n        public int getDatatype() {\n          return datatype_;\n        }\n        /**\n         * <pre>\n         * DataType of the metric/tag value\n         * </pre>\n         *\n         * <code>optional uint32 datatype = 4;</code>\n         */\n        public Builder setDatatype(int value) {\n          bitField0_ |= 0x00000008;\n          datatype_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * DataType of the metric/tag value\n         * </pre>\n         *\n         * <code>optional uint32 datatype = 4;</code>\n         */\n        public Builder clearDatatype() {\n          bitField0_ = (bitField0_ & ~0x00000008);\n          datatype_ = 0;\n          onChanged();\n          return this;\n        }\n\n        private boolean isHistorical_ ;\n        /**\n         * <pre>\n         * If this is historical data and should not update real time tag\n         * </pre>\n         *\n         * <code>optional bool is_historical = 5;</code>\n         */\n        public boolean hasIsHistorical() {\n          return ((bitField0_ & 0x00000010) == 0x00000010);\n        }\n        /**\n         * <pre>\n         * If this is historical data and should not update real time tag\n         * </pre>\n         *\n         * <code>optional bool is_historical = 5;</code>\n         */\n        public boolean getIsHistorical() {\n          return isHistorical_;\n        }\n        /**\n         * <pre>\n         * If this is historical data and should not update real time tag\n         * </pre>\n         *\n         * <code>optional bool is_historical = 5;</code>\n         */\n        public Builder setIsHistorical(boolean value) {\n          bitField0_ |= 0x00000010;\n          isHistorical_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * If this is historical 
data and should not update real time tag\n         * </pre>\n         *\n         * <code>optional bool is_historical = 5;</code>\n         */\n        public Builder clearIsHistorical() {\n          bitField0_ = (bitField0_ & ~0x00000010);\n          isHistorical_ = false;\n          onChanged();\n          return this;\n        }\n\n        private boolean isTransient_ ;\n        /**\n         * <pre>\n         * Tells consuming clients such as MQTT Engine to not store this as a tag\n         * </pre>\n         *\n         * <code>optional bool is_transient = 6;</code>\n         */\n        public boolean hasIsTransient() {\n          return ((bitField0_ & 0x00000020) == 0x00000020);\n        }\n        /**\n         * <pre>\n         * Tells consuming clients such as MQTT Engine to not store this as a tag\n         * </pre>\n         *\n         * <code>optional bool is_transient = 6;</code>\n         */\n        public boolean getIsTransient() {\n          return isTransient_;\n        }\n        /**\n         * <pre>\n         * Tells consuming clients such as MQTT Engine to not store this as a tag\n         * </pre>\n         *\n         * <code>optional bool is_transient = 6;</code>\n         */\n        public Builder setIsTransient(boolean value) {\n          bitField0_ |= 0x00000020;\n          isTransient_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Tells consuming clients such as MQTT Engine to not store this as a tag\n         * </pre>\n         *\n         * <code>optional bool is_transient = 6;</code>\n         */\n        public Builder clearIsTransient() {\n          bitField0_ = (bitField0_ & ~0x00000020);\n          isTransient_ = false;\n          onChanged();\n          return this;\n        }\n\n        private boolean isNull_ ;\n        /**\n         * <pre>\n         * If this is null - explicitly say so rather than using -1, false, etc for some datatypes.\n         * 
</pre>\n         *\n         * <code>optional bool is_null = 7;</code>\n         */\n        public boolean hasIsNull() {\n          return ((bitField0_ & 0x00000040) == 0x00000040);\n        }\n        /**\n         * <pre>\n         * If this is null - explicitly say so rather than using -1, false, etc for some datatypes.\n         * </pre>\n         *\n         * <code>optional bool is_null = 7;</code>\n         */\n        public boolean getIsNull() {\n          return isNull_;\n        }\n        /**\n         * <pre>\n         * If this is null - explicitly say so rather than using -1, false, etc for some datatypes.\n         * </pre>\n         *\n         * <code>optional bool is_null = 7;</code>\n         */\n        public Builder setIsNull(boolean value) {\n          bitField0_ |= 0x00000040;\n          isNull_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * If this is null - explicitly say so rather than using -1, false, etc for some datatypes.\n         * </pre>\n         *\n         * <code>optional bool is_null = 7;</code>\n         */\n        public Builder clearIsNull() {\n          bitField0_ = (bitField0_ & ~0x00000040);\n          isNull_ = false;\n          onChanged();\n          return this;\n        }\n\n        private org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData metadata_ = null;\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaDataOrBuilder> metadataBuilder_;\n        /**\n         * <pre>\n         * Metadata for the payload\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n         */\n        public boolean hasMetadata() {\n          return ((bitField0_ & 0x00000080) == 0x00000080);\n   
     }\n        /**\n         * <pre>\n         * Metadata for the payload\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData getMetadata() {\n          if (metadataBuilder_ == null) {\n            return metadata_ == null ? org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.getDefaultInstance() : metadata_;\n          } else {\n            return metadataBuilder_.getMessage();\n          }\n        }\n        /**\n         * <pre>\n         * Metadata for the payload\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n         */\n        public Builder setMetadata(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData value) {\n          if (metadataBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            metadata_ = value;\n            onChanged();\n          } else {\n            metadataBuilder_.setMessage(value);\n          }\n          bitField0_ |= 0x00000080;\n          return this;\n        }\n        /**\n         * <pre>\n         * Metadata for the payload\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n         */\n        public Builder setMetadata(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.Builder builderForValue) {\n          if (metadataBuilder_ == null) {\n            metadata_ = builderForValue.build();\n            onChanged();\n          } else {\n            metadataBuilder_.setMessage(builderForValue.build());\n          }\n          bitField0_ |= 0x00000080;\n          return this;\n        }\n        /**\n         * <pre>\n         * Metadata for the payload\n         * </pre>\n         *\n         * <code>optional 
.org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n         */\n        public Builder mergeMetadata(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData value) {\n          if (metadataBuilder_ == null) {\n            if (((bitField0_ & 0x00000080) == 0x00000080) &&\n                metadata_ != null &&\n                metadata_ != org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.getDefaultInstance()) {\n              metadata_ =\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.newBuilder(metadata_).mergeFrom(value).buildPartial();\n            } else {\n              metadata_ = value;\n            }\n            onChanged();\n          } else {\n            metadataBuilder_.mergeFrom(value);\n          }\n          bitField0_ |= 0x00000080;\n          return this;\n        }\n        /**\n         * <pre>\n         * Metadata for the payload\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n         */\n        public Builder clearMetadata() {\n          if (metadataBuilder_ == null) {\n            metadata_ = null;\n            onChanged();\n          } else {\n            metadataBuilder_.clear();\n          }\n          bitField0_ = (bitField0_ & ~0x00000080);\n          return this;\n        }\n        /**\n         * <pre>\n         * Metadata for the payload\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.Builder getMetadataBuilder() {\n          bitField0_ |= 0x00000080;\n          onChanged();\n          return getMetadataFieldBuilder().getBuilder();\n        }\n        /**\n         * <pre>\n         * Metadata for the payload\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n         */\n  
      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaDataOrBuilder getMetadataOrBuilder() {\n          if (metadataBuilder_ != null) {\n            return metadataBuilder_.getMessageOrBuilder();\n          } else {\n            return metadata_ == null ?\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.getDefaultInstance() : metadata_;\n          }\n        }\n        /**\n         * <pre>\n         * Metadata for the payload\n         * </pre>\n         *\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.MetaData metadata = 8;</code>\n         */\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaDataOrBuilder> \n            getMetadataFieldBuilder() {\n          if (metadataBuilder_ == null) {\n            metadataBuilder_ = new com.google.protobuf.SingleFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaData.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetaDataOrBuilder>(\n                    getMetadata(),\n                    getParentForChildren(),\n                    isClean());\n            metadata_ = null;\n          }\n          return metadataBuilder_;\n        }\n\n        private org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet properties_ = null;\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder> propertiesBuilder_;\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n         */\n        public 
boolean hasProperties() {\n          return ((bitField0_ & 0x00000100) == 0x00000100);\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet getProperties() {\n          if (propertiesBuilder_ == null) {\n            return properties_ == null ? org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance() : properties_;\n          } else {\n            return propertiesBuilder_.getMessage();\n          }\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n         */\n        public Builder setProperties(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet value) {\n          if (propertiesBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            properties_ = value;\n            onChanged();\n          } else {\n            propertiesBuilder_.setMessage(value);\n          }\n          bitField0_ |= 0x00000100;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n         */\n        public Builder setProperties(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder builderForValue) {\n          if (propertiesBuilder_ == null) {\n            properties_ = builderForValue.build();\n            onChanged();\n          } else {\n            propertiesBuilder_.setMessage(builderForValue.build());\n          }\n          bitField0_ |= 0x00000100;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n         */\n        public Builder mergeProperties(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet value) {\n          if 
(propertiesBuilder_ == null) {\n            if (((bitField0_ & 0x00000100) == 0x00000100) &&\n                properties_ != null &&\n                properties_ != org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance()) {\n              properties_ =\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.newBuilder(properties_).mergeFrom(value).buildPartial();\n            } else {\n              properties_ = value;\n            }\n            onChanged();\n          } else {\n            propertiesBuilder_.mergeFrom(value);\n          }\n          bitField0_ |= 0x00000100;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n         */\n        public Builder clearProperties() {\n          if (propertiesBuilder_ == null) {\n            properties_ = null;\n            onChanged();\n          } else {\n            propertiesBuilder_.clear();\n          }\n          bitField0_ = (bitField0_ & ~0x00000100);\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder getPropertiesBuilder() {\n          bitField0_ |= 0x00000100;\n          onChanged();\n          return getPropertiesFieldBuilder().getBuilder();\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder getPropertiesOrBuilder() {\n          if (propertiesBuilder_ != null) {\n            return propertiesBuilder_.getMessageOrBuilder();\n          } else {\n            return properties_ == null ?\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.getDefaultInstance() : properties_;\n          
}\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.PropertySet properties = 9;</code>\n         */\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder> \n            getPropertiesFieldBuilder() {\n          if (propertiesBuilder_ == null) {\n            propertiesBuilder_ = new com.google.protobuf.SingleFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySet.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.PropertySetOrBuilder>(\n                    getProperties(),\n                    getParentForChildren(),\n                    isClean());\n            properties_ = null;\n          }\n          return propertiesBuilder_;\n        }\n\n        /**\n         * <code>optional uint32 int_value = 10;</code>\n         */\n        public boolean hasIntValue() {\n          return valueCase_ == 10;\n        }\n        /**\n         * <code>optional uint32 int_value = 10;</code>\n         */\n        public int getIntValue() {\n          if (valueCase_ == 10) {\n            return (java.lang.Integer) value_;\n          }\n          return 0;\n        }\n        /**\n         * <code>optional uint32 int_value = 10;</code>\n         */\n        public Builder setIntValue(int value) {\n          valueCase_ = 10;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional uint32 int_value = 10;</code>\n         */\n        public Builder clearIntValue() {\n          if (valueCase_ == 10) {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n\n        /**\n         
* <code>optional uint64 long_value = 11;</code>\n         */\n        public boolean hasLongValue() {\n          return valueCase_ == 11;\n        }\n        /**\n         * <code>optional uint64 long_value = 11;</code>\n         */\n        public long getLongValue() {\n          if (valueCase_ == 11) {\n            return (java.lang.Long) value_;\n          }\n          return 0L;\n        }\n        /**\n         * <code>optional uint64 long_value = 11;</code>\n         */\n        public Builder setLongValue(long value) {\n          valueCase_ = 11;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional uint64 long_value = 11;</code>\n         */\n        public Builder clearLongValue() {\n          if (valueCase_ == 11) {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n\n        /**\n         * <code>optional float float_value = 12;</code>\n         */\n        public boolean hasFloatValue() {\n          return valueCase_ == 12;\n        }\n        /**\n         * <code>optional float float_value = 12;</code>\n         */\n        public float getFloatValue() {\n          if (valueCase_ == 12) {\n            return (java.lang.Float) value_;\n          }\n          return 0F;\n        }\n        /**\n         * <code>optional float float_value = 12;</code>\n         */\n        public Builder setFloatValue(float value) {\n          valueCase_ = 12;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional float float_value = 12;</code>\n         */\n        public Builder clearFloatValue() {\n          if (valueCase_ == 12) {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n\n        /**\n         * <code>optional double double_value = 13;</code>\n         */\n 
       public boolean hasDoubleValue() {\n          return valueCase_ == 13;\n        }\n        /**\n         * <code>optional double double_value = 13;</code>\n         */\n        public double getDoubleValue() {\n          if (valueCase_ == 13) {\n            return (java.lang.Double) value_;\n          }\n          return 0D;\n        }\n        /**\n         * <code>optional double double_value = 13;</code>\n         */\n        public Builder setDoubleValue(double value) {\n          valueCase_ = 13;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional double double_value = 13;</code>\n         */\n        public Builder clearDoubleValue() {\n          if (valueCase_ == 13) {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n\n        /**\n         * <code>optional bool boolean_value = 14;</code>\n         */\n        public boolean hasBooleanValue() {\n          return valueCase_ == 14;\n        }\n        /**\n         * <code>optional bool boolean_value = 14;</code>\n         */\n        public boolean getBooleanValue() {\n          if (valueCase_ == 14) {\n            return (java.lang.Boolean) value_;\n          }\n          return false;\n        }\n        /**\n         * <code>optional bool boolean_value = 14;</code>\n         */\n        public Builder setBooleanValue(boolean value) {\n          valueCase_ = 14;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional bool boolean_value = 14;</code>\n         */\n        public Builder clearBooleanValue() {\n          if (valueCase_ == 14) {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n\n        /**\n         * <code>optional string string_value = 15;</code>\n         */\n        public boolean 
hasStringValue() {\n          return valueCase_ == 15;\n        }\n        /**\n         * <code>optional string string_value = 15;</code>\n         */\n        public java.lang.String getStringValue() {\n          java.lang.Object ref = \"\";\n          if (valueCase_ == 15) {\n            ref = value_;\n          }\n          if (!(ref instanceof java.lang.String)) {\n            com.google.protobuf.ByteString bs =\n                (com.google.protobuf.ByteString) ref;\n            java.lang.String s = bs.toStringUtf8();\n            if (valueCase_ == 15) {\n              if (bs.isValidUtf8()) {\n                value_ = s;\n              }\n            }\n            return s;\n          } else {\n            return (java.lang.String) ref;\n          }\n        }\n        /**\n         * <code>optional string string_value = 15;</code>\n         */\n        public com.google.protobuf.ByteString\n            getStringValueBytes() {\n          java.lang.Object ref = \"\";\n          if (valueCase_ == 15) {\n            ref = value_;\n          }\n          if (ref instanceof String) {\n            com.google.protobuf.ByteString b = \n                com.google.protobuf.ByteString.copyFromUtf8(\n                    (java.lang.String) ref);\n            if (valueCase_ == 15) {\n              value_ = b;\n            }\n            return b;\n          } else {\n            return (com.google.protobuf.ByteString) ref;\n          }\n        }\n        /**\n         * <code>optional string string_value = 15;</code>\n         */\n        public Builder setStringValue(\n            java.lang.String value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  valueCase_ = 15;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <code>optional string string_value = 15;</code>\n         */\n        public Builder clearStringValue() {\n          if (valueCase_ == 15) {\n            valueCase_ 
= 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n        /**\n         * <code>optional string string_value = 15;</code>\n         */\n        public Builder setStringValueBytes(\n            com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  valueCase_ = 15;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n\n        /**\n         * <pre>\n         * Bytes, File\n         * </pre>\n         *\n         * <code>optional bytes bytes_value = 16;</code>\n         */\n        public boolean hasBytesValue() {\n          return valueCase_ == 16;\n        }\n        /**\n         * <pre>\n         * Bytes, File\n         * </pre>\n         *\n         * <code>optional bytes bytes_value = 16;</code>\n         */\n        public com.google.protobuf.ByteString getBytesValue() {\n          if (valueCase_ == 16) {\n            return (com.google.protobuf.ByteString) value_;\n          }\n          return com.google.protobuf.ByteString.EMPTY;\n        }\n        /**\n         * <pre>\n         * Bytes, File\n         * </pre>\n         *\n         * <code>optional bytes bytes_value = 16;</code>\n         */\n        public Builder setBytesValue(com.google.protobuf.ByteString value) {\n          if (value == null) {\n    throw new NullPointerException();\n  }\n  valueCase_ = 16;\n          value_ = value;\n          onChanged();\n          return this;\n        }\n        /**\n         * <pre>\n         * Bytes, File\n         * </pre>\n         *\n         * <code>optional bytes bytes_value = 16;</code>\n         */\n        public Builder clearBytesValue() {\n          if (valueCase_ == 16) {\n            valueCase_ = 0;\n            value_ = null;\n            onChanged();\n          }\n          return this;\n        }\n\n        private com.google.protobuf.SingleFieldBuilderV3<\n            
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSetOrBuilder> datasetValueBuilder_;\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n         */\n        public boolean hasDatasetValue() {\n          return valueCase_ == 17;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet getDatasetValue() {\n          if (datasetValueBuilder_ == null) {\n            if (valueCase_ == 17) {\n              return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) value_;\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.getDefaultInstance();\n          } else {\n            if (valueCase_ == 17) {\n              return datasetValueBuilder_.getMessage();\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.getDefaultInstance();\n          }\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n         */\n        public Builder setDatasetValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet value) {\n          if (datasetValueBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            value_ = value;\n            onChanged();\n          } else {\n            datasetValueBuilder_.setMessage(value);\n          }\n          valueCase_ = 17;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n         */\n        public Builder setDatasetValue(\n            
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Builder builderForValue) {\n          if (datasetValueBuilder_ == null) {\n            value_ = builderForValue.build();\n            onChanged();\n          } else {\n            datasetValueBuilder_.setMessage(builderForValue.build());\n          }\n          valueCase_ = 17;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n         */\n        public Builder mergeDatasetValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet value) {\n          if (datasetValueBuilder_ == null) {\n            if (valueCase_ == 17 &&\n                value_ != org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.getDefaultInstance()) {\n              value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.newBuilder((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) value_)\n                  .mergeFrom(value).buildPartial();\n            } else {\n              value_ = value;\n            }\n            onChanged();\n          } else {\n            if (valueCase_ == 17) {\n              datasetValueBuilder_.mergeFrom(value);\n            }\n            datasetValueBuilder_.setMessage(value);\n          }\n          valueCase_ = 17;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n         */\n        public Builder clearDatasetValue() {\n          if (datasetValueBuilder_ == null) {\n            if (valueCase_ == 17) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n          } else {\n            if (valueCase_ == 17) {\n              valueCase_ = 0;\n              value_ = null;\n            }\n            datasetValueBuilder_.clear();\n          }\n          return this;\n        }\n        /**\n         * <code>optional 
.org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Builder getDatasetValueBuilder() {\n          return getDatasetValueFieldBuilder().getBuilder();\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSetOrBuilder getDatasetValueOrBuilder() {\n          if ((valueCase_ == 17) && (datasetValueBuilder_ != null)) {\n            return datasetValueBuilder_.getMessageOrBuilder();\n          } else {\n            if (valueCase_ == 17) {\n              return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) value_;\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.getDefaultInstance();\n          }\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.DataSet dataset_value = 17;</code>\n         */\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSetOrBuilder> \n            getDatasetValueFieldBuilder() {\n          if (datasetValueBuilder_ == null) {\n            if (!(valueCase_ == 17)) {\n              value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.getDefaultInstance();\n            }\n            datasetValueBuilder_ = new com.google.protobuf.SingleFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSetOrBuilder>(\n                    (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.DataSet) value_,\n                    getParentForChildren(),\n         
           isClean());\n            value_ = null;\n          }\n          valueCase_ = 17;\n          onChanged();\n          return datasetValueBuilder_;\n        }\n\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.TemplateOrBuilder> templateValueBuilder_;\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n         */\n        public boolean hasTemplateValue() {\n          return valueCase_ == 18;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template getTemplateValue() {\n          if (templateValueBuilder_ == null) {\n            if (valueCase_ == 18) {\n              return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) value_;\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.getDefaultInstance();\n          } else {\n            if (valueCase_ == 18) {\n              return templateValueBuilder_.getMessage();\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.getDefaultInstance();\n          }\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n         */\n        public Builder setTemplateValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template value) {\n          if (templateValueBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            value_ = value;\n            onChanged();\n          } else {\n            templateValueBuilder_.setMessage(value);\n          }\n          valueCase_ = 18;\n  
        return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n         */\n        public Builder setTemplateValue(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Builder builderForValue) {\n          if (templateValueBuilder_ == null) {\n            value_ = builderForValue.build();\n            onChanged();\n          } else {\n            templateValueBuilder_.setMessage(builderForValue.build());\n          }\n          valueCase_ = 18;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n         */\n        public Builder mergeTemplateValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template value) {\n          if (templateValueBuilder_ == null) {\n            if (valueCase_ == 18 &&\n                value_ != org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.getDefaultInstance()) {\n              value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.newBuilder((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) value_)\n                  .mergeFrom(value).buildPartial();\n            } else {\n              value_ = value;\n            }\n            onChanged();\n          } else {\n            if (valueCase_ == 18) {\n              templateValueBuilder_.mergeFrom(value);\n            }\n            templateValueBuilder_.setMessage(value);\n          }\n          valueCase_ = 18;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n         */\n        public Builder clearTemplateValue() {\n          if (templateValueBuilder_ == null) {\n            if (valueCase_ == 18) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n          } else {\n            if (valueCase_ == 
18) {\n              valueCase_ = 0;\n              value_ = null;\n            }\n            templateValueBuilder_.clear();\n          }\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Builder getTemplateValueBuilder() {\n          return getTemplateValueFieldBuilder().getBuilder();\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.TemplateOrBuilder getTemplateValueOrBuilder() {\n          if ((valueCase_ == 18) && (templateValueBuilder_ != null)) {\n            return templateValueBuilder_.getMessageOrBuilder();\n          } else {\n            if (valueCase_ == 18) {\n              return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) value_;\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.getDefaultInstance();\n          }\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Template template_value = 18;</code>\n         */\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.TemplateOrBuilder> \n            getTemplateValueFieldBuilder() {\n          if (templateValueBuilder_ == null) {\n            if (!(valueCase_ == 18)) {\n              value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.getDefaultInstance();\n            }\n            templateValueBuilder_ = new com.google.protobuf.SingleFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template, 
org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.TemplateOrBuilder>(\n                    (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Template) value_,\n                    getParentForChildren(),\n                    isClean());\n            value_ = null;\n          }\n          valueCase_ = 18;\n          onChanged();\n          return templateValueBuilder_;\n        }\n\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtensionOrBuilder> extensionValueBuilder_;\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n         */\n        public boolean hasExtensionValue() {\n          return valueCase_ == 19;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension getExtensionValue() {\n          if (extensionValueBuilder_ == null) {\n            if (valueCase_ == 19) {\n              return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) value_;\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.getDefaultInstance();\n          } else {\n            if (valueCase_ == 19) {\n              return extensionValueBuilder_.getMessage();\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.getDefaultInstance();\n          }\n        }\n        /**\n         * <code>optional 
.org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n         */\n        public Builder setExtensionValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension value) {\n          if (extensionValueBuilder_ == null) {\n            if (value == null) {\n              throw new NullPointerException();\n            }\n            value_ = value;\n            onChanged();\n          } else {\n            extensionValueBuilder_.setMessage(value);\n          }\n          valueCase_ = 19;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n         */\n        public Builder setExtensionValue(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.Builder builderForValue) {\n          if (extensionValueBuilder_ == null) {\n            value_ = builderForValue.build();\n            onChanged();\n          } else {\n            extensionValueBuilder_.setMessage(builderForValue.build());\n          }\n          valueCase_ = 19;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n         */\n        public Builder mergeExtensionValue(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension value) {\n          if (extensionValueBuilder_ == null) {\n            if (valueCase_ == 19 &&\n                value_ != org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.getDefaultInstance()) {\n              value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.newBuilder((org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) value_)\n                  .mergeFrom(value).buildPartial();\n            } else {\n              value_ = value;\n            }\n  
          onChanged();\n          } else {\n            if (valueCase_ == 19) {\n              extensionValueBuilder_.mergeFrom(value);\n            }\n            extensionValueBuilder_.setMessage(value);\n          }\n          valueCase_ = 19;\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n         */\n        public Builder clearExtensionValue() {\n          if (extensionValueBuilder_ == null) {\n            if (valueCase_ == 19) {\n              valueCase_ = 0;\n              value_ = null;\n              onChanged();\n            }\n          } else {\n            if (valueCase_ == 19) {\n              valueCase_ = 0;\n              value_ = null;\n            }\n            extensionValueBuilder_.clear();\n          }\n          return this;\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.Builder getExtensionValueBuilder() {\n          return getExtensionValueFieldBuilder().getBuilder();\n        }\n        /**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n         */\n        public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtensionOrBuilder getExtensionValueOrBuilder() {\n          if ((valueCase_ == 19) && (extensionValueBuilder_ != null)) {\n            return extensionValueBuilder_.getMessageOrBuilder();\n          } else {\n            if (valueCase_ == 19) {\n              return (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) value_;\n            }\n            return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.getDefaultInstance();\n          }\n        }\n        
/**\n         * <code>optional .org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension extension_value = 19;</code>\n         */\n        private com.google.protobuf.SingleFieldBuilderV3<\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtensionOrBuilder> \n            getExtensionValueFieldBuilder() {\n          if (extensionValueBuilder_ == null) {\n            if (!(valueCase_ == 19)) {\n              value_ = org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.getDefaultInstance();\n            }\n            extensionValueBuilder_ = new com.google.protobuf.SingleFieldBuilderV3<\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtensionOrBuilder>(\n                    (org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.MetricValueExtension) value_,\n                    getParentForChildren(),\n                    isClean());\n            value_ = null;\n          }\n          valueCase_ = 19;\n          onChanged();\n          return extensionValueBuilder_;\n        }\n        public final Builder setUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.setUnknownFields(unknownFields);\n        }\n\n        public final Builder mergeUnknownFields(\n            final com.google.protobuf.UnknownFieldSet unknownFields) {\n          return super.mergeUnknownFields(unknownFields);\n        }\n\n\n        // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload.Metric)\n      }\n\n      // 
@@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.Metric)\n      private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric DEFAULT_INSTANCE;\n      static {\n        DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric();\n      }\n\n      public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric getDefaultInstance() {\n        return DEFAULT_INSTANCE;\n      }\n\n      @java.lang.Deprecated public static final com.google.protobuf.Parser<Metric>\n          PARSER = new com.google.protobuf.AbstractParser<Metric>() {\n        public Metric parsePartialFrom(\n            com.google.protobuf.CodedInputStream input,\n            com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n            throws com.google.protobuf.InvalidProtocolBufferException {\n            return new Metric(input, extensionRegistry);\n        }\n      };\n\n      public static com.google.protobuf.Parser<Metric> parser() {\n        return PARSER;\n      }\n\n      @java.lang.Override\n      public com.google.protobuf.Parser<Metric> getParserForType() {\n        return PARSER;\n      }\n\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric getDefaultInstanceForType() {\n        return DEFAULT_INSTANCE;\n      }\n\n    }\n\n    private int bitField0_;\n    public static final int TIMESTAMP_FIELD_NUMBER = 1;\n    private long timestamp_;\n    /**\n     * <pre>\n     * Timestamp at message sending time\n     * </pre>\n     *\n     * <code>optional uint64 timestamp = 1;</code>\n     */\n    public boolean hasTimestamp() {\n      return ((bitField0_ & 0x00000001) == 0x00000001);\n    }\n    /**\n     * <pre>\n     * Timestamp at message sending time\n     * </pre>\n     *\n     * <code>optional uint64 timestamp = 1;</code>\n     */\n    public long getTimestamp() {\n      return timestamp_;\n    }\n\n    public static final int METRICS_FIELD_NUMBER = 2;\n    private 
java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric> metrics_;\n    /**\n     * <pre>\n     * Repeated forever - no limit in Google Protobufs\n     * </pre>\n     *\n     * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n     */\n    public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric> getMetricsList() {\n      return metrics_;\n    }\n    /**\n     * <pre>\n     * Repeated forever - no limit in Google Protobufs\n     * </pre>\n     *\n     * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n     */\n    public java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder> \n        getMetricsOrBuilderList() {\n      return metrics_;\n    }\n    /**\n     * <pre>\n     * Repeated forever - no limit in Google Protobufs\n     * </pre>\n     *\n     * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n     */\n    public int getMetricsCount() {\n      return metrics_.size();\n    }\n    /**\n     * <pre>\n     * Repeated forever - no limit in Google Protobufs\n     * </pre>\n     *\n     * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n     */\n    public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric getMetrics(int index) {\n      return metrics_.get(index);\n    }\n    /**\n     * <pre>\n     * Repeated forever - no limit in Google Protobufs\n     * </pre>\n     *\n     * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n     */\n    public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder getMetricsOrBuilder(\n        int index) {\n      return metrics_.get(index);\n    }\n\n    public static final int SEQ_FIELD_NUMBER = 3;\n    private long seq_;\n    /**\n     * <pre>\n     * Sequence number\n     * </pre>\n     *\n     * <code>optional uint64 seq = 3;</code>\n     */\n    public boolean hasSeq() {\n      return 
((bitField0_ & 0x00000002) == 0x00000002);\n    }\n    /**\n     * <pre>\n     * Sequence number\n     * </pre>\n     *\n     * <code>optional uint64 seq = 3;</code>\n     */\n    public long getSeq() {\n      return seq_;\n    }\n\n    public static final int UUID_FIELD_NUMBER = 4;\n    private volatile java.lang.Object uuid_;\n    /**\n     * <pre>\n     * UUID to track message type in terms of schema definitions\n     * </pre>\n     *\n     * <code>optional string uuid = 4;</code>\n     */\n    public boolean hasUuid() {\n      return ((bitField0_ & 0x00000004) == 0x00000004);\n    }\n    /**\n     * <pre>\n     * UUID to track message type in terms of schema definitions\n     * </pre>\n     *\n     * <code>optional string uuid = 4;</code>\n     */\n    public java.lang.String getUuid() {\n      java.lang.Object ref = uuid_;\n      if (ref instanceof java.lang.String) {\n        return (java.lang.String) ref;\n      } else {\n        com.google.protobuf.ByteString bs = \n            (com.google.protobuf.ByteString) ref;\n        java.lang.String s = bs.toStringUtf8();\n        if (bs.isValidUtf8()) {\n          uuid_ = s;\n        }\n        return s;\n      }\n    }\n    /**\n     * <pre>\n     * UUID to track message type in terms of schema definitions\n     * </pre>\n     *\n     * <code>optional string uuid = 4;</code>\n     */\n    public com.google.protobuf.ByteString\n        getUuidBytes() {\n      java.lang.Object ref = uuid_;\n      if (ref instanceof java.lang.String) {\n        com.google.protobuf.ByteString b = \n            com.google.protobuf.ByteString.copyFromUtf8(\n                (java.lang.String) ref);\n        uuid_ = b;\n        return b;\n      } else {\n        return (com.google.protobuf.ByteString) ref;\n      }\n    }\n\n    public static final int BODY_FIELD_NUMBER = 5;\n    private com.google.protobuf.ByteString body_;\n    /**\n     * <pre>\n     * To optionally bypass the whole definition above\n     * </pre>\n     *\n     * 
<code>optional bytes body = 5;</code>\n     */\n    public boolean hasBody() {\n      return ((bitField0_ & 0x00000008) == 0x00000008);\n    }\n    /**\n     * <pre>\n     * To optionally bypass the whole definition above\n     * </pre>\n     *\n     * <code>optional bytes body = 5;</code>\n     */\n    public com.google.protobuf.ByteString getBody() {\n      return body_;\n    }\n\n    private byte memoizedIsInitialized = -1;\n    public final boolean isInitialized() {\n      byte isInitialized = memoizedIsInitialized;\n      if (isInitialized == 1) return true;\n      if (isInitialized == 0) return false;\n\n      for (int i = 0; i < getMetricsCount(); i++) {\n        if (!getMetrics(i).isInitialized()) {\n          memoizedIsInitialized = 0;\n          return false;\n        }\n      }\n      if (!extensionsAreInitialized()) {\n        memoizedIsInitialized = 0;\n        return false;\n      }\n      memoizedIsInitialized = 1;\n      return true;\n    }\n\n    public void writeTo(com.google.protobuf.CodedOutputStream output)\n                        throws java.io.IOException {\n      com.google.protobuf.GeneratedMessageV3\n        .ExtendableMessage<org.eclipse.tahu.protobuf.SparkplugBProto.Payload>.ExtensionWriter\n          extensionWriter = newExtensionWriter();\n      if (((bitField0_ & 0x00000001) == 0x00000001)) {\n        output.writeUInt64(1, timestamp_);\n      }\n      for (int i = 0; i < metrics_.size(); i++) {\n        output.writeMessage(2, metrics_.get(i));\n      }\n      if (((bitField0_ & 0x00000002) == 0x00000002)) {\n        output.writeUInt64(3, seq_);\n      }\n      if (((bitField0_ & 0x00000004) == 0x00000004)) {\n        com.google.protobuf.GeneratedMessageV3.writeString(output, 4, uuid_);\n      }\n      if (((bitField0_ & 0x00000008) == 0x00000008)) {\n        output.writeBytes(5, body_);\n      }\n      extensionWriter.writeUntil(536870912, output);\n      unknownFields.writeTo(output);\n    }\n\n    public int getSerializedSize() {\n 
     int size = memoizedSize;\n      if (size != -1) return size;\n\n      size = 0;\n      if (((bitField0_ & 0x00000001) == 0x00000001)) {\n        size += com.google.protobuf.CodedOutputStream\n          .computeUInt64Size(1, timestamp_);\n      }\n      for (int i = 0; i < metrics_.size(); i++) {\n        size += com.google.protobuf.CodedOutputStream\n          .computeMessageSize(2, metrics_.get(i));\n      }\n      if (((bitField0_ & 0x00000002) == 0x00000002)) {\n        size += com.google.protobuf.CodedOutputStream\n          .computeUInt64Size(3, seq_);\n      }\n      if (((bitField0_ & 0x00000004) == 0x00000004)) {\n        size += com.google.protobuf.GeneratedMessageV3.computeStringSize(4, uuid_);\n      }\n      if (((bitField0_ & 0x00000008) == 0x00000008)) {\n        size += com.google.protobuf.CodedOutputStream\n          .computeBytesSize(5, body_);\n      }\n      size += extensionsSerializedSize();\n      size += unknownFields.getSerializedSize();\n      memoizedSize = size;\n      return size;\n    }\n\n    private static final long serialVersionUID = 0L;\n    @java.lang.Override\n    public boolean equals(final java.lang.Object obj) {\n      if (obj == this) {\n       return true;\n      }\n      if (!(obj instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload)) {\n        return super.equals(obj);\n      }\n      org.eclipse.tahu.protobuf.SparkplugBProto.Payload other = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload) obj;\n\n      boolean result = true;\n      result = result && (hasTimestamp() == other.hasTimestamp());\n      if (hasTimestamp()) {\n        result = result && (getTimestamp()\n            == other.getTimestamp());\n      }\n      result = result && getMetricsList()\n          .equals(other.getMetricsList());\n      result = result && (hasSeq() == other.hasSeq());\n      if (hasSeq()) {\n        result = result && (getSeq()\n            == other.getSeq());\n      }\n      result = result && (hasUuid() == 
other.hasUuid());\n      if (hasUuid()) {\n        result = result && getUuid()\n            .equals(other.getUuid());\n      }\n      result = result && (hasBody() == other.hasBody());\n      if (hasBody()) {\n        result = result && getBody()\n            .equals(other.getBody());\n      }\n      result = result && unknownFields.equals(other.unknownFields);\n      result = result &&\n          getExtensionFields().equals(other.getExtensionFields());\n      return result;\n    }\n\n    @java.lang.Override\n    public int hashCode() {\n      if (memoizedHashCode != 0) {\n        return memoizedHashCode;\n      }\n      int hash = 41;\n      hash = (19 * hash) + getDescriptorForType().hashCode();\n      if (hasTimestamp()) {\n        hash = (37 * hash) + TIMESTAMP_FIELD_NUMBER;\n        hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n            getTimestamp());\n      }\n      if (getMetricsCount() > 0) {\n        hash = (37 * hash) + METRICS_FIELD_NUMBER;\n        hash = (53 * hash) + getMetricsList().hashCode();\n      }\n      if (hasSeq()) {\n        hash = (37 * hash) + SEQ_FIELD_NUMBER;\n        hash = (53 * hash) + com.google.protobuf.Internal.hashLong(\n            getSeq());\n      }\n      if (hasUuid()) {\n        hash = (37 * hash) + UUID_FIELD_NUMBER;\n        hash = (53 * hash) + getUuid().hashCode();\n      }\n      if (hasBody()) {\n        hash = (37 * hash) + BODY_FIELD_NUMBER;\n        hash = (53 * hash) + getBody().hashCode();\n      }\n      hash = hashFields(hash, getExtensionFields());\n      hash = (29 * hash) + unknownFields.hashCode();\n      memoizedHashCode = hash;\n      return hash;\n    }\n\n    public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload parseFrom(\n        com.google.protobuf.ByteString data)\n        throws com.google.protobuf.InvalidProtocolBufferException {\n      return PARSER.parseFrom(data);\n    }\n    public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload parseFrom(\n        
com.google.protobuf.ByteString data,\n        com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n        throws com.google.protobuf.InvalidProtocolBufferException {\n      return PARSER.parseFrom(data, extensionRegistry);\n    }\n    public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload parseFrom(byte[] data)\n        throws com.google.protobuf.InvalidProtocolBufferException {\n      return PARSER.parseFrom(data);\n    }\n    public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload parseFrom(\n        byte[] data,\n        com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n        throws com.google.protobuf.InvalidProtocolBufferException {\n      return PARSER.parseFrom(data, extensionRegistry);\n    }\n    public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload parseFrom(java.io.InputStream input)\n        throws java.io.IOException {\n      return com.google.protobuf.GeneratedMessageV3\n          .parseWithIOException(PARSER, input);\n    }\n    public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload parseFrom(\n        java.io.InputStream input,\n        com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n        throws java.io.IOException {\n      return com.google.protobuf.GeneratedMessageV3\n          .parseWithIOException(PARSER, input, extensionRegistry);\n    }\n    public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload parseDelimitedFrom(java.io.InputStream input)\n        throws java.io.IOException {\n      return com.google.protobuf.GeneratedMessageV3\n          .parseDelimitedWithIOException(PARSER, input);\n    }\n    public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload parseDelimitedFrom(\n        java.io.InputStream input,\n        com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n        throws java.io.IOException {\n      return com.google.protobuf.GeneratedMessageV3\n          .parseDelimitedWithIOException(PARSER, input, extensionRegistry);\n    }\n  
  public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload parseFrom(\n        com.google.protobuf.CodedInputStream input)\n        throws java.io.IOException {\n      return com.google.protobuf.GeneratedMessageV3\n          .parseWithIOException(PARSER, input);\n    }\n    public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload parseFrom(\n        com.google.protobuf.CodedInputStream input,\n        com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n        throws java.io.IOException {\n      return com.google.protobuf.GeneratedMessageV3\n          .parseWithIOException(PARSER, input, extensionRegistry);\n    }\n\n    public Builder newBuilderForType() { return newBuilder(); }\n    public static Builder newBuilder() {\n      return DEFAULT_INSTANCE.toBuilder();\n    }\n    public static Builder newBuilder(org.eclipse.tahu.protobuf.SparkplugBProto.Payload prototype) {\n      return DEFAULT_INSTANCE.toBuilder().mergeFrom(prototype);\n    }\n    public Builder toBuilder() {\n      return this == DEFAULT_INSTANCE\n          ? 
new Builder() : new Builder().mergeFrom(this);\n    }\n\n    @java.lang.Override\n    protected Builder newBuilderForType(\n        com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n      Builder builder = new Builder(parent);\n      return builder;\n    }\n    /**\n     * Protobuf type {@code org.eclipse.tahu.protobuf.Payload}\n     */\n    public static final class Builder extends\n        com.google.protobuf.GeneratedMessageV3.ExtendableBuilder<\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload, Builder> implements\n        // @@protoc_insertion_point(builder_implements:org.eclipse.tahu.protobuf.Payload)\n        org.eclipse.tahu.protobuf.SparkplugBProto.PayloadOrBuilder {\n      public static final com.google.protobuf.Descriptors.Descriptor\n          getDescriptor() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_descriptor;\n      }\n\n      protected com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n          internalGetFieldAccessorTable() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_fieldAccessorTable\n            .ensureFieldAccessorsInitialized(\n                org.eclipse.tahu.protobuf.SparkplugBProto.Payload.class, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Builder.class);\n      }\n\n      // Construct using org.eclipse.tahu.protobuf.SparkplugBProto.Payload.newBuilder()\n      private Builder() {\n        maybeForceBuilderInitialization();\n      }\n\n      private Builder(\n          com.google.protobuf.GeneratedMessageV3.BuilderParent parent) {\n        super(parent);\n        maybeForceBuilderInitialization();\n      }\n      private void maybeForceBuilderInitialization() {\n        if (com.google.protobuf.GeneratedMessageV3\n                .alwaysUseFieldBuilders) {\n          getMetricsFieldBuilder();\n        }\n      }\n      public Builder clear() {\n        super.clear();\n   
     timestamp_ = 0L;\n        bitField0_ = (bitField0_ & ~0x00000001);\n        if (metricsBuilder_ == null) {\n          metrics_ = java.util.Collections.emptyList();\n          bitField0_ = (bitField0_ & ~0x00000002);\n        } else {\n          metricsBuilder_.clear();\n        }\n        seq_ = 0L;\n        bitField0_ = (bitField0_ & ~0x00000004);\n        uuid_ = \"\";\n        bitField0_ = (bitField0_ & ~0x00000008);\n        body_ = com.google.protobuf.ByteString.EMPTY;\n        bitField0_ = (bitField0_ & ~0x00000010);\n        return this;\n      }\n\n      public com.google.protobuf.Descriptors.Descriptor\n          getDescriptorForType() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.internal_static_org_eclipse_tahu_protobuf_Payload_descriptor;\n      }\n\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload getDefaultInstanceForType() {\n        return org.eclipse.tahu.protobuf.SparkplugBProto.Payload.getDefaultInstance();\n      }\n\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload build() {\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload result = buildPartial();\n        if (!result.isInitialized()) {\n          throw newUninitializedMessageException(result);\n        }\n        return result;\n      }\n\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload buildPartial() {\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload result = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload(this);\n        int from_bitField0_ = bitField0_;\n        int to_bitField0_ = 0;\n        if (((from_bitField0_ & 0x00000001) == 0x00000001)) {\n          to_bitField0_ |= 0x00000001;\n        }\n        result.timestamp_ = timestamp_;\n        if (metricsBuilder_ == null) {\n          if (((bitField0_ & 0x00000002) == 0x00000002)) {\n            metrics_ = java.util.Collections.unmodifiableList(metrics_);\n            bitField0_ = (bitField0_ & ~0x00000002);\n          }\n          result.metrics_ = 
metrics_;\n        } else {\n          result.metrics_ = metricsBuilder_.build();\n        }\n        if (((from_bitField0_ & 0x00000004) == 0x00000004)) {\n          to_bitField0_ |= 0x00000002;\n        }\n        result.seq_ = seq_;\n        if (((from_bitField0_ & 0x00000008) == 0x00000008)) {\n          to_bitField0_ |= 0x00000004;\n        }\n        result.uuid_ = uuid_;\n        if (((from_bitField0_ & 0x00000010) == 0x00000010)) {\n          to_bitField0_ |= 0x00000008;\n        }\n        result.body_ = body_;\n        result.bitField0_ = to_bitField0_;\n        onBuilt();\n        return result;\n      }\n\n      public Builder clone() {\n        return (Builder) super.clone();\n      }\n      public Builder setField(\n          com.google.protobuf.Descriptors.FieldDescriptor field,\n          Object value) {\n        return (Builder) super.setField(field, value);\n      }\n      public Builder clearField(\n          com.google.protobuf.Descriptors.FieldDescriptor field) {\n        return (Builder) super.clearField(field);\n      }\n      public Builder clearOneof(\n          com.google.protobuf.Descriptors.OneofDescriptor oneof) {\n        return (Builder) super.clearOneof(oneof);\n      }\n      public Builder setRepeatedField(\n          com.google.protobuf.Descriptors.FieldDescriptor field,\n          int index, Object value) {\n        return (Builder) super.setRepeatedField(field, index, value);\n      }\n      public Builder addRepeatedField(\n          com.google.protobuf.Descriptors.FieldDescriptor field,\n          Object value) {\n        return (Builder) super.addRepeatedField(field, value);\n      }\n      public <Type> Builder setExtension(\n          com.google.protobuf.GeneratedMessage.GeneratedExtension<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload, Type> extension,\n          Type value) {\n        return (Builder) super.setExtension(extension, value);\n      }\n      public <Type> Builder setExtension(\n          
com.google.protobuf.GeneratedMessage.GeneratedExtension<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload, java.util.List<Type>> extension,\n          int index, Type value) {\n        return (Builder) super.setExtension(extension, index, value);\n      }\n      public <Type> Builder addExtension(\n          com.google.protobuf.GeneratedMessage.GeneratedExtension<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload, java.util.List<Type>> extension,\n          Type value) {\n        return (Builder) super.addExtension(extension, value);\n      }\n      public <Type> Builder clearExtension(\n          com.google.protobuf.GeneratedMessage.GeneratedExtension<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload, ?> extension) {\n        return (Builder) super.clearExtension(extension);\n      }\n      public Builder mergeFrom(com.google.protobuf.Message other) {\n        if (other instanceof org.eclipse.tahu.protobuf.SparkplugBProto.Payload) {\n          return mergeFrom((org.eclipse.tahu.protobuf.SparkplugBProto.Payload)other);\n        } else {\n          super.mergeFrom(other);\n          return this;\n        }\n      }\n\n      public Builder mergeFrom(org.eclipse.tahu.protobuf.SparkplugBProto.Payload other) {\n        if (other == org.eclipse.tahu.protobuf.SparkplugBProto.Payload.getDefaultInstance()) return this;\n        if (other.hasTimestamp()) {\n          setTimestamp(other.getTimestamp());\n        }\n        if (metricsBuilder_ == null) {\n          if (!other.metrics_.isEmpty()) {\n            if (metrics_.isEmpty()) {\n              metrics_ = other.metrics_;\n              bitField0_ = (bitField0_ & ~0x00000002);\n            } else {\n              ensureMetricsIsMutable();\n              metrics_.addAll(other.metrics_);\n            }\n            onChanged();\n          }\n        } else {\n          if (!other.metrics_.isEmpty()) {\n            if (metricsBuilder_.isEmpty()) {\n              
metricsBuilder_.dispose();\n              metricsBuilder_ = null;\n              metrics_ = other.metrics_;\n              bitField0_ = (bitField0_ & ~0x00000002);\n              metricsBuilder_ = \n                com.google.protobuf.GeneratedMessageV3.alwaysUseFieldBuilders ?\n                   getMetricsFieldBuilder() : null;\n            } else {\n              metricsBuilder_.addAllMessages(other.metrics_);\n            }\n          }\n        }\n        if (other.hasSeq()) {\n          setSeq(other.getSeq());\n        }\n        if (other.hasUuid()) {\n          bitField0_ |= 0x00000008;\n          uuid_ = other.uuid_;\n          onChanged();\n        }\n        if (other.hasBody()) {\n          setBody(other.getBody());\n        }\n        this.mergeExtensionFields(other);\n        this.mergeUnknownFields(other.unknownFields);\n        onChanged();\n        return this;\n      }\n\n      public final boolean isInitialized() {\n        for (int i = 0; i < getMetricsCount(); i++) {\n          if (!getMetrics(i).isInitialized()) {\n            return false;\n          }\n        }\n        if (!extensionsAreInitialized()) {\n          return false;\n        }\n        return true;\n      }\n\n      public Builder mergeFrom(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws java.io.IOException {\n        org.eclipse.tahu.protobuf.SparkplugBProto.Payload parsedMessage = null;\n        try {\n          parsedMessage = PARSER.parsePartialFrom(input, extensionRegistry);\n        } catch (com.google.protobuf.InvalidProtocolBufferException e) {\n          parsedMessage = (org.eclipse.tahu.protobuf.SparkplugBProto.Payload) e.getUnfinishedMessage();\n          throw e.unwrapIOException();\n        } finally {\n          if (parsedMessage != null) {\n            mergeFrom(parsedMessage);\n          }\n        }\n        return this;\n      }\n      private int bitField0_;\n\n  
    private long timestamp_ ;\n      /**\n       * <pre>\n       * Timestamp at message sending time\n       * </pre>\n       *\n       * <code>optional uint64 timestamp = 1;</code>\n       */\n      public boolean hasTimestamp() {\n        return ((bitField0_ & 0x00000001) == 0x00000001);\n      }\n      /**\n       * <pre>\n       * Timestamp at message sending time\n       * </pre>\n       *\n       * <code>optional uint64 timestamp = 1;</code>\n       */\n      public long getTimestamp() {\n        return timestamp_;\n      }\n      /**\n       * <pre>\n       * Timestamp at message sending time\n       * </pre>\n       *\n       * <code>optional uint64 timestamp = 1;</code>\n       */\n      public Builder setTimestamp(long value) {\n        bitField0_ |= 0x00000001;\n        timestamp_ = value;\n        onChanged();\n        return this;\n      }\n      /**\n       * <pre>\n       * Timestamp at message sending time\n       * </pre>\n       *\n       * <code>optional uint64 timestamp = 1;</code>\n       */\n      public Builder clearTimestamp() {\n        bitField0_ = (bitField0_ & ~0x00000001);\n        timestamp_ = 0L;\n        onChanged();\n        return this;\n      }\n\n      private java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric> metrics_ =\n        java.util.Collections.emptyList();\n      private void ensureMetricsIsMutable() {\n        if (!((bitField0_ & 0x00000002) == 0x00000002)) {\n          metrics_ = new java.util.ArrayList<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric>(metrics_);\n          bitField0_ |= 0x00000002;\n         }\n      }\n\n      private com.google.protobuf.RepeatedFieldBuilderV3<\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder> metricsBuilder_;\n\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * 
</pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric> getMetricsList() {\n        if (metricsBuilder_ == null) {\n          return java.util.Collections.unmodifiableList(metrics_);\n        } else {\n          return metricsBuilder_.getMessageList();\n        }\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public int getMetricsCount() {\n        if (metricsBuilder_ == null) {\n          return metrics_.size();\n        } else {\n          return metricsBuilder_.getCount();\n        }\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric getMetrics(int index) {\n        if (metricsBuilder_ == null) {\n          return metrics_.get(index);\n        } else {\n          return metricsBuilder_.getMessage(index);\n        }\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public Builder setMetrics(\n          int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric value) {\n        if (metricsBuilder_ == null) {\n          if (value == null) {\n            throw new NullPointerException();\n          }\n          ensureMetricsIsMutable();\n          metrics_.set(index, value);\n          onChanged();\n        } else {\n          metricsBuilder_.setMessage(index, value);\n        }\n        return this;\n      }\n      /**\n       * 
<pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public Builder setMetrics(\n          int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder builderForValue) {\n        if (metricsBuilder_ == null) {\n          ensureMetricsIsMutable();\n          metrics_.set(index, builderForValue.build());\n          onChanged();\n        } else {\n          metricsBuilder_.setMessage(index, builderForValue.build());\n        }\n        return this;\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public Builder addMetrics(org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric value) {\n        if (metricsBuilder_ == null) {\n          if (value == null) {\n            throw new NullPointerException();\n          }\n          ensureMetricsIsMutable();\n          metrics_.add(value);\n          onChanged();\n        } else {\n          metricsBuilder_.addMessage(value);\n        }\n        return this;\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public Builder addMetrics(\n          int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric value) {\n        if (metricsBuilder_ == null) {\n          if (value == null) {\n            throw new NullPointerException();\n          }\n          ensureMetricsIsMutable();\n          metrics_.add(index, value);\n          onChanged();\n        } else {\n          metricsBuilder_.addMessage(index, value);\n        }\n        return this;\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit 
in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public Builder addMetrics(\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder builderForValue) {\n        if (metricsBuilder_ == null) {\n          ensureMetricsIsMutable();\n          metrics_.add(builderForValue.build());\n          onChanged();\n        } else {\n          metricsBuilder_.addMessage(builderForValue.build());\n        }\n        return this;\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public Builder addMetrics(\n          int index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder builderForValue) {\n        if (metricsBuilder_ == null) {\n          ensureMetricsIsMutable();\n          metrics_.add(index, builderForValue.build());\n          onChanged();\n        } else {\n          metricsBuilder_.addMessage(index, builderForValue.build());\n        }\n        return this;\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public Builder addAllMetrics(\n          java.lang.Iterable<? 
extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric> values) {\n        if (metricsBuilder_ == null) {\n          ensureMetricsIsMutable();\n          com.google.protobuf.AbstractMessageLite.Builder.addAll(\n              values, metrics_);\n          onChanged();\n        } else {\n          metricsBuilder_.addAllMessages(values);\n        }\n        return this;\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public Builder clearMetrics() {\n        if (metricsBuilder_ == null) {\n          metrics_ = java.util.Collections.emptyList();\n          bitField0_ = (bitField0_ & ~0x00000002);\n          onChanged();\n        } else {\n          metricsBuilder_.clear();\n        }\n        return this;\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public Builder removeMetrics(int index) {\n        if (metricsBuilder_ == null) {\n          ensureMetricsIsMutable();\n          metrics_.remove(index);\n          onChanged();\n        } else {\n          metricsBuilder_.remove(index);\n        }\n        return this;\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder getMetricsBuilder(\n          int index) {\n        return getMetricsFieldBuilder().getBuilder(index);\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 
2;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder getMetricsOrBuilder(\n          int index) {\n        if (metricsBuilder_ == null) {\n          return metrics_.get(index);  } else {\n          return metricsBuilder_.getMessageOrBuilder(index);\n        }\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public java.util.List<? extends org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder> \n           getMetricsOrBuilderList() {\n        if (metricsBuilder_ != null) {\n          return metricsBuilder_.getMessageOrBuilderList();\n        } else {\n          return java.util.Collections.unmodifiableList(metrics_);\n        }\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder addMetricsBuilder() {\n        return getMetricsFieldBuilder().addBuilder(\n            org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.getDefaultInstance());\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated .org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder addMetricsBuilder(\n          int index) {\n        return getMetricsFieldBuilder().addBuilder(\n            index, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.getDefaultInstance());\n      }\n      /**\n       * <pre>\n       * Repeated forever - no limit in Google Protobufs\n       * </pre>\n       *\n       * <code>repeated 
.org.eclipse.tahu.protobuf.Payload.Metric metrics = 2;</code>\n       */\n      public java.util.List<org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder> \n           getMetricsBuilderList() {\n        return getMetricsFieldBuilder().getBuilderList();\n      }\n      private com.google.protobuf.RepeatedFieldBuilderV3<\n          org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder> \n          getMetricsFieldBuilder() {\n        if (metricsBuilder_ == null) {\n          metricsBuilder_ = new com.google.protobuf.RepeatedFieldBuilderV3<\n              org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.Metric.Builder, org.eclipse.tahu.protobuf.SparkplugBProto.Payload.MetricOrBuilder>(\n                  metrics_,\n                  ((bitField0_ & 0x00000002) == 0x00000002),\n                  getParentForChildren(),\n                  isClean());\n          metrics_ = null;\n        }\n        return metricsBuilder_;\n      }\n\n      private long seq_ ;\n      /**\n       * <pre>\n       * Sequence number\n       * </pre>\n       *\n       * <code>optional uint64 seq = 3;</code>\n       */\n      public boolean hasSeq() {\n        return ((bitField0_ & 0x00000004) == 0x00000004);\n      }\n      /**\n       * <pre>\n       * Sequence number\n       * </pre>\n       *\n       * <code>optional uint64 seq = 3;</code>\n       */\n      public long getSeq() {\n        return seq_;\n      }\n      /**\n       * <pre>\n       * Sequence number\n       * </pre>\n       *\n       * <code>optional uint64 seq = 3;</code>\n       */\n      public Builder setSeq(long value) {\n        bitField0_ |= 0x00000004;\n        seq_ = value;\n        onChanged();\n        return this;\n      }\n      /**\n       * <pre>\n       * Sequence number\n       * </pre>\n       *\n       * 
<code>optional uint64 seq = 3;</code>\n       */\n      public Builder clearSeq() {\n        bitField0_ = (bitField0_ & ~0x00000004);\n        seq_ = 0L;\n        onChanged();\n        return this;\n      }\n\n      private java.lang.Object uuid_ = \"\";\n      /**\n       * <pre>\n       * UUID to track message type in terms of schema definitions\n       * </pre>\n       *\n       * <code>optional string uuid = 4;</code>\n       */\n      public boolean hasUuid() {\n        return ((bitField0_ & 0x00000008) == 0x00000008);\n      }\n      /**\n       * <pre>\n       * UUID to track message type in terms of schema definitions\n       * </pre>\n       *\n       * <code>optional string uuid = 4;</code>\n       */\n      public java.lang.String getUuid() {\n        java.lang.Object ref = uuid_;\n        if (!(ref instanceof java.lang.String)) {\n          com.google.protobuf.ByteString bs =\n              (com.google.protobuf.ByteString) ref;\n          java.lang.String s = bs.toStringUtf8();\n          if (bs.isValidUtf8()) {\n            uuid_ = s;\n          }\n          return s;\n        } else {\n          return (java.lang.String) ref;\n        }\n      }\n      /**\n       * <pre>\n       * UUID to track message type in terms of schema definitions\n       * </pre>\n       *\n       * <code>optional string uuid = 4;</code>\n       */\n      public com.google.protobuf.ByteString\n          getUuidBytes() {\n        java.lang.Object ref = uuid_;\n        if (ref instanceof String) {\n          com.google.protobuf.ByteString b = \n              com.google.protobuf.ByteString.copyFromUtf8(\n                  (java.lang.String) ref);\n          uuid_ = b;\n          return b;\n        } else {\n          return (com.google.protobuf.ByteString) ref;\n        }\n      }\n      /**\n       * <pre>\n       * UUID to track message type in terms of schema definitions\n       * </pre>\n       *\n       * <code>optional string uuid = 4;</code>\n       */\n      public 
Builder setUuid(\n          java.lang.String value) {\n        if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000008;\n        uuid_ = value;\n        onChanged();\n        return this;\n      }\n      /**\n       * <pre>\n       * UUID to track message type in terms of schema definitions\n       * </pre>\n       *\n       * <code>optional string uuid = 4;</code>\n       */\n      public Builder clearUuid() {\n        bitField0_ = (bitField0_ & ~0x00000008);\n        uuid_ = getDefaultInstance().getUuid();\n        onChanged();\n        return this;\n      }\n      /**\n       * <pre>\n       * UUID to track message type in terms of schema definitions\n       * </pre>\n       *\n       * <code>optional string uuid = 4;</code>\n       */\n      public Builder setUuidBytes(\n          com.google.protobuf.ByteString value) {\n        if (value == null) {\n    throw new NullPointerException();\n  }\n  bitField0_ |= 0x00000008;\n        uuid_ = value;\n        onChanged();\n        return this;\n      }\n\n      private com.google.protobuf.ByteString body_ = com.google.protobuf.ByteString.EMPTY;\n      /**\n       * <pre>\n       * To optionally bypass the whole definition above\n       * </pre>\n       *\n       * <code>optional bytes body = 5;</code>\n       */\n      public boolean hasBody() {\n        return ((bitField0_ & 0x00000010) == 0x00000010);\n      }\n      /**\n       * <pre>\n       * To optionally bypass the whole definition above\n       * </pre>\n       *\n       * <code>optional bytes body = 5;</code>\n       */\n      public com.google.protobuf.ByteString getBody() {\n        return body_;\n      }\n      /**\n       * <pre>\n       * To optionally bypass the whole definition above\n       * </pre>\n       *\n       * <code>optional bytes body = 5;</code>\n       */\n      public Builder setBody(com.google.protobuf.ByteString value) {\n        if (value == null) {\n    throw new NullPointerException();\n  }\n  
bitField0_ |= 0x00000010;\n        body_ = value;\n        onChanged();\n        return this;\n      }\n      /**\n       * <pre>\n       * To optionally bypass the whole definition above\n       * </pre>\n       *\n       * <code>optional bytes body = 5;</code>\n       */\n      public Builder clearBody() {\n        bitField0_ = (bitField0_ & ~0x00000010);\n        body_ = getDefaultInstance().getBody();\n        onChanged();\n        return this;\n      }\n      public final Builder setUnknownFields(\n          final com.google.protobuf.UnknownFieldSet unknownFields) {\n        return super.setUnknownFields(unknownFields);\n      }\n\n      public final Builder mergeUnknownFields(\n          final com.google.protobuf.UnknownFieldSet unknownFields) {\n        return super.mergeUnknownFields(unknownFields);\n      }\n\n\n      // @@protoc_insertion_point(builder_scope:org.eclipse.tahu.protobuf.Payload)\n    }\n\n    // @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload)\n    private static final org.eclipse.tahu.protobuf.SparkplugBProto.Payload DEFAULT_INSTANCE;\n    static {\n      DEFAULT_INSTANCE = new org.eclipse.tahu.protobuf.SparkplugBProto.Payload();\n    }\n\n    public static org.eclipse.tahu.protobuf.SparkplugBProto.Payload getDefaultInstance() {\n      return DEFAULT_INSTANCE;\n    }\n\n    @java.lang.Deprecated public static final com.google.protobuf.Parser<Payload>\n        PARSER = new com.google.protobuf.AbstractParser<Payload>() {\n      public Payload parsePartialFrom(\n          com.google.protobuf.CodedInputStream input,\n          com.google.protobuf.ExtensionRegistryLite extensionRegistry)\n          throws com.google.protobuf.InvalidProtocolBufferException {\n          return new Payload(input, extensionRegistry);\n      }\n    };\n\n    public static com.google.protobuf.Parser<Payload> parser() {\n      return PARSER;\n    }\n\n    @java.lang.Override\n    public com.google.protobuf.Parser<Payload> getParserForType() {\n   
   return PARSER;\n    }\n\n    public org.eclipse.tahu.protobuf.SparkplugBProto.Payload getDefaultInstanceForType() {\n      return DEFAULT_INSTANCE;\n    }\n\n  }\n\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_Template_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_Template_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_descriptor;\n  private static final 
\n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_Row_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_Row_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_PropertySet_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_PropertySet_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    
internal_static_org_eclipse_tahu_protobuf_Payload_PropertySetList_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_PropertySetList_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_MetaData_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_MetaData_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_Metric_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_Metric_fieldAccessorTable;\n  private static final com.google.protobuf.Descriptors.Descriptor\n    internal_static_org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_descriptor;\n  private static final \n    com.google.protobuf.GeneratedMessageV3.FieldAccessorTable\n      internal_static_org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_fieldAccessorTable;\n\n  public static com.google.protobuf.Descriptors.FileDescriptor\n      getDescriptor() {\n    return descriptor;\n  }\n  private static  com.google.protobuf.Descriptors.FileDescriptor\n      descriptor;\n  static {\n    java.lang.String[] descriptorData = {\n      \"\\n\\035sparkplug_b/sparkplug_b.proto\\022\\031org.ecl\" +\n      \"ipse.tahu.protobuf\\\"\\356\\025\\n\\007Payload\\022\\021\\n\\ttimest\" +\n      \"amp\\030\\001 \\001(\\004\\022:\\n\\007metrics\\030\\002 \\003(\\0132).org.eclipse\" +\n      \".tahu.protobuf.Payload.Metric\\022\\013\\n\\003seq\\030\\003 \\001\" +\n      \"(\\004\\022\\014\\n\\004uuid\\030\\004 \\001(\\t\\022\\014\\n\\004body\\030\\005 \\001(\\014\\032\\246\\004\\n\\010Templ\" +\n      
\"ate\\022\\017\\n\\007version\\030\\001 \\001(\\t\\022:\\n\\007metrics\\030\\002 \\003(\\0132).\" +\n      \"org.eclipse.tahu.protobuf.Payload.Metric\" +\n      \"\\022I\\n\\nparameters\\030\\003 \\003(\\01325.org.eclipse.tahu.\" +\n      \"protobuf.Payload.Template.Parameter\\022\\024\\n\\014t\" +\n      \"emplate_ref\\030\\004 \\001(\\t\\022\\025\\n\\ris_definition\\030\\005 \\001(\\010\",\n      \"\\032\\312\\002\\n\\tParameter\\022\\014\\n\\004name\\030\\001 \\001(\\t\\022\\014\\n\\004type\\030\\002 \\001\" +\n      \"(\\r\\022\\023\\n\\tint_value\\030\\003 \\001(\\rH\\000\\022\\024\\n\\nlong_value\\030\\004 \" +\n      \"\\001(\\004H\\000\\022\\025\\n\\013float_value\\030\\005 \\001(\\002H\\000\\022\\026\\n\\014double_v\" +\n      \"alue\\030\\006 \\001(\\001H\\000\\022\\027\\n\\rboolean_value\\030\\007 \\001(\\010H\\000\\022\\026\\n\" +\n      \"\\014string_value\\030\\010 \\001(\\tH\\000\\022h\\n\\017extension_value\" +\n      \"\\030\\t \\001(\\0132M.org.eclipse.tahu.protobuf.Paylo\" +\n      \"ad.Template.Parameter.ParameterValueExte\" +\n      \"nsionH\\000\\032#\\n\\027ParameterValueExtension*\\010\\010\\001\\020\\200\" +\n      \"\\200\\200\\200\\002B\\007\\n\\005value*\\010\\010\\006\\020\\200\\200\\200\\200\\002\\032\\227\\004\\n\\007DataSet\\022\\026\\n\\016n\" +\n      \"um_of_columns\\030\\001 \\001(\\004\\022\\017\\n\\007columns\\030\\002 \\003(\\t\\022\\r\\n\\005\",\n      \"types\\030\\003 \\003(\\r\\022<\\n\\004rows\\030\\004 \\003(\\0132..org.eclipse.\" +\n      \"tahu.protobuf.Payload.DataSet.Row\\032\\257\\002\\n\\014Da\" +\n      \"taSetValue\\022\\023\\n\\tint_value\\030\\001 \\001(\\rH\\000\\022\\024\\n\\nlong_\" +\n      \"value\\030\\002 \\001(\\004H\\000\\022\\025\\n\\013float_value\\030\\003 \\001(\\002H\\000\\022\\026\\n\\014\" +\n      \"double_value\\030\\004 \\001(\\001H\\000\\022\\027\\n\\rboolean_value\\030\\005 \" +\n      \"\\001(\\010H\\000\\022\\026\\n\\014string_value\\030\\006 \\001(\\tH\\000\\022h\\n\\017extensi\" +\n    
  \"on_value\\030\\007 \\001(\\0132M.org.eclipse.tahu.protob\" +\n      \"uf.Payload.DataSet.DataSetValue.DataSetV\" +\n      \"alueExtensionH\\000\\032!\\n\\025DataSetValueExtension\" +\n      \"*\\010\\010\\001\\020\\200\\200\\200\\200\\002B\\007\\n\\005value\\032Z\\n\\003Row\\022I\\n\\010elements\\030\\001\",\n      \" \\003(\\01327.org.eclipse.tahu.protobuf.Payload\" +\n      \".DataSet.DataSetValue*\\010\\010\\002\\020\\200\\200\\200\\200\\002*\\010\\010\\005\\020\\200\\200\\200\\200\" +\n      \"\\002\\032\\351\\003\\n\\rPropertyValue\\022\\014\\n\\004type\\030\\001 \\001(\\r\\022\\017\\n\\007is_\" +\n      \"null\\030\\002 \\001(\\010\\022\\023\\n\\tint_value\\030\\003 \\001(\\rH\\000\\022\\024\\n\\nlong_\" +\n      \"value\\030\\004 \\001(\\004H\\000\\022\\025\\n\\013float_value\\030\\005 \\001(\\002H\\000\\022\\026\\n\\014\" +\n      \"double_value\\030\\006 \\001(\\001H\\000\\022\\027\\n\\rboolean_value\\030\\007 \" +\n      \"\\001(\\010H\\000\\022\\026\\n\\014string_value\\030\\010 \\001(\\tH\\000\\022K\\n\\021propert\" +\n      \"yset_value\\030\\t \\001(\\0132..org.eclipse.tahu.prot\" +\n      \"obuf.Payload.PropertySetH\\000\\022P\\n\\022propertyse\" +\n      \"ts_value\\030\\n \\001(\\01322.org.eclipse.tahu.protob\",\n      \"uf.Payload.PropertySetListH\\000\\022b\\n\\017extensio\" +\n      \"n_value\\030\\013 \\001(\\0132G.org.eclipse.tahu.protobu\" +\n      \"f.Payload.PropertyValue.PropertyValueExt\" +\n      \"ensionH\\000\\032\\\"\\n\\026PropertyValueExtension*\\010\\010\\001\\020\\200\" +\n      \"\\200\\200\\200\\002B\\007\\n\\005value\\032g\\n\\013PropertySet\\022\\014\\n\\004keys\\030\\001 \\003\" +\n      \"(\\t\\022@\\n\\006values\\030\\002 \\003(\\01320.org.eclipse.tahu.pr\" +\n      \"otobuf.Payload.PropertyValue*\\010\\010\\003\\020\\200\\200\\200\\200\\002\\032`\" +\n      \"\\n\\017PropertySetList\\022C\\n\\013propertyset\\030\\001 \\003(\\0132.\" +\n      \".org.eclipse.tahu.protobuf.Payload.Prope\" +\n      
\"rtySet*\\010\\010\\002\\020\\200\\200\\200\\200\\002\\032\\244\\001\\n\\010MetaData\\022\\025\\n\\ris_mult\",\n      \"i_part\\030\\001 \\001(\\010\\022\\024\\n\\014content_type\\030\\002 \\001(\\t\\022\\014\\n\\004si\" +\n      \"ze\\030\\003 \\001(\\004\\022\\013\\n\\003seq\\030\\004 \\001(\\004\\022\\021\\n\\tfile_name\\030\\005 \\001(\\t\" +\n      \"\\022\\021\\n\\tfile_type\\030\\006 \\001(\\t\\022\\013\\n\\003md5\\030\\007 \\001(\\t\\022\\023\\n\\013desc\" +\n      \"ription\\030\\010 \\001(\\t*\\010\\010\\t\\020\\200\\200\\200\\200\\002\\032\\277\\005\\n\\006Metric\\022\\014\\n\\004na\" +\n      \"me\\030\\001 \\001(\\t\\022\\r\\n\\005alias\\030\\002 \\001(\\004\\022\\021\\n\\ttimestamp\\030\\003 \\001\" +\n      \"(\\004\\022\\020\\n\\010datatype\\030\\004 \\001(\\r\\022\\025\\n\\ris_historical\\030\\005 \" +\n      \"\\001(\\010\\022\\024\\n\\014is_transient\\030\\006 \\001(\\010\\022\\017\\n\\007is_null\\030\\007 \\001\" +\n      \"(\\010\\022=\\n\\010metadata\\030\\010 \\001(\\0132+.org.eclipse.tahu.\" +\n      \"protobuf.Payload.MetaData\\022B\\n\\nproperties\\030\" +\n      \"\\t \\001(\\0132..org.eclipse.tahu.protobuf.Payloa\",\n      \"d.PropertySet\\022\\023\\n\\tint_value\\030\\n \\001(\\rH\\000\\022\\024\\n\\nlo\" +\n      \"ng_value\\030\\013 \\001(\\004H\\000\\022\\025\\n\\013float_value\\030\\014 \\001(\\002H\\000\\022\" +\n      \"\\026\\n\\014double_value\\030\\r \\001(\\001H\\000\\022\\027\\n\\rboolean_value\" +\n      \"\\030\\016 \\001(\\010H\\000\\022\\026\\n\\014string_value\\030\\017 \\001(\\tH\\000\\022\\025\\n\\013byte\" +\n      \"s_value\\030\\020 \\001(\\014H\\000\\022C\\n\\rdataset_value\\030\\021 \\001(\\0132*\" +\n      \".org.eclipse.tahu.protobuf.Payload.DataS\" +\n      \"etH\\000\\022E\\n\\016template_value\\030\\022 \\001(\\0132+.org.eclip\" +\n      \"se.tahu.protobuf.Payload.TemplateH\\000\\022Y\\n\\017e\" +\n      \"xtension_value\\030\\023 \\001(\\0132>.org.eclipse.tahu.\" +\n      
\"protobuf.Payload.Metric.MetricValueExten\",\n      \"sionH\\000\\032 \\n\\024MetricValueExtension*\\010\\010\\001\\020\\200\\200\\200\\200\\002\" +\n      \"B\\007\\n\\005value*\\010\\010\\006\\020\\200\\200\\200\\200\\002*\\362\\003\\n\\010DataType\\022\\013\\n\\007Unkn\" +\n      \"own\\020\\000\\022\\010\\n\\004Int8\\020\\001\\022\\t\\n\\005Int16\\020\\002\\022\\t\\n\\005Int32\\020\\003\\022\\t\\n\" +\n      \"\\005Int64\\020\\004\\022\\t\\n\\005UInt8\\020\\005\\022\\n\\n\\006UInt16\\020\\006\\022\\n\\n\\006UInt3\" +\n      \"2\\020\\007\\022\\n\\n\\006UInt64\\020\\010\\022\\t\\n\\005Float\\020\\t\\022\\n\\n\\006Double\\020\\n\\022\\013\" +\n      \"\\n\\007Boolean\\020\\013\\022\\n\\n\\006String\\020\\014\\022\\014\\n\\010DateTime\\020\\r\\022\\010\\n\" +\n      \"\\004Text\\020\\016\\022\\010\\n\\004UUID\\020\\017\\022\\013\\n\\007DataSet\\020\\020\\022\\t\\n\\005Bytes\\020\" +\n      \"\\021\\022\\010\\n\\004File\\020\\022\\022\\014\\n\\010Template\\020\\023\\022\\017\\n\\013PropertySet\" +\n      \"\\020\\024\\022\\023\\n\\017PropertySetList\\020\\025\\022\\r\\n\\tInt8Array\\020\\026\\022\\016\" +\n      \"\\n\\nInt16Array\\020\\027\\022\\016\\n\\nInt32Array\\020\\030\\022\\016\\n\\nInt64A\",\n      \"rray\\020\\031\\022\\016\\n\\nUInt8Array\\020\\032\\022\\017\\n\\013UInt16Array\\020\\033\\022\" +\n      \"\\017\\n\\013UInt32Array\\020\\034\\022\\017\\n\\013UInt64Array\\020\\035\\022\\016\\n\\nFlo\" +\n      \"atArray\\020\\036\\022\\017\\n\\013DoubleArray\\020\\037\\022\\020\\n\\014BooleanArr\" +\n      \"ay\\020 \\022\\017\\n\\013StringArray\\020!\\022\\021\\n\\rDateTimeArray\\020\\\"\" +\n      \"B,\\n\\031org.eclipse.tahu.protobufB\\017Sparkplug\" +\n      \"BProto\"\n    };\n    com.google.protobuf.Descriptors.FileDescriptor.InternalDescriptorAssigner assigner =\n        new com.google.protobuf.Descriptors.FileDescriptor.    
InternalDescriptorAssigner() {\n          public com.google.protobuf.ExtensionRegistry assignDescriptors(\n              com.google.protobuf.Descriptors.FileDescriptor root) {\n            descriptor = root;\n            return null;\n          }\n        };\n    com.google.protobuf.Descriptors.FileDescriptor\n      .internalBuildGeneratedFileFrom(descriptorData,\n        new com.google.protobuf.Descriptors.FileDescriptor[] {\n        }, assigner);\n    internal_static_org_eclipse_tahu_protobuf_Payload_descriptor =\n      getDescriptor().getMessageTypes().get(0);\n    internal_static_org_eclipse_tahu_protobuf_Payload_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_descriptor,\n        new java.lang.String[] { \"Timestamp\", \"Metrics\", \"Seq\", \"Uuid\", \"Body\", });\n    internal_static_org_eclipse_tahu_protobuf_Payload_Template_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_descriptor.getNestedTypes().get(0);\n    internal_static_org_eclipse_tahu_protobuf_Payload_Template_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_Template_descriptor,\n        new java.lang.String[] { \"Version\", \"Metrics\", \"Parameters\", \"TemplateRef\", \"IsDefinition\", });\n    internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_Template_descriptor.getNestedTypes().get(0);\n    internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_descriptor,\n        new java.lang.String[] { \"Name\", \"Type\", \"IntValue\", \"LongValue\", \"FloatValue\", \"DoubleValue\", \"BooleanValue\", \"StringValue\", 
\"ExtensionValue\", \"Value\", });\n    internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_descriptor.getNestedTypes().get(0);\n    internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_Template_Parameter_ParameterValueExtension_descriptor,\n        new java.lang.String[] { });\n    internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_descriptor.getNestedTypes().get(1);\n    internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_descriptor,\n        new java.lang.String[] { \"NumOfColumns\", \"Columns\", \"Types\", \"Rows\", });\n    internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_descriptor.getNestedTypes().get(0);\n    internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_descriptor,\n        new java.lang.String[] { \"IntValue\", \"LongValue\", \"FloatValue\", \"DoubleValue\", \"BooleanValue\", \"StringValue\", \"ExtensionValue\", \"Value\", });\n    internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_descriptor.getNestedTypes().get(0);\n    
internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_DataSetValue_DataSetValueExtension_descriptor,\n        new java.lang.String[] { });\n    internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_Row_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_descriptor.getNestedTypes().get(1);\n    internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_Row_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_DataSet_Row_descriptor,\n        new java.lang.String[] { \"Elements\", });\n    internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_descriptor.getNestedTypes().get(2);\n    internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_descriptor,\n        new java.lang.String[] { \"Type\", \"IsNull\", \"IntValue\", \"LongValue\", \"FloatValue\", \"DoubleValue\", \"BooleanValue\", \"StringValue\", \"PropertysetValue\", \"PropertysetsValue\", \"ExtensionValue\", \"Value\", });\n    internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_descriptor.getNestedTypes().get(0);\n    internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_PropertyValue_PropertyValueExtension_descriptor,\n        new java.lang.String[] { 
});\n    internal_static_org_eclipse_tahu_protobuf_Payload_PropertySet_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_descriptor.getNestedTypes().get(3);\n    internal_static_org_eclipse_tahu_protobuf_Payload_PropertySet_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_PropertySet_descriptor,\n        new java.lang.String[] { \"Keys\", \"Values\", });\n    internal_static_org_eclipse_tahu_protobuf_Payload_PropertySetList_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_descriptor.getNestedTypes().get(4);\n    internal_static_org_eclipse_tahu_protobuf_Payload_PropertySetList_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_PropertySetList_descriptor,\n        new java.lang.String[] { \"Propertyset\", });\n    internal_static_org_eclipse_tahu_protobuf_Payload_MetaData_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_descriptor.getNestedTypes().get(5);\n    internal_static_org_eclipse_tahu_protobuf_Payload_MetaData_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_MetaData_descriptor,\n        new java.lang.String[] { \"IsMultiPart\", \"ContentType\", \"Size\", \"Seq\", \"FileName\", \"FileType\", \"Md5\", \"Description\", });\n    internal_static_org_eclipse_tahu_protobuf_Payload_Metric_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_descriptor.getNestedTypes().get(6);\n    internal_static_org_eclipse_tahu_protobuf_Payload_Metric_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_Metric_descriptor,\n        new java.lang.String[] { \"Name\", \"Alias\", \"Timestamp\", \"Datatype\", 
\"IsHistorical\", \"IsTransient\", \"IsNull\", \"Metadata\", \"Properties\", \"IntValue\", \"LongValue\", \"FloatValue\", \"DoubleValue\", \"BooleanValue\", \"StringValue\", \"BytesValue\", \"DatasetValue\", \"TemplateValue\", \"ExtensionValue\", \"Value\", });\n    internal_static_org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_descriptor =\n      internal_static_org_eclipse_tahu_protobuf_Payload_Metric_descriptor.getNestedTypes().get(0);\n    internal_static_org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_fieldAccessorTable = new\n      com.google.protobuf.GeneratedMessageV3.FieldAccessorTable(\n        internal_static_org_eclipse_tahu_protobuf_Payload_Metric_MetricValueExtension_descriptor,\n        new java.lang.String[] { });\n  }\n\n  // @@protoc_insertion_point(outer_class_scope)\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/util/CompressionAlgorithm.java",
"content": "/********************************************************************************\n * Copyright (c) 2017-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.util;\n\nimport java.util.Locale;\n\n/**\n * An enumeration of supported payload compression algorithms\n */\npublic enum CompressionAlgorithm {\n\n\tGZIP,\n\tDEFLATE;\n\n\t/**\n\t * Parses a case-insensitive algorithm name into a {@link CompressionAlgorithm}.\n\t *\n\t * @param algorithm the algorithm name (e.g. \"gzip\" or \"DEFLATE\")\n\t * @return the matching {@link CompressionAlgorithm}\n\t * @throws IllegalArgumentException if the name does not match a supported algorithm\n\t */\n\tpublic static CompressionAlgorithm parse(String algorithm) {\n\t\treturn CompressionAlgorithm.valueOf(algorithm.toUpperCase(Locale.ROOT));\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/util/GZipUtil.java",
"content": "/********************************************************************************\n * Copyright (c) 2018-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.util;\n\nimport java.io.ByteArrayInputStream;\nimport java.io.ByteArrayOutputStream;\nimport java.util.zip.GZIPInputStream;\nimport java.util.zip.GZIPOutputStream;\n\n/**\n * Utilities for GZIP compression and decompression of byte arrays.\n */\npublic class GZipUtil {\n\n\tprivate static final int BUFFER_SIZE = 1024;\n\n\t/**\n\t * Decompresses a GZIP-compressed byte array.\n\t *\n\t * @param compressedData the compressed data\n\t * @return the decompressed data\n\t */\n\tpublic static byte[] decompress(byte[] compressedData) throws Exception {\n\t\tByteArrayOutputStream baos = new ByteArrayOutputStream();\n\t\ttry (GZIPInputStream gis = new GZIPInputStream(new ByteArrayInputStream(compressedData))) {\n\t\t\tbyte[] buffer = new byte[BUFFER_SIZE];\n\t\t\tint bytesRead;\n\t\t\twhile ((bytesRead = gis.read(buffer, 0, BUFFER_SIZE)) != -1) {\n\t\t\t\tbaos.write(buffer, 0, bytesRead);\n\t\t\t}\n\t\t}\n\t\treturn baos.toByteArray();\n\t}\n\n\t/**\n\t * Compresses a byte array using GZIP.\n\t *\n\t * @param uncompressedData the uncompressed data\n\t * @return the compressed data\n\t */\n\tpublic static byte[] compress(byte[] uncompressedData) throws Exception {\n\t\tByteArrayOutputStream baos = new ByteArrayOutputStream();\n\t\ttry (GZIPOutputStream gos = new GZIPOutputStream(baos)) {\n\t\t\tgos.write(uncompressedData, 0, 
uncompressedData.length);\n\t\t\tgos.finish();\n\t\t}\n\t\treturn baos.toByteArray();\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/util/MessageUtil.java",
    "content": "/********************************************************************************\n * Copyright (c) 2017-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.util;\n\nimport java.io.IOException;\nimport java.text.DateFormat;\nimport java.util.ArrayList;\nimport java.util.Collection;\nimport java.util.Date;\nimport java.util.List;\n\nimport org.eclipse.tahu.SparkplugException;\nimport org.eclipse.tahu.json.DeserializerModifier;\nimport org.eclipse.tahu.json.DeserializerModule;\nimport org.eclipse.tahu.message.model.Message;\nimport org.eclipse.tahu.message.model.Message.MessageBuilder;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.message.model.Topic;\n\nimport com.fasterxml.jackson.core.JsonParseException;\nimport com.fasterxml.jackson.core.JsonProcessingException;\nimport com.fasterxml.jackson.databind.JsonMappingException;\nimport com.fasterxml.jackson.databind.ObjectMapper;\n\n/**\n * Utilities for Sparkplug Message handling.\n */\npublic class MessageUtil {\n\tprivate static final String TEN_THOUSAND_BYTE_STRING =\n\t\t\t\"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ 
\"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ 
\"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ 
\"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ 
\"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ 
\"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ 
\"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ \"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\"\n\t\t\t\t\t+ 
\"0123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789\";\n\n\t/**\n\t * Serializes a {@link Message} instance in to a JSON string.\n\t * \n\t * @param message a {@link Message} instance\n\t * @return a JSON string\n\t * @throws JsonProcessingException\n\t */\n\tpublic static String toJsonString(Message message) throws JsonProcessingException {\n\t\tObjectMapper mapper = new ObjectMapper();\n\t\treturn mapper.writeValueAsString(message);\n\t}\n\n\t/**\n\t * Serializes a {@link Message} instance in to a JSON string.\n\t * \n\t * @param message a {@link Message} instance\n\t * @param dateFormat a {@link DateFormat} to use for all {@link Date} Objects\n\t * @return a JSON string\n\t * @throws JsonProcessingException\n\t */\n\tpublic static String toJsonString(Message message, DateFormat dateFormat) throws JsonProcessingException {\n\t\tObjectMapper mapper = new ObjectMapper();\n\t\tmapper.setDateFormat(dateFormat);\n\t\treturn mapper.writeValueAsString(message);\n\t}\n\n\t/**\n\t * Deserializes a JSON string into a {@link Message} instance.\n\t * \n\t * @param payload a JSON string\n\t * @return a {@link Message} instance\n\t * @throws JsonProcessingException\n\t */\n\tpublic static Message fromJsonString(String jsonString)\n\t\t\tthrows JsonParseException, JsonMappingException, IOException {\n\t\treturn fromJsonString(jsonString, false);\n\t}\n\n\t/**\n\t * Deserializes a JSON string into a {@link Message} instance.\n\t *\n\t * @param payload a JSON string\n\t * @param excludeSeqNum a boolean flag denoting whether or not to exclude the seq number from the payload\n\t * @return a {@link Message} instance\n\t * @throws JsonProcessingException\n\t */\n\tpublic static Message fromJsonString(String jsonString, boolean excludeSeqNum)\n\t\t\tthrows JsonParseException, JsonMappingException, IOException {\n\t\tObjectMapper mapper = new ObjectMapper();\n\t\tmapper.registerModule(new DeserializerModule(new 
DeserializerModifier()));\n\t\tMessage message = mapper.readValue(jsonString, Message.class);\n\t\tif (excludeSeqNum) {\n\t\t\tmessage.getPayload().setSeq(null);\n\t\t\treturn message;\n\t\t} else {\n\t\t\treturn message;\n\t\t}\n\t}\n\n\t/**\n\t * Divides a {@link Message} instance into one or more instances based on the maximum JSON encoded size. This method\n\t * does not first check whether the passed-in message is already under the maximum size; it always divides it.\n\t * \n\t * @param message the {@link Message} instance to divide\n\t * @param maxBytes the maximum bytes per {@link Message} instance\n\t * @return a {@link Collection} of {@link Message} instances\n\t * @throws SparkplugException\n\t * @throws JsonProcessingException\n\t */\n\tpublic static Collection<Message> divideJsonMessageByBytes(Message message, int maxBytes)\n\t\t\tthrows SparkplugException, JsonProcessingException {\n\t\tCollection<Message> messages = new ArrayList<Message>();\n\t\tdivideAndAddMessages(messages, message, maxBytes);\n\t\treturn messages;\n\t}\n\n\t/*\n\t * Recursively divides {@link Message} instances and adds them to the {@link Collection} once they are under the\n\t * maximum size.\n\t */\n\tprivate static void divideAndAddMessages(Collection<Message> messages, Message message, int maxBytes)\n\t\t\tthrows SparkplugException, JsonProcessingException {\n\t\tTopic topic = message.getTopic();\n\t\tSparkplugBPayload payload = message.getPayload();\n\t\tList<Metric> metrics = payload.getMetrics();\n\t\tfinal int metricCount = message.getPayload().getMetricCount();\n\t\tfinal int size = toJsonString(message).getBytes().length;\n\n\t\t// Check if the message can be divided\n\t\tif (metricCount <= 1) {\n\t\t\tString errorMessage = null;\n\t\t\tif (metricCount == 1) {\n\t\t\t\terrorMessage = \"Cannot divide SparkplugBPayload with one metric: \"\n\t\t\t\t\t\t+ message.getPayload().getMetrics().get(0).getName();\n\t\t\t} else {\n\t\t\t\terrorMessage = \"Cannot divide SparkplugBPayload with \" + metricCount + \" 
metrics\";\n\t\t\t}\n\t\t\tthrow new SparkplugException(errorMessage);\n\t\t}\n\n\t\tint newMessageCount = size / maxBytes + ((size % maxBytes > 0) ? 1 : 0);\n\t\tint metricsPerMessageCount = metricCount / newMessageCount + ((metricCount % newMessageCount > 0) ? 1 : 0);\n\t\tint index = 0;\n\n\t\twhile (index < metricCount) {\n\t\t\tint toIndex = metricCount < (index + metricsPerMessageCount) ? metricCount : index + metricsPerMessageCount;\n\t\t\t// build a new Message with the payload containing the next subset (count) of metrics\n\t\t\tMessage newMessage = new MessageBuilder(topic,\n\t\t\t\t\tnew SparkplugBPayloadBuilder().setTimestamp(payload.getTimestamp()).setUuid(payload.getUuid())\n\t\t\t\t\t\t\t.setSeq(payload.getSeq()).addMetrics(new ArrayList<Metric>(metrics.subList(index, toIndex)))\n\t\t\t\t\t\t\t.createPayload()).build();\n\t\t\tString jsonMessage = toJsonString(newMessage);\n\t\t\tif (jsonMessage.getBytes().length < maxBytes) {\n\t\t\t\tmessages.add(newMessage);\n\t\t\t} else {\n\t\t\t\tdivideAndAddMessages(messages, newMessage, maxBytes);\n\t\t\t}\n\t\t\tindex += metricsPerMessageCount;\n\t\t}\n\t}\n\n//\tpublic static void main(String[] args) throws Exception {\n//\t\tList<Metric> metrics = new ArrayList<Metric>();\n//\t\tRandom random = new Random();\n//\t\t// int numOfMetrics = random.nextInt(10) + 2;\n//\t\tint numOfMetrics = 1;\n//\t\tfor (int i = 0; i < numOfMetrics; i++) {\n//\t\t\t// int r = random.nextInt(10) + 1;\n//\t\t\tint r = 120;\n//\t\t\tString str = \"\";\n//\t\t\tfor (int j = 0; j < r; j++) {\n//\t\t\t\tstr = str + TEN_THOUSAND_BYTE_STRING;\n//\t\t\t}\n//\t\t\tSystem.out.println(\"new string: \" + str.getBytes().length + \", s\" + i);\n//\t\t\tmetrics.add(new MetricBuilder(\"s\" + i, MetricDataType.String, \"s\" + i + \"-\" + str).createMetric());\n//\t\t}\n//\n//\t\tMessage message =\n//\t\t\t\tnew MessageBuilder().topic(TopicUtil.parseTopic(\"spBv1.0/Group/DBIRTH/My Test Edge/My Device\"))\n//\t\t\t\t\t\t.payload(new 
SparkplugBPayloadBuilder().setTimestamp(new Date()).setUuid(newUUID()).setSeq(1L)\n//\t\t\t\t\t\t\t\t.addMetrics(metrics).createPayload())\n//\t\t\t\t\t\t.build();\n//\n//\t\tSystem.out.println(\"message size: \" + toJsonString(message).getBytes().length);\n//\n//\t\tCollection<Message> messages = divideJsonMessageByBytes(message, 11000);\n//\t\tSystem.out.println(\"new messages: \" + messages.size());\n//\n//\t\tfor (Message msg : messages) {\n//\t\t\tSystem.out.println(\" message contains \" + msg.getPayload().getMetricCount() + \" metrics\");\n//\t\t\tfor (Metric metric : msg.getPayload().getMetrics()) {\n//\t\t\t\tSystem.out.println(\"  message: \" + metric.getValue().toString().substring(0, 10) + \"...\");\n//\t\t\t}\n//\t\t}\n//\t}\n//\n//\tprivate static String newUUID() {\n//\t\treturn java.util.UUID.randomUUID().toString().substring(0, 8);\n//\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/util/PayloadUtil.java",
    "content": "/********************************************************************************\n * Copyright (c) 2017-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.util;\n\nimport java.io.ByteArrayOutputStream;\nimport java.io.IOException;\nimport java.util.List;\nimport java.util.zip.DataFormatException;\nimport java.util.zip.Deflater;\nimport java.util.zip.Inflater;\n\nimport org.eclipse.tahu.SparkplugException;\nimport org.eclipse.tahu.json.DeserializerModifier;\nimport org.eclipse.tahu.json.DeserializerModule;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.model.MetricDataTypeMap;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\nimport com.fasterxml.jackson.core.JsonParseException;\nimport com.fasterxml.jackson.core.JsonProcessingException;\nimport com.fasterxml.jackson.databind.JsonMappingException;\nimport com.fasterxml.jackson.databind.ObjectMapper;\n\n/**\n * Utilities for Sparkplug Payload handling.\n */\npublic class PayloadUtil {\n\n\tprivate static final Logger logger = LoggerFactory.getLogger(PayloadUtil.class.getName());\n\n\tpublic static final String UUID_COMPRESSED = \"SPBV1.0_COMPRESSED\";\n\n\tpublic static final String METRIC_ALGORITHM 
= \"algorithm\";\n\n\t/**\n\t * Serializes a {@link SparkplugBPayload} instance into a JSON string.\n\t * \n\t * @param payload a {@link SparkplugBPayload} instance\n\t * @return a JSON string\n\t * @throws JsonProcessingException\n\t */\n\tpublic static String toJsonString(SparkplugBPayload payload) throws JsonProcessingException {\n\t\tObjectMapper mapper = new ObjectMapper();\n\t\treturn mapper.writeValueAsString(payload);\n\t}\n\n\t/**\n\t * Deserializes a JSON string into a {@link SparkplugBPayload} instance.\n\t * \n\t * @param jsonString a JSON string\n\t * @return a {@link SparkplugBPayload} instance\n\t * @throws JsonParseException\n\t * @throws JsonMappingException\n\t * @throws IOException\n\t */\n\tpublic static SparkplugBPayload fromJsonString(String jsonString)\n\t\t\tthrows JsonParseException, JsonMappingException, IOException {\n\t\tObjectMapper mapper = new ObjectMapper();\n\t\tmapper.registerModule(new DeserializerModule(new DeserializerModifier()));\n\t\treturn mapper.readValue(jsonString, SparkplugBPayload.class);\n\t}\n\n\t/**\n\t * Returns a decompressed {@link SparkplugBPayload} instance from an existing payload. 
Will return the original\n\t * payload if no compressed payload exists.\n\t * \n\t * @param payload the {@link SparkplugBPayload} to decompress\n\t * @param metricDataTypeMap the {@link MetricDataTypeMap} used to decode the decompressed bytes\n\t * @return the decompressed {@link SparkplugBPayload}\n\t * @throws Exception\n\t */\n\tpublic static SparkplugBPayload decompress(SparkplugBPayload payload, MetricDataTypeMap metricDataTypeMap)\n\t\t\tthrows Exception {\n\t\tif (UUID_COMPRESSED.equals(payload.getUuid())) {\n\t\t\tlogger.trace(\"Decompressing payload\");\n\t\t\tSparkplugBPayloadDecoder decoder = new SparkplugBPayloadDecoder();\n\t\t\tCompressionConfig config = new CompressionConfig();\n\t\t\tbyte[] decompressedBytes;\n\t\t\tList<Metric> metrics = payload.getMetrics();\n\n\t\t\tif (metrics != null && !metrics.isEmpty()) {\n\t\t\t\tfor (Metric metric : metrics) {\n\t\t\t\t\tif (metric.getName().equals(METRIC_ALGORITHM)) {\n\t\t\t\t\t\tconfig.setAlgorithm(CompressionAlgorithm.valueOf(metric.getValue().toString()));\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tswitch (config.getAlgorithm()) {\n\t\t\t\tcase GZIP:\n\t\t\t\t\tdecompressedBytes = GZipUtil.decompress(payload.getBody());\n\t\t\t\t\tbreak;\n\t\t\t\tcase DEFLATE:\n\t\t\t\t\tdecompressedBytes = inflateBytes(payload.getBody());\n\t\t\t\t\tbreak;\n\t\t\t\tdefault:\n\t\t\t\t\tthrow new SparkplugException(\"Unknown or unsupported algorithm \" + config.getAlgorithm());\n\t\t\t}\n\n\t\t\t// Decode bytes and return\n\t\t\treturn decoder.buildFromByteArray(decompressedBytes, metricDataTypeMap);\n\t\t} else {\n\t\t\tlogger.trace(\"Not decompressing payload\");\n\t\t\treturn payload;\n\t\t}\n\t}\n\n\t/**\n\t * Compresses a {@link SparkplugBPayload} using the DEFLATE algorithm.\n\t * \n\t * @param payload the {@link SparkplugBPayload} to compress\n\t * @param stripDataTypes whether or not to strip the metric data types while encoding\n\t * @return a compressed {@link SparkplugBPayload}\n\t * @throws IOException\n\t */\n\tpublic static SparkplugBPayload compress(SparkplugBPayload payload, boolean stripDataTypes) throws IOException {\n\t\tlogger.trace(\"Compressing payload\");\n\t\tSparkplugBPayloadEncoder encoder = new SparkplugBPayloadEncoder();\n\t\t// Encode bytes\n\t\tbyte[] encoded = encoder.getBytes(payload, stripDataTypes);\n\n\t\t// Default to DEFLATE\n\t\tbyte[] compressedBytes = deflateBytes(encoded);\n\n\t\t// Create new 
payload, add the bytes as the body, and return.\n\t\treturn new SparkplugBPayloadBuilder(payload.getSeq()).setBody(compressedBytes).setUuid(UUID_COMPRESSED)\n\t\t\t\t.createPayload();\n\t}\n\n\t/**\n\t * Compresses a {@link SparkplugBPayload} using the supplied {@link CompressionAlgorithm}.\n\t * \n\t * @param payload the {@link SparkplugBPayload} to compress\n\t * @param algorithm the {@link CompressionAlgorithm} to use\n\t * @param stripDataTypes whether or not to strip the metric data types while encoding\n\t * @return a compressed {@link SparkplugBPayload}\n\t * @throws IOException\n\t * @throws SparkplugException\n\t */\n\tpublic static SparkplugBPayload compress(SparkplugBPayload payload, CompressionAlgorithm algorithm,\n\t\t\tboolean stripDataTypes) throws IOException, SparkplugException {\n\t\tlogger.trace(\"Compressing payload\");\n\t\tSparkplugBPayloadEncoder encoder = new SparkplugBPayloadEncoder();\n\t\t// Encode bytes\n\t\tbyte[] encoded = encoder.getBytes(payload, stripDataTypes);\n\t\tbyte[] compressed = null;\n\t\tMetric algorithmMetric =\n\t\t\t\tnew MetricBuilder(METRIC_ALGORITHM, MetricDataType.String, algorithm.toString()).createMetric();\n\n\t\t// Switch over compression algorithm\n\t\tswitch (algorithm) {\n\t\t\tcase GZIP:\n\t\t\t\ttry {\n\t\t\t\t\tcompressed = GZipUtil.compress(encoded);\n\t\t\t\t} catch (Exception e) {\n\t\t\t\t\tlogger.error(\"Failed to GZIP the payload\");\n\t\t\t\t\tthrow new SparkplugException(\"Failed to GZIP the payload\", e);\n\t\t\t\t}\n\t\t\t\tbreak;\n\t\t\tcase DEFLATE:\n\t\t\t\tcompressed = deflateBytes(encoded);\n\t\t\t\tbreak;\n\t\t\tdefault:\n\t\t\t\tthrow new SparkplugException(\"Unknown or unsupported algorithm \" + algorithm);\n\t\t}\n\n\t\t// Wrap and return the payload\n\t\treturn new SparkplugBPayloadBuilder(payload.getSeq()).setBody(compressed).setUuid(UUID_COMPRESSED)\n\t\t\t\t.addMetric(algorithmMetric).createPayload();\n\t}\n\n\t/**\n\t * Compresses a byte array using the DEFLATE compression algorithm.\n\t * \n\t * @param bytes the byte array to compress.\n\t * @return the compressed byte array.\n\t * @throws IOException\n\t */\n\tprotected static byte[] deflateBytes(byte[] bytes) throws IOException {\n\t\tByteArrayOutputStream baos = new ByteArrayOutputStream(bytes.length);\n\t\tDeflater deflater = new 
Deflater();\n\t\tdeflater.setInput(bytes);\n\t\tdeflater.finish();\n\n\t\tbyte[] buffer = new byte[1024];\n\t\twhile (!deflater.finished()) {\n\t\t\tint count = deflater.deflate(buffer);\n\t\t\tbaos.write(buffer, 0, count);\n\t\t}\n\t\tbaos.close();\n\n\t\treturn baos.toByteArray();\n\t}\n\n\t/**\n\t * Decompresses a byte array using the DEFLATE compression algorithm.\n\t * \n\t * @param bytes the byte array to decompress.\n\t * @return the decompressed byte array.\n\t * @throws IOException\n\t * @throws DataFormatException\n\t */\n\tprotected static byte[] inflateBytes(byte[] bytes) throws IOException, DataFormatException {\n\t\tByteArrayOutputStream baos = new ByteArrayOutputStream(bytes.length);\n\t\tInflater inflater = new Inflater();\n\t\tinflater.setInput(bytes);\n\n\t\tbyte[] buffer = new byte[1024];\n\t\twhile (!inflater.finished()) {\n\t\t\tint count = inflater.inflate(buffer);\n\t\t\tbaos.write(buffer, 0, count);\n\t\t}\n\t\tbaos.close();\n\t\treturn baos.toByteArray();\n\t}\n\n\tprivate static class CompressionConfig {\n\n\t\tprivate CompressionAlgorithm algorithm;\n\n\t\tprotected CompressionConfig() {\n\t\t\tthis.algorithm = CompressionAlgorithm.DEFLATE;\n\t\t}\n\n\t\tprotected CompressionAlgorithm getAlgorithm() {\n\t\t\treturn algorithm;\n\t\t}\n\n\t\tprotected void setAlgorithm(CompressionAlgorithm algorithm) {\n\t\t\tthis.algorithm = algorithm;\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/util/SparkplugUtil.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.util;\n\nimport java.math.BigInteger;\n\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.PropertySet;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugMeta;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class SparkplugUtil {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(SparkplugUtil.class.getName());\n\n\tpublic static int getQualityCode(Metric metric) {\n\t\tPropertySet propertySet = metric.getProperties();\n\t\tlogger.trace(\"Getting properties for {} with value: {}\", metric.getName(),\n\t\t\t\t(propertySet != null && propertySet.getPropertyMap() != null)\n\t\t\t\t\t\t? 
propertySet.getPropertyMap().toString()\n\t\t\t\t\t\t: \"null\");\n\t\tif (propertySet != null && propertySet.getPropertyValue(\"Quality\") != null) {\n\t\t\treturn (int) propertySet.getPropertyValue(\"Quality\").getValue();\n\t\t}\n\n\t\tlogger.trace(\"No incoming quality for {} - assuming good\", metric.getName());\n\t\treturn 192;\n\t}\n\n\tpublic static Long getBdSequenceNumber(SparkplugBPayload payload) throws Exception {\n\t\tfor (Metric metric : payload.getMetrics()) {\n\t\t\tif (SparkplugMeta.SPARKPLUG_BD_SEQUENCE_NUMBER_KEY.equals(metric.getName())) {\n\t\t\t\treturn convertSeqNumber(metric.getValue());\n\t\t\t}\n\t\t}\n\n\t\t// No BD sequence number found - return null\n\t\treturn null;\n\t}\n\n\tpublic static long convertSeqNumber(Object sequenceNumber) {\n\t\tif (sequenceNumber instanceof Long) {\n\t\t\treturn (long) sequenceNumber;\n\t\t} else if (sequenceNumber instanceof BigInteger) {\n\t\t\treturn ((BigInteger) sequenceNumber).longValue();\n\t\t} else if (sequenceNumber instanceof Integer) {\n\t\t\treturn ((Integer) sequenceNumber).longValue();\n\t\t} else if (sequenceNumber instanceof Byte) {\n\t\t\treturn ((Byte) sequenceNumber).longValue();\n\t\t} else if (sequenceNumber instanceof Short) {\n\t\t\treturn ((Short) sequenceNumber).longValue();\n\t\t}\n\t\t// Default to explicit cast to long\n\t\treturn (long) sequenceNumber;\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/util/TopicUtil.java",
    "content": "/********************************************************************************\n * Copyright (c) 2016-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.util;\n\nimport java.util.HashMap;\nimport java.util.Map;\n\nimport org.eclipse.tahu.SparkplugParsingException;\nimport org.eclipse.tahu.message.model.MessageType;\nimport org.eclipse.tahu.message.model.Topic;\n\nimport com.fasterxml.jackson.core.JsonProcessingException;\nimport com.fasterxml.jackson.databind.ObjectMapper;\n\n/**\n * Provides utility methods for handling Sparkplug MQTT message topics.\n */\npublic class TopicUtil {\n\n\tprivate static final Map<String, String[]> SPLIT_TOPIC_CACHE = new HashMap<String, String[]>();\n\n\tpublic static String[] getSplitTopic(String topic) {\n\t\tString[] splitTopic = SPLIT_TOPIC_CACHE.get(topic);\n\t\tif (splitTopic == null) {\n\t\t\tsplitTopic = topic.split(\"/\");\n\t\t\tSPLIT_TOPIC_CACHE.put(topic, splitTopic);\n\t\t}\n\n\t\treturn splitTopic;\n\t}\n\n\t/**\n\t * Serializes a {@link Topic} instance into a JSON string.\n\t * \n\t * @param topic a {@link Topic} instance\n\t * @return a JSON string\n\t * @throws JsonProcessingException\n\t */\n\tpublic static String toJsonString(Topic topic) throws JsonProcessingException {\n\t\tObjectMapper mapper = new ObjectMapper();\n\t\treturn mapper.writeValueAsString(topic);\n\t}\n\n\t/**\n\t * Parses a Sparkplug MQTT message topic string and returns a {@link Topic} instance.\n\t *\n\t * @param topic a topic string\n\t * @return a {@link Topic} instance\n\t * @throws SparkplugParsingException if an error occurs 
while parsing\n\t */\n\tpublic static Topic parseTopic(String topic) throws SparkplugParsingException {\n\t\treturn parseTopic(TopicUtil.getSplitTopic(topic));\n\t}\n\n\t/**\n\t * Parses a Sparkplug MQTT message topic string and returns a {@link Topic} instance.\n\t *\n\t * @param splitTopic a topic split into tokens\n\t * @return a {@link Topic} instance\n\t * @throws SparkplugParsingException if an error occurs while parsing\n\t */\n\t@SuppressWarnings(\"incomplete-switch\")\n\tpublic static Topic parseTopic(String[] splitTopic) throws SparkplugParsingException {\n\t\tMessageType type;\n\t\tString namespace, edgeNodeId, groupId;\n\t\tint length = splitTopic.length;\n\n\t\tif (length == 3 && MessageType.STATE.toString().equals(splitTopic[1])) {\n\t\t\treturn new Topic(splitTopic[0], splitTopic[2], MessageType.STATE);\n\t\t}\n\n\t\tif (length < 4 || length > 5) {\n\t\t\tthrow new SparkplugParsingException(\"Invalid number of topic elements: \" + length);\n\t\t}\n\n\t\tnamespace = splitTopic[0];\n\t\tgroupId = splitTopic[1];\n\t\ttype = MessageType.parseMessageType(splitTopic[2]);\n\t\tedgeNodeId = splitTopic[3];\n\n\t\tif (length == 4) {\n\t\t\t// A node topic\n\t\t\tswitch (type) {\n\t\t\t\tcase NBIRTH:\n\t\t\t\tcase NCMD:\n\t\t\t\tcase NDATA:\n\t\t\t\tcase NDEATH:\n\t\t\t\tcase NRECORD:\n\t\t\t\t\treturn new Topic(namespace, groupId, edgeNodeId, type);\n\t\t\t}\n\t\t} else {\n\t\t\t// A device topic\n\t\t\tswitch (type) {\n\t\t\t\tcase DBIRTH:\n\t\t\t\tcase DCMD:\n\t\t\t\tcase DDATA:\n\t\t\t\tcase DDEATH:\n\t\t\t\tcase DRECORD:\n\t\t\t\t\treturn new Topic(namespace, groupId, edgeNodeId, splitTopic[4], type);\n\t\t\t}\n\t\t}\n\t\tthrow new SparkplugParsingException(\"Invalid number of topic elements \" + length + \" for topic type \" + type);\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/main/java/org/eclipse/tahu/util/ValidationUtils.java",
    "content": "/********************************************************************************\n * Copyright (c) 2017-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.util;\n\npublic class ValidationUtils {\n\n\tpublic static final String JSON_V4_SCHEMA_IDENTIFIER = \"http://json-schema.org/draft-04/schema#\";\n\tpublic static final String JSON_SCHEMA_IDENTIFIER_ELEMENT = \"$schema\";\n\n//\tpublic static JsonNode getJsonNode(String jsonText) throws IOException {\n//\t\treturn JsonLoader.fromString(jsonText);\n//\t} // getJsonNode(text) ends\n//\n//\tpublic static JsonNode getJsonNode(File jsonFile) throws IOException {\n//\t\treturn JsonLoader.fromFile(jsonFile);\n//\t} // getJsonNode(File) ends\n//\n//\tpublic static JsonNode getJsonNode(URL url) throws IOException {\n//\t\treturn JsonLoader.fromURL(url);\n//\t} // getJsonNode(URL) ends\n//\n//\tpublic static JsonNode getJsonNodeFromResource(String resource) throws IOException {\n//\t\treturn JsonLoader.fromResource(resource);\n//\t} // getJsonNode(Resource) ends\n//\n//\tpublic static JsonSchema getSchemaNode(String schemaText) throws IOException, ProcessingException {\n//\t\tfinal JsonNode schemaNode = getJsonNode(schemaText);\n//\t\treturn _getSchemaNode(schemaNode);\n//\t} // getSchemaNode(text) ends\n//\n//\tpublic static JsonSchema getSchemaNode(File schemaFile) throws IOException, ProcessingException {\n//\t\tfinal JsonNode schemaNode = getJsonNode(schemaFile);\n//\t\treturn _getSchemaNode(schemaNode);\n//\t} // getSchemaNode(File) ends\n//\n//\tpublic static JsonSchema getSchemaNode(URL schemaFile) 
throws IOException, ProcessingException {\n//\t\tfinal JsonNode schemaNode = getJsonNode(schemaFile);\n//\t\treturn _getSchemaNode(schemaNode);\n//\t} // getSchemaNode(URL) ends\n//\n//\tpublic static JsonSchema getSchemaNodeFromResource(String resource) throws IOException, ProcessingException {\n//\t\tfinal JsonNode schemaNode = getJsonNodeFromResource(resource);\n//\t\treturn _getSchemaNode(schemaNode);\n//\t} // getSchemaNode() ends\n//\n//\tpublic static void validateJson(JsonSchema jsonSchemaNode, JsonNode jsonNode) throws ProcessingException {\n//\t\tProcessingReport report = jsonSchemaNode.validate(jsonNode);\n//\t\tif (!report.isSuccess()) {\n//\t\t\tfor (ProcessingMessage processingMessage : report) {\n//\t\t\t\tthrow new ProcessingException(processingMessage);\n//\t\t\t}\n//\t\t}\n//\t} // validateJson(Node) ends\n//\n//\tpublic static boolean isJsonValid(JsonSchema jsonSchemaNode, JsonNode jsonNode) throws ProcessingException {\n//\t\tProcessingReport report = jsonSchemaNode.validate(jsonNode);\n//\t\treturn report.isSuccess();\n//\t} // validateJson(Node) ends\n//\n//\tpublic static boolean isJsonValid(String schemaText, String jsonText) throws ProcessingException, IOException {\n//\t\tfinal JsonSchema schemaNode = getSchemaNode(schemaText);\n//\t\tfinal JsonNode jsonNode = getJsonNode(jsonText);\n//\t\treturn isJsonValid(schemaNode, jsonNode);\n//\t} // validateJson(Node) ends\n//\n//\tpublic static boolean isJsonValid(File schemaFile, File jsonFile) throws ProcessingException, IOException {\n//\t\tfinal JsonSchema schemaNode = getSchemaNode(schemaFile);\n//\t\tfinal JsonNode jsonNode = getJsonNode(jsonFile);\n//\t\treturn isJsonValid(schemaNode, jsonNode);\n//\t} // validateJson(Node) ends\n//\n//\tpublic static boolean isJsonValid(URL schemaURL, URL jsonURL) throws ProcessingException, IOException {\n//\t\tfinal JsonSchema schemaNode = getSchemaNode(schemaURL);\n//\t\tfinal JsonNode jsonNode = getJsonNode(jsonURL);\n//\t\treturn 
isJsonValid(schemaNode, jsonNode);\n//\t} // validateJson(Node) ends\n//\n//\tpublic static void validateJson(String schemaText, String jsonText) throws IOException, ProcessingException {\n//\t\tfinal JsonSchema schemaNode = getSchemaNode(schemaText);\n//\t\tfinal JsonNode jsonNode = getJsonNode(jsonText);\n//\t\tvalidateJson(schemaNode, jsonNode);\n//\t} // validateJson(text) ends\n//\n//\tpublic static void validateJson(File schemaFile, File jsonFile) throws IOException, ProcessingException {\n//\t\tfinal JsonSchema schemaNode = getSchemaNode(schemaFile);\n//\t\tfinal JsonNode jsonNode = getJsonNode(jsonFile);\n//\t\tvalidateJson(schemaNode, jsonNode);\n//\t} // validateJson(File) ends\n//\n//\tpublic static void validateJson(URL schemaDocument, URL jsonDocument) throws IOException, ProcessingException {\n//\t\tfinal JsonSchema schemaNode = getSchemaNode(schemaDocument);\n//\t\tfinal JsonNode jsonNode = getJsonNode(jsonDocument);\n//\t\tvalidateJson(schemaNode, jsonNode);\n//\t} // validateJson(URL) ends\n//\n//\tpublic static void validateJsonResource(String schemaResource, String jsonResource)\n//\t\t\tthrows IOException, ProcessingException {\n//\t\tfinal JsonSchema schemaNode = getSchemaNode(schemaResource);\n//\t\tfinal JsonNode jsonNode = getJsonNodeFromResource(jsonResource);\n//\t\tvalidateJson(schemaNode, jsonNode);\n//\t} // validateJsonResource() ends\n//\n//\tprivate static JsonSchema _getSchemaNode(JsonNode jsonNode) throws ProcessingException {\n//\t\tfinal JsonNode schemaIdentifier = jsonNode.get(JSON_SCHEMA_IDENTIFIER_ELEMENT);\n//\t\tif (null == schemaIdentifier) {\n//\t\t\t((ObjectNode) jsonNode).put(JSON_SCHEMA_IDENTIFIER_ELEMENT, JSON_V4_SCHEMA_IDENTIFIER);\n//\t\t}\n//\n//\t\tfinal JsonSchemaFactory factory = JsonSchemaFactory.byDefault();\n//\t\treturn factory.getJsonSchema(jsonNode);\n//\t} // _getSchemaNode() ends\n}\n"
  },
  {
    "path": "java/lib/core/src/main/resources/logback.xml",
    "content": "<configuration>\n\n  <appender name=\"STDOUT\" class=\"ch.qos.logback.core.ConsoleAppender\">\n    <!-- encoders are assigned the type\n         ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->\n    <encoder>\n      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>\n    </encoder>\n  </appender>\n\n<!--\n  <logger name=\"org.eclipse.tahu.host.HostApplication\" level=\"TRACE\"/>\n-->\n  <root level=\"TRACE\">\n    <appender-ref ref=\"STDOUT\" />\n  </root>\n</configuration>\n"
  },
  {
    "path": "java/lib/core/src/main/resources/payload.json",
    "content": "{\n    \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n    \"title\": \"Sparkplug B Payload\",\n    \"description\": \"A Sparkplug B payload\",\n    \"definitions\" : {\n        \"parameter\" : {\n            \"type\" : \"object\",\n            \"properties\" : {\n                \"name\" : { \"type\" : \"string\" },\n                \"type\" : { \"type\" : \"string\" },\n                \"value\" : { \"type\" : [ \"string\", \"number\", \"integer\", \"boolean\" ] }\n            },\n            \"additionalProperties\" : false\n        },\n        \"template\" : {\n            \"type\" : \"object\",\n            \"properties\" : {\n                \"version\" : { \"type\" : \"string\" },\n                \"reference\" : { \"type\" : \"string\" },\n                \"isDefinition\" : { \"type\" : \"boolean\" },\n                \"parameters\" : {\n                    \"type\" : \"array\",\n                    \"items\" : { \"$ref\" : \"#/definitions/parameter\" }\n                },\n                \"metrics\" : {\n                    \"type\" : \"array\",\n                    \"items\" : { \"$ref\" : \"#/definitions/metric\" }\n                }\n            },\n            \"additionalProperties\" : false\n        },\n        \"dataset\" : {\n            \"type\" : \"object\",\n            \"properties\" : {\n                \"numberOfColumns\" : { \"type\" : \"integer\" },\n                \"columnNames\" : { \n                    \"type\" : \"array\",\n                    \"items\" : { \"type\" : \"string\" }\n                },\n                \"types\" : { \n                    \"type\" : \"array\",\n                    \"items\" : { \"type\" : \"string\" }\n                },\n                \"rows\" : {\n                    \"type\" : \"array\",\n                    \"items\" : {\n                        \"type\" : \"array\",\n                        \"items\" : { \"type\" : [ \"string\", \"number\", \"integer\", \"boolean\" ] 
}\n                    }\n                }\n            },\n            \"additionalProperties\" : false\n        },\n\n        \"property\" : {\n            \"type\" : \"object\",\n            \"properties\" : {\n                \"type\" : { \"type\" : \"string\" },\n                \"value\" : {\n                    \"oneOf\" : [\n                        { \"type\" : [ \"string\", \"number\", \"integer\", \"boolean\", \"null\" ] },\n                        { \"$ref\" : \"#/definitions/propertySet\" },\n                        {\n                            \"type\" : \"array\",\n                            \"items\" : { \"$ref\" : \"#/definitions/propertySet\" }\n                        }\n                    ]\n                }\n            },\n            \"additionalProperties\" : false\n        },\n\n        \"propertySet\" : {\n            \"type\" : \"object\",\n            \"additionalProperties\" : { \"$ref\" : \"#/definitions/property\" }\n        },\n\n        \"metadata\" : {\n            \"type\" : \"object\",\n            \"properties\" : { \n                \"contentType\" : { \"type\" : \"string\" },\n                \"isMultiPart\" : { \"type\" : \"boolean\" },\n                \"seq\" : { \"type\" : \"integer\" },\n                \"size\" : { \"type\" : \"integer\" },\n                \"fileName\" : { \"type\" : \"string\" },\n                \"fileType\" : { \"type\" : \"string\" },\n                \"md5\" : { \"type\" : \"string\" },\n                \"description\" : { \"type\" : \"string\" }\n            }\n        },\n\n        \"metric\" : {\n            \"type\" : \"object\",\n            \"properties\" : { \n                \"name\" : { \"type\" : \"string\" },\n                \"alias\" : { \"type\" : \"integer\" },\n                \"timestamp\" : { \"type\" : \"integer\" },\n                \"datatype\" : { \"type\" : \"string\" },\n                \"isHistorical\" : { \"type\" : \"boolean\" },\n                \"isTransient\" : { 
\"type\" : \"boolean\" },\n                \"metadata\" : { \"$ref\" : \"#/definitions/metadata\" },\n                \"properties\" : { \"$ref\" : \"#/definitions/propertySet\" },\n                \"value\" : {\n                    \"oneOf\" : [\n                        { \"type\" : [ \"string\", \"number\", \"integer\", \"boolean\", \"null\" ] },\n                        { \"$ref\" : \"#/definitions/dataset\" },\n                        { \"$ref\" : \"#/definitions/template\" }\n                    ]\n                }\n            }\n        }\n    },\n\n    \"type\": \"object\",\n    \"properties\": {\n        \"timestamp\" : { \n            \"description\" : \"A timestamp in milliseconds\",\n            \"type\" : \"integer\" \n        },\n        \"seq\" : { \n            \"description\" : \"A sequence number\",\n            \"type\" : \"integer\"\n        },\n        \"uuid\" : {\n            \"description\" : \"A unique identifier\",\n            \"type\" : \"string\"\n        },\n        \"body\" : {\n            \"description\" : \"A UTF-8 encoded string representing a byte array\",\n            \"type\" : \"string\"\n        },\n        \"metrics\" : {\n            \"description\" : \"An array of metrics\",\n            \"type\" : \"array\",\n            \"items\" : { \"$ref\" : \"#/definitions/metric\" }\n        }\n    }\n}\n"
  },
  {
    "path": "java/lib/core/src/test/java/org/eclipse/tahu/message/test/EnDeCodeTest.java",
    "content": "/*\n * Licensed Materials - Property of Cirrus Link Solutions\n * Copyright (c) 2023 Cirrus Link Solutions LLC - All Rights Reserved\n * Unauthorized copying of this file, via any medium is strictly prohibited\n * Proprietary and confidential\n */\npackage org.eclipse.tahu.message.test;\n\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\n\nimport junit.framework.TestCase;\n\npublic class EnDeCodeTest extends TestCase {\n\n\tpublic EnDeCodeTest(String testName) {\n\t\tsuper(testName);\n\t}\n\n\tpublic void testSimple() {\n\t\tassertTrue(true);\n\t}\n\n\tpublic void testEncodeDecode() {\n\t\ttry {\n\t\t\tSparkplugBPayload originalPayload = new SparkplugBPayloadBuilder()\n\t\t\t\t\t.addMetric(\n\t\t\t\t\t\t\tnew MetricBuilder(\"String\", MetricDataType.String, \"日本人 中國的 ~=[]()%+{}@;\").createMetric())\n\t\t\t\t\t.addMetric(new MetricBuilder(\"StringArray\", MetricDataType.StringArray,\n\t\t\t\t\t\t\tnew String[] { \"日本人 中國的 ~=[]()%+{}@;\" }).createMetric())\n\t\t\t\t\t.addMetric(new MetricBuilder(\"StringArray\", MetricDataType.StringArray,\n\t\t\t\t\t\t\tnew String[] { \"日本人 中國的 ~=[]()%+{}@;\", \"漢字\" }).createMetric())\n\t\t\t\t\t.addMetric(new MetricBuilder(\"StringArray\", MetricDataType.StringArray,\n\t\t\t\t\t\t\tnew String[] { \"漢字\", \"日本人 中國的 ~=[]()%+{}@;\" }).createMetric())\n\t\t\t\t\t.addMetric(new MetricBuilder(\"StringArray\", MetricDataType.StringArray,\n\t\t\t\t\t\t\tnew String[] { \"ناطرريننهمم ع نااارررر\", \"漢字\", \"日本人 中國的 ~=[]()%+{}@;\" }).createMetric())\n\t\t\t\t\t.createPayload();\n\t\t\tbyte[] encoded = new SparkplugBPayloadEncoder().getBytes(originalPayload, false);\n\n\t\t\tSparkplugBPayload decodedPayload 
= new SparkplugBPayloadDecoder().buildFromByteArray(encoded, null);\n\n\t\t\tassertEquals(((String[]) originalPayload.getMetrics().get(1).getValue())[0],\n\t\t\t\t\t((String[]) decodedPayload.getMetrics().get(1).getValue())[0]);\n\n\t\t\tassertEquals(((String[]) originalPayload.getMetrics().get(2).getValue())[0],\n\t\t\t\t\t((String[]) decodedPayload.getMetrics().get(2).getValue())[0]);\n\t\t\tassertEquals(((String[]) originalPayload.getMetrics().get(2).getValue())[1],\n\t\t\t\t\t((String[]) decodedPayload.getMetrics().get(2).getValue())[1]);\n\n\t\t\tassertEquals(((String[]) originalPayload.getMetrics().get(3).getValue())[0],\n\t\t\t\t\t((String[]) decodedPayload.getMetrics().get(3).getValue())[0]);\n\t\t\tassertEquals(((String[]) originalPayload.getMetrics().get(3).getValue())[1],\n\t\t\t\t\t((String[]) decodedPayload.getMetrics().get(3).getValue())[1]);\n\n\t\t\tassertEquals(((String[]) originalPayload.getMetrics().get(4).getValue())[0],\n\t\t\t\t\t((String[]) decodedPayload.getMetrics().get(4).getValue())[0]);\n\t\t\tassertEquals(((String[]) originalPayload.getMetrics().get(4).getValue())[1],\n\t\t\t\t\t((String[]) decodedPayload.getMetrics().get(4).getValue())[1]);\n\t\t\tassertEquals(((String[]) originalPayload.getMetrics().get(4).getValue())[2],\n\t\t\t\t\t((String[]) decodedPayload.getMetrics().get(4).getValue())[2]);\n\t\t} catch (Exception e) {\n\t\t\tSystem.out.println(e);\n\t\t\tfail();\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/test/java/org/eclipse/tahu/mqtt/test/MqttServerUrlTest.java",
    "content": "/*\n * Licensed Materials - Property of Cirrus Link Solutions\n * Copyright (c) 2023 Cirrus Link Solutions LLC - All Rights Reserved\n * Unauthorized copying of this file, via any medium is strictly prohibited\n * Proprietary and confidential\n */\npackage org.eclipse.tahu.mqtt.test;\n\nimport org.eclipse.tahu.exception.TahuException;\nimport org.eclipse.tahu.mqtt.MqttServerUrl;\nimport org.testng.Assert;\nimport org.testng.annotations.DataProvider;\nimport org.testng.annotations.Test;\n\npublic class MqttServerUrlTest {\n\n\tpublic MqttServerUrlTest() {\n\t}\n\n\t@DataProvider\n\tpublic Object[][] goodUrlData() throws Exception {\n\t\treturn new Object[][] { { \"tcp://localhost:1883\", \"tcp\", \"localhost\", 1883 },\n\t\t\t\t{ \"ssl://localhost:8883\", \"ssl\", \"localhost\", 8883 },\n\t\t\t\t{ \"tls://localhost:8883\", \"tls\", \"localhost\", 8883 }, { \"localhost:1883\", \"tcp\", \"localhost\", 1883 },\n\t\t\t\t{ \"tcp://localhost\", \"tcp\", \"localhost\", 1883 }, { \"localhost\", \"tcp\", \"localhost\", 1883 }, };\n\t}\n\n\t@Test(\n\t\t\tdataProvider = \"goodUrlData\")\n\tpublic void testGoodMqttServerUrls(String url, String expectedProtol, String expectedFqdn, Integer expectedPort) {\n\t\ttry {\n\t\t\tMqttServerUrl mqttServerUrl = new MqttServerUrl(url);\n\t\t\tAssert.assertEquals(mqttServerUrl.getProtocol(), expectedProtol);\n\t\t\tAssert.assertEquals(mqttServerUrl.getFqdn(), expectedFqdn);\n\t\t\tAssert.assertEquals(mqttServerUrl.getPort(), expectedPort);\n\t\t} catch (TahuException e) {\n\t\t\tAssert.fail();\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/test/java/org/eclipse/tahu/test/SequenceTest.java",
    "content": "/*\n * Licensed Materials - Property of Cirrus Link Solutions\n * Copyright (c) 2022 Cirrus Link Solutions LLC - All Rights Reserved\n * Unauthorized copying of this file, via any medium is strictly prohibited\n * Proprietary and confidential\n */\npackage org.eclipse.tahu.test;\n\nimport static org.assertj.core.api.Assertions.fail;\n\nimport java.util.Date;\n\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\nimport org.eclipse.tahu.message.PayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.testng.annotations.Test;\n\n/**\n * Sparkplug Test class for encoding and decoding sparkplug payloads\n */\npublic class SequenceTest {\n\n\tpublic SequenceTest() {\n\t}\n\n\t@Test\n\tpublic void testEnDeCode() throws SparkplugInvalidTypeException {\n\t\tunit(null);\n\t\tunit(0L);\n\t\tunit(1L);\n\t}\n\n\tprivate void unit(Long seq) {\n\t\tDate currentTime = new Date(0L);\n\n\t\t// Encode\n\t\tSparkplugBPayloadEncoder encoder = new SparkplugBPayloadEncoder();\n\t\tPayloadDecoder<SparkplugBPayload> decoder = new SparkplugBPayloadDecoder();\n\t\ttry {\n\t\t\tSparkplugBPayload initialPayload =\n\t\t\t\t\tnew SparkplugBPayloadBuilder().setTimestamp(currentTime).setSeq(seq).createPayload();\n\t\t\tbyte[] bytes = encoder.getBytes(initialPayload, false);\n\t\t\tSparkplugBPayload decodedPayload = decoder.buildFromByteArray(bytes, null);\n\t\t\tSystem.out.println(\"Initial: \" + initialPayload);\n\t\t\tSystem.out.println(seq + \":       \" + bytesToHex(bytes));\n\t\t\tSystem.out.println(\"Decoded: \" + decodedPayload);\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t\tfail(e.getMessage());\n\t\t}\n\t}\n\n\tprivate final static char[] hexArray = \"0123456789ABCDEF\".toCharArray();\n\n\tprivate static String 
bytesToHex(byte[] bytes) {\n\t\tchar[] hexChars = new char[bytes.length * 3];\n\t\tint v;\n\t\tfor (int j = 0; j < bytes.length; j++) {\n\t\t\tv = bytes[j] & 0xFF;\n\t\t\thexChars[j * 3] = hexArray[v >>> 4];\n\t\t\thexChars[j * 3 + 1] = hexArray[v & 0x0F];\n\t\t\thexChars[j * 3 + 2] = 0x20; // space separator\n\t\t}\n\t\treturn new String(hexChars);\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/test/java/org/eclipse/tahu/test/SparkplugTest.java",
    "content": "/********************************************************************************\n * Copyright (c) 2016-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.test;\n\nimport static org.assertj.core.api.Assertions.assertThat;\nimport static org.assertj.core.api.Assertions.fail;\nimport static org.junit.Assert.assertNull;\n\nimport java.io.IOException;\nimport java.math.BigInteger;\nimport java.util.Arrays;\nimport java.util.Date;\nimport java.util.List;\nimport java.util.Map;\n\nimport org.eclipse.tahu.SparkplugException;\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\nimport org.eclipse.tahu.json.JsonValidator;\nimport org.eclipse.tahu.message.PayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.DataSet;\nimport org.eclipse.tahu.message.model.DataSet.DataSetBuilder;\nimport org.eclipse.tahu.message.model.DataSetDataType;\nimport org.eclipse.tahu.message.model.File;\nimport org.eclipse.tahu.message.model.MetaData;\nimport org.eclipse.tahu.message.model.MetaData.MetaDataBuilder;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.Parameter;\nimport org.eclipse.tahu.message.model.ParameterDataType;\nimport org.eclipse.tahu.message.model.PropertyDataType;\nimport org.eclipse.tahu.message.model.PropertySet;\nimport org.eclipse.tahu.message.model.PropertySet.PropertySetBuilder;\nimport 
org.eclipse.tahu.message.model.PropertyValue;\nimport org.eclipse.tahu.message.model.Row;\nimport org.eclipse.tahu.message.model.Row.RowBuilder;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.message.model.Template;\nimport org.eclipse.tahu.message.model.Template.TemplateBuilder;\nimport org.eclipse.tahu.message.model.Value;\nimport org.eclipse.tahu.util.PayloadUtil;\nimport org.testng.annotations.DataProvider;\nimport org.testng.annotations.Test;\n\n/**\n * Sparkplug Test class for encoding and decoding sparkplug payloads\n */\npublic class SparkplugTest {\n\n\t/**\n\t * A {@link JsonValidator} instance used for testing JSON validation.\n\t */\n\tprivate JsonValidator validator;\n\n\tpublic SparkplugTest() {\n\t\tvalidator = JsonValidator.getInstance();\n\t}\n\n\t@DataProvider\n\tpublic Object[][] metricData() throws Exception {\n\t\treturn new Object[][] { { \"TestByteObject\", MetricDataType.Int8, new Byte((byte) 123), null },\n\t\t\t\t{ \"TestByte\", MetricDataType.Int8, (byte) 123, null },\n\t\t\t\t{ \"TestShortObject\", MetricDataType.Int16, new Short((short) 12345), null },\n\t\t\t\t{ \"TestShort\", MetricDataType.Int16, (short) 12345, null },\n\t\t\t\t{ \"TestIntObject\", MetricDataType.Int32, new Integer(1234567890), null },\n\t\t\t\t{ \"TestInt\", MetricDataType.Int32, 1234567890, null },\n\t\t\t\t{ \"TestLongObject\", MetricDataType.Int64, new Long(12345679000L), null },\n\t\t\t\t{ \"TestLong\", MetricDataType.Int64, 12345679000L, null },\n\t\t\t\t{ \"TestUnsignedByte\", MetricDataType.UInt8, new Short((short) 123), null },\n\t\t\t\t{ \"TestUnsignedShort\", MetricDataType.UInt16, 12345, null },\n\t\t\t\t{ \"TestUnsignedInt\", MetricDataType.UInt32, new Long(1234567890), null },\n\t\t\t\t{ \"TestUnsignedLong\", MetricDataType.UInt64, BigInteger.valueOf(12345679000L), null },\n\t\t\t\t{ \"TestFloatObject\", MetricDataType.Float, new 
Float(1.11111111111111111e+30f), null },\n\t\t\t\t{ \"TestFloat\", MetricDataType.Float, 1.11111111111111111e+30f, null },\n\t\t\t\t{ \"TestDoubleObject\", MetricDataType.Double, new Double(1.11111111111111111e+300d), null },\n\t\t\t\t{ \"TestDouble\", MetricDataType.Double, 1.11111111111111111e+300d, null },\n\t\t\t\t{ \"TestBooleanObject\", MetricDataType.Boolean, new Boolean(true), null },\n\t\t\t\t{ \"TestBoolean\", MetricDataType.Boolean, true, null },\n\t\t\t\t{ \"TestString\", MetricDataType.String, \"TEST_STRING\", null },\n\t\t\t\t{ \"TestDateTime\", MetricDataType.DateTime, new Date(), null },\n\t\t\t\t{ \"TestText\", MetricDataType.Text, \"TEST_TEXT\", null },\n\t\t\t\t{ \"TestUUID\", MetricDataType.UUID, \"915cac68-a20e-11e6-80f5-76304dec7eb7\", null },\n\t\t\t\t{ \"TestBytes\", MetricDataType.Bytes, new byte[] { 0x0, 0x1, 0x2, 0x3, 0x4 }, null },\n\t\t\t\t{ \"TestFile\", MetricDataType.File, new File(\"/tmp/.testfile\", new byte[] { 0x0, 0x1, 0x2, 0x3, 0x4 }),\n\t\t\t\t\t\tnew MetaDataBuilder().fileType(\"bin\").fileName(\"/tmp/.testfile\").multiPart(false)\n\t\t\t\t\t\t\t\t.createMetaData() },\n\t\t\t\t{ \"TestDataSet\", MetricDataType.DataSet,\n\t\t\t\t\t\tnew DataSetBuilder(5).addColumnName(\"Booleans\").addColumnName(\"Int32s\").addColumnName(\"Floats\")\n\t\t\t\t\t\t\t\t.addColumnName(\"Dates\").addColumnName(\"Strings\").addType(DataSetDataType.Boolean)\n\t\t\t\t\t\t\t\t.addType(DataSetDataType.Int32).addType(DataSetDataType.Float)\n\t\t\t\t\t\t\t\t.addType(DataSetDataType.DateTime).addType(DataSetDataType.String)\n\t\t\t\t\t\t\t\t.addRow(new RowBuilder().addValue(new Value<Boolean>(DataSetDataType.Boolean, false))\n\t\t\t\t\t\t\t\t\t\t.addValue(new Value<Integer>(DataSetDataType.Int32, 1))\n\t\t\t\t\t\t\t\t\t\t.addValue(new Value<Float>(DataSetDataType.Float, 1.1F))\n\t\t\t\t\t\t\t\t\t\t.addValue(new Value<Date>(DataSetDataType.DateTime, new Date()))\n\t\t\t\t\t\t\t\t\t\t.addValue(new Value<String>(DataSetDataType.String, 
\"abc\")).createRow())\n\t\t\t\t\t\t\t\t.addRow(new RowBuilder().addValue(new Value<Boolean>(DataSetDataType.Boolean, true))\n\t\t\t\t\t\t\t\t\t\t.addValue(new Value<Integer>(DataSetDataType.Int32, 2))\n\t\t\t\t\t\t\t\t\t\t.addValue(new Value<Float>(DataSetDataType.Float, 1.2F))\n\t\t\t\t\t\t\t\t\t\t.addValue(new Value<Date>(DataSetDataType.DateTime, new Date()))\n\t\t\t\t\t\t\t\t\t\t.addValue(new Value<String>(DataSetDataType.String, \"\")).createRow())\n\t\t\t\t\t\t\t\t.addRow(new RowBuilder().addValue(new Value<Boolean>(DataSetDataType.Boolean, false))\n\t\t\t\t\t\t\t\t\t\t.addValue(new Value<Integer>(DataSetDataType.Int32, 3))\n\t\t\t\t\t\t\t\t\t\t.addValue(new Value<Float>(DataSetDataType.Float, 1.3F))\n\t\t\t\t\t\t\t\t\t\t.addValue(new Value<Date>(DataSetDataType.DateTime, null))\n\t\t\t\t\t\t\t\t\t\t.addValue(new Value<String>(DataSetDataType.String, null)).createRow())\n\t\t\t\t\t\t\t\t.createDataSet(),\n\t\t\t\t\t\tnull },\n\t\t\t\t{ \"NullDataSet\", MetricDataType.DataSet, new DataSetBuilder(5).addColumnName(\"Booleans\")\n\t\t\t\t\t\t.addColumnName(\"Int32s\").addColumnName(\"Floats\").addColumnName(\"Dates\").addColumnName(\"Strings\")\n\t\t\t\t\t\t.addType(DataSetDataType.Boolean).addType(DataSetDataType.Int32).addType(DataSetDataType.Float)\n\t\t\t\t\t\t.addType(DataSetDataType.DateTime).addType(DataSetDataType.String)\n\t\t\t\t\t\t.addRow(new RowBuilder().addValue(new Value<Boolean>(DataSetDataType.Boolean, null))\n\t\t\t\t\t\t\t\t.addValue(new Value<Integer>(DataSetDataType.Int32, null))\n\t\t\t\t\t\t\t\t.addValue(new Value<Float>(DataSetDataType.Float, null))\n\t\t\t\t\t\t\t\t.addValue(new Value<Date>(DataSetDataType.DateTime, null))\n\t\t\t\t\t\t\t\t.addValue(new Value<String>(DataSetDataType.String, null)).createRow())\n\t\t\t\t\t\t.createDataSet(), null },\n\t\t\t\t{ \"TestTemplateDef\", MetricDataType.Template, new TemplateBuilder().version(\"v1.0\").templateRef(null)\n\t\t\t\t\t\t.definition(true).addParameter(new Parameter(\"BoolParam\", 
ParameterDataType.Boolean, true))\n\t\t\t\t\t\t.addParameter(new Parameter(\"IntParam\", ParameterDataType.Int32, 12345678))\n\t\t\t\t\t\t.addParameter(new Parameter(\"DateParam\", ParameterDataType.DateTime, new Date()))\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"TemplateMetric1\", MetricDataType.Boolean, true).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"TemplateMetric2\", MetricDataType.Int32, 1234567890)\n\t\t\t\t\t\t\t\t.createMetric())\n\t\t\t\t\t\t.addMetric(\n\t\t\t\t\t\t\t\tnew MetricBuilder(\"TemplateMetric3\", MetricDataType.String,\n\t\t\t\t\t\t\t\t\t\t\"TEST_STRING\")\n\t\t\t\t\t\t\t\t\t\t\t\t.properties(\n\t\t\t\t\t\t\t\t\t\t\t\t\t\tnew PropertySetBuilder()\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t.addProperty(\"prop1a\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tnew PropertyValue(PropertyDataType.Float,\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t1.23F))\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t.addProperty(\"prop1b\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tnew PropertyValue(PropertyDataType.Float,\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tnew Float(1.23)))\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t.addProperty(\"prop2\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tnew PropertyValue(PropertyDataType.DateTime,\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tnew Date()))\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t.addProperty(\"prop3\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tnew PropertyValue(PropertyDataType.Text,\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\"PROP3_TEXT\"))\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t.addProperty(\"prop4\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tnew PropertyValue(PropertyDataType.String,\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tnull))\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t.createPropertySet())\n\t\t\t\t\t\t\t\t\t\t\t\t.createMetric())\n\t\t\t\t\t\t.createTemplate(), null },\n\t\t\t\t{ \"TestTemplateInst\", MetricDataType.Template, new TemplateBuilder().version(\"v1.0\")\n\t\t\t\t\t\t.templateRef(\"TestTemplateDef\").definition(true)\n\t\t\t\t\t\t.addParameter(new 
Parameter(\"BoolParam\", ParameterDataType.Boolean, true))\n\t\t\t\t\t\t.addParameter(new Parameter(\"IntParam\", ParameterDataType.Int32, 1234567890))\n\t\t\t\t\t\t.addParameter(new Parameter(\"DateParam\", ParameterDataType.DateTime, new Date()))\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"TemplateMetric1\", MetricDataType.Boolean, true).createMetric())\n\t\t\t\t\t\t.addMetric(\n\t\t\t\t\t\t\t\tnew MetricBuilder(\"TemplateMetric2\", MetricDataType.Int32, 1234567890).createMetric())\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"TemplateMetric3\", MetricDataType.String, \"TEST_STRING\")\n\t\t\t\t\t\t\t\t.createMetric())\n\t\t\t\t\t\t.createTemplate(), null } };\n\t}\n\n\t@DataProvider\n\tpublic Object[][] metricFieldsData() throws Exception {\n\t\treturn new Object[][] {\n\t\t\t\t{ new MetricBuilder(\"metric1\", MetricDataType.Int32, 1234).alias(12345L)\n\t\t\t\t\t\t.timestamp(new Date(1479424852194L)).isHistorical(true).isTransient(false).createMetric(),\n\t\t\t\t\t\t12345L, new Date(1479424852194L), true, false, false },\n\t\t\t\t{ new MetricBuilder(\"metric2\", MetricDataType.DateTime, null).alias(1L)\n\t\t\t\t\t\t.timestamp(new Date(1479421234564L)).isHistorical(true).isTransient(true).createMetric(), 1L,\n\t\t\t\t\t\tnew Date(1479421234564L), true, true, true },\n\t\t\t\t{ new MetricBuilder(\"metric3\", MetricDataType.String, \"Test\").alias(999999999L)\n\t\t\t\t\t\t.timestamp(new Date(1479123452194L)).isHistorical(false).isTransient(false).createMetric(),\n\t\t\t\t\t\t999999999L, new Date(1479123452194L), false, false, false },\n\t\t\t\t{ new MetricBuilder(\"metric4\", MetricDataType.BooleanArray,\n\t\t\t\t\t\tnew Boolean[] { false, true, false, true, false, true, false, true, false, true, false, true,\n\t\t\t\t\t\t\t\tfalse, true, false, true, false, true, false, true }).alias(999999999L)\n\t\t\t\t\t\t\t\t\t\t.timestamp(new Date(1479123452194L)).isHistorical(false).isTransient(false)\n\t\t\t\t\t\t\t\t\t\t.createMetric(),\n\t\t\t\t\t\t999999999L, new 
Date(1479123452194L), false, false, false }, };\n\n\t}\n\n\t@DataProvider\n\tpublic Object[][] invalidParamterTypeData() throws Exception {\n\t\treturn new Object[][] { { \"Param1\", ParameterDataType.Boolean, 12345 },\n\t\t\t\t{ \"Param2\", ParameterDataType.Int8, true }, { \"Param3\", ParameterDataType.Int16, \"123\" },\n\t\t\t\t{ \"Param4\", ParameterDataType.Int32, true }, { \"Param5\", ParameterDataType.Int64, new Date() },\n\t\t\t\t{ \"Param6\", ParameterDataType.DateTime, 12345 }, { \"Param7\", ParameterDataType.Text, 12345 },\n\t\t\t\t{ \"Param8\", ParameterDataType.String, 12345 }, };\n\t}\n\n\t@DataProvider\n\tpublic Object[][] invalidMetricDataType() throws Exception {\n\t\treturn new Object[][] { { \"TestByteObject\", MetricDataType.Int8, new Short((short) 123), null },\n\t\t\t\t{ \"TestByte\", MetricDataType.Int8, new Long(12345679000L), null },\n\t\t\t\t{ \"TestShortObject\", MetricDataType.Int16, new Float(12345), null },\n\t\t\t\t{ \"TestBooleanObject\", MetricDataType.Boolean, new Double(1.11111111111111111e+300d), null },\n\t\t\t\t{ \"TestBoolean\", MetricDataType.Boolean, 123, null },\n\t\t\t\t{ \"TestString\", MetricDataType.String, true, null },\n\t\t\t\t{ \"TestDateTime\", MetricDataType.DateTime, 1234567, null },\n\t\t\t\t{ \"TestUUID\", MetricDataType.UUID, 998877, null },\n\t\t\t\t{ \"TestBytes\", MetricDataType.Bytes, new int[] { 1, 2, 3, 4, 5 }, null } };\n\t}\n\n\t@Test\n\tpublic void testEnDeCode() throws SparkplugInvalidTypeException {\n\t\tDate currentTime = new Date();\n\t\tSparkplugBPayloadBuilder payloadBuilder = new SparkplugBPayloadBuilder().setTimestamp(currentTime).setSeq(0L)\n\t\t\t\t.setUuid(\"123456789\").setBody(\"Hello\".getBytes());\n\n\t\t// Create MetaData\n\t\tMetaData metaData = new MetaDataBuilder().contentType(\"none\").size(12L).seq(0L).fileName(\"none\")\n\t\t\t\t.fileType(\"none\").md5(\"none\").multiPart(false).description(\"none\").createMetaData();\n\n\t\t// Create one metric\n\t\tpayloadBuilder.addMetric(new 
MetricBuilder(\"Name\", MetricDataType.Int8, (byte) 65).alias(0L)\n\t\t\t\t.timestamp(currentTime).isHistorical(false).metaData(metaData).createMetric());\n\n\t\t// Create null metric\n\t\tpayloadBuilder.addMetric(new MetricBuilder(\"Null\", MetricDataType.String, null).alias(0L).timestamp(currentTime)\n\t\t\t\t.isHistorical(false).metaData(metaData).createMetric());\n\n\t\t// Encode\n\t\tSparkplugBPayloadEncoder encoder = new SparkplugBPayloadEncoder();\n\t\tbyte[] bytes = null;\n\t\ttry {\n\t\t\tbytes = encoder.getBytes(payloadBuilder.createPayload(), false);\n\t\t} catch (IOException e) {\n\t\t\te.printStackTrace();\n\t\t\tfail(e.getMessage());\n\t\t}\n\n\t\t// Decode\n\t\tPayloadDecoder<SparkplugBPayload> decoder = new SparkplugBPayloadDecoder();\n\t\tSparkplugBPayload decodedPayload = null;\n\t\ttry {\n\t\t\tdecodedPayload = decoder.buildFromByteArray(bytes, null);\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t\tfail(e.getMessage());\n\t\t}\n\n\t\t// SparkplugBPayload checks\n\t\tassertThat(currentTime).isEqualTo(decodedPayload.getTimestamp());\n\t\tassertThat(0L).isEqualTo(decodedPayload.getSeq());\n\t\tassertThat(\"123456789\").isEqualTo(decodedPayload.getUuid());\n\t\tassertThat(Arrays.equals(\"Hello\".getBytes(), decodedPayload.getBody())).isTrue();\n\n\t\t// Test the Metric\n\t\tassertThat(2).isEqualTo(decodedPayload.getMetrics().size());\n\t\tMetric decodedMetric = decodedPayload.getMetrics().get(0);\n\t\tassertThat(\"Name\").isEqualTo(decodedMetric.getName());\n\t\tassertThat(new Long(0)).isEqualTo(decodedMetric.getAlias());\n\t\tassertThat(currentTime).isEqualTo(decodedMetric.getTimestamp());\n\t\tassertThat(MetricDataType.Int8).isEqualTo(decodedMetric.getDataType());\n\t\tassertThat(Boolean.FALSE).isEqualTo(decodedMetric.isHistorical());\n\t\tassertThat((byte) 65).isEqualTo(decodedMetric.getValue());\n\t\tassertThat(decodedMetric.getMetaData()).isNotNull();\n\n\t\t// Test the MetaData\n\t\tMetaData decodedMetaData = 
decodedMetric.getMetaData();\n\t\tassertThat(metaData).isEqualTo(decodedMetric.getMetaData());\n\t\tassertThat(\"none\").isEqualTo(decodedMetaData.getContentType());\n\t\tassertThat(12L).isEqualTo(decodedMetaData.getSize());\n\t\tassertThat(0L).isEqualTo(decodedMetaData.getSeq());\n\t\tassertThat(\"none\").isEqualTo(decodedMetaData.getFileName());\n\t\tassertThat(\"none\").isEqualTo(decodedMetaData.getFileType());\n\t\tassertThat(\"none\").isEqualTo(decodedMetaData.getMd5());\n\t\tassertThat(\"none\").isEqualTo(decodedMetaData.getDescription());\n\t}\n\n\t@Test(\n\t\t\tdataProvider = \"metricData\")\n\tpublic void testValidMetricPayload(String name, MetricDataType type, Object value, MetaData metaData)\n\t\t\tthrows SparkplugException {\n\t\ttestMetricPayload(name, type, value, metaData);\n\t}\n\n\t@Test(\n\t\t\tdataProvider = \"invalidMetricDataType\",\n\t\t\texpectedExceptions = SparkplugInvalidTypeException.class)\n\tpublic void testInvalidMetricDataType(String name, MetricDataType type, Object value, MetaData metaData)\n\t\t\tthrows SparkplugException {\n\t\ttestMetricPayload(name, type, value, metaData);\n\t}\n\n\t@Test(\n\t\t\tdataProvider = \"invalidParamterTypeData\",\n\t\t\texpectedExceptions = SparkplugInvalidTypeException.class)\n\tpublic void testInvalidParameterDataType(String name, ParameterDataType type, Object value)\n\t\t\tthrows SparkplugInvalidTypeException {\n\t\tnew Parameter(name, type, value);\n\t}\n\n\t@Test(\n\t\t\tdataProvider = \"metricFieldsData\")\n\tpublic void testValidMetricPayload(Metric metric, long alias, Date timestamp, boolean isHistorical,\n\t\t\tboolean isTransient, boolean isNull) throws Exception {\n\t\t// Encode\n\t\tbyte[] bytes = new SparkplugBPayloadEncoder().getBytes(\n\t\t\t\tnew SparkplugBPayloadBuilder().setTimestamp(new Date()).addMetric(metric).createPayload(), false);\n\n\t\t// Decode and test\n\t\tSparkplugBPayload payload = new SparkplugBPayloadDecoder().buildFromByteArray(bytes, null);\n\t\tMetric decodedMetric 
= payload.getMetrics().get(0);\n\t\tassertThat(decodedMetric.getAlias()).isEqualTo(alias);\n\t\tassertThat(decodedMetric.getTimestamp()).isEqualTo(timestamp);\n\t\tassertThat(decodedMetric.isHistorical()).isEqualTo(isHistorical);\n\t\tassertThat(decodedMetric.isTransient()).isEqualTo(isTransient);\n\t\tassertThat(decodedMetric.isNull()).isEqualTo(isNull);\n\t\tSystem.out.println(\"JSON: \" + PayloadUtil.toJsonString(payload));\n\t}\n\n\t@Test(\n\t\t\tdataProvider = \"metricData\")\n\tpublic void testJsonValidation(String name, MetricDataType type, Object value, MetaData metaData)\n\t\t\tthrows SparkplugException, Exception {\n\t\tDate currentTime = new Date();\n\n\t\tSparkplugBPayload payload = new SparkplugBPayloadBuilder().setTimestamp(currentTime)\n\t\t\t\t.addMetric(new MetricBuilder(name, type, value).metaData(metaData).createMetric()).createPayload();\n\n//    \tassertThat(validator.isJsonValid(PayloadUtil.toJsonString(payload))).isTrue();\n\t}\n\n\tprivate void testMetricPayload(String name, MetricDataType type, Object value, MetaData metaData)\n\t\t\tthrows SparkplugException {\n\t\ttry {\n\t\t\tSparkplugBPayloadEncoder encoder = new SparkplugBPayloadEncoder();\n\t\t\tDate currentTime = new Date();\n\n\t\t\t// Encode\n\t\t\tbyte[] bytes = encoder.getBytes(new SparkplugBPayloadBuilder().setTimestamp(currentTime)\n\t\t\t\t\t.addMetric(new MetricBuilder(name, type, value).metaData(metaData).createMetric()).createPayload(),\n\t\t\t\t\tfalse);\n\n\t\t\t// Decode\n\t\t\tPayloadDecoder<SparkplugBPayload> decoder = new SparkplugBPayloadDecoder();\n\t\t\tSparkplugBPayload decodedPayload = decoder.buildFromByteArray(bytes, null);\n\n\t\t\tfor (Metric metric : decodedPayload.getMetrics()) {\n\t\t\t\tif (metric.getDataType().equals(MetricDataType.Template)) {\n\t\t\t\t\tSystem.out.println(\"PAYLOAD: \" + PayloadUtil.toJsonString(decodedPayload));\n\t\t\t\t\tbreak;\n\t\t\t\t}\n\t\t\t}\n\n\t\t\t// SparkplugBPayload 
checks\n\t\t\tassertThat(currentTime).isEqualTo(decodedPayload.getTimestamp());\n\t\t\tassertNull(decodedPayload.getSeq());\n\t\t\tassertThat(decodedPayload.getBody()).isNull();\n\n\t\t\t// Metric checks\n\t\t\tassertThat(1).isEqualTo(decodedPayload.getMetrics().size());\n\n\t\t\tdoMetricTests(new MetricBuilder(name, type, value).metaData(metaData).createMetric(),\n\t\t\t\t\tdecodedPayload.getMetrics().get(0));\n\t\t} catch (SparkplugException e) {\n\t\t\tthrow e;\n\t\t} catch (Exception e) {\n\t\t\te.printStackTrace();\n\t\t\tfail(e.getMessage());\n\t\t}\n\t}\n\n\tprivate void doMetricTests(Metric metric, Metric decodedMetric) throws Exception {\n\t\tString name = metric.getName();\n\t\tMetricDataType type = metric.getDataType();\n\t\tObject value = metric.getValue();\n\t\tMetaData metaData = metric.getMetaData();\n\t\tPropertySet propertySet = metric.getProperties();\n\n\t\tassertThat(name).isEqualTo(decodedMetric.getName());\n\t\tassertThat(type).isEqualTo(decodedMetric.getDataType());\n\t\tassertThat(Boolean.FALSE).isEqualTo(decodedMetric.isHistorical());\n\t\tassertThat(Boolean.FALSE).isEqualTo(decodedMetric.isTransient());\n\n\t\t// Test PropertySet\n\t\tif (propertySet != null) {\n\t\t\tMap<String, PropertyValue> map = propertySet.getPropertyMap();\n\t\t\tMap<String, PropertyValue> decodedMap = decodedMetric.getProperties().getPropertyMap();\n\t\t\tassertThat(map.size()).isEqualTo(decodedMap.size());\n\t\t\tfor (String key : map.keySet()) {\n\t\t\t\tassertThat(map.get(key)).isEqualTo(decodedMap.get(key));\n\t\t\t}\n\t\t}\n\n\t\t// Test the value\n\t\tswitch (type) {\n\t\t\tcase Bytes:\n\t\t\t\tcompareBytes((byte[]) value, (byte[]) decodedMetric.getValue());\n\t\t\t\tassertThat(decodedMetric.getMetaData()).isNull();\n\t\t\t\tbreak;\n\t\t\tcase File:\n\t\t\t\tFile someFile = (File) value;\n\t\t\t\tFile decodedFile = (File) decodedMetric.getValue();\n\t\t\t\tcompareBytes(someFile.getBytes(), 
decodedFile.getBytes());\n\t\t\t\tassertThat(someFile.getFileName()).isEqualTo(decodedFile.getFileName());\n\t\t\t\tassertThat(decodedMetric.getMetaData()).isNotNull();\n\t\t\t\tassertThat(metaData.getFileName()).isEqualTo(decodedMetric.getMetaData().getFileName());\n\t\t\t\tassertThat(metaData.isMultiPart()).isEqualTo(decodedMetric.getMetaData().isMultiPart());\n\t\t\t\tassertThat(metaData.getFileType()).isEqualTo(decodedMetric.getMetaData().getFileType());\n\t\t\t\tbreak;\n\t\t\tcase DataSet:\n\t\t\t\t// Tests for DataSets\n\t\t\t\tDataSet dataSet = (DataSet) value;\n\t\t\t\tDataSet decodedDataSet = (DataSet) decodedMetric.getValue();\n\t\t\t\tList<String> columnNames = dataSet.getColumnNames();\n\t\t\t\tList<String> decodedColumnNames = decodedDataSet.getColumnNames();\n\t\t\t\tList<DataSetDataType> types = dataSet.getTypes();\n\t\t\t\tList<DataSetDataType> decodedTypes = decodedDataSet.getTypes();\n\t\t\t\tList<Row> rows = dataSet.getRows();\n\t\t\t\tList<Row> decodedRows = decodedDataSet.getRows();\n\t\t\t\tassertThat(dataSet.getNumOfColumns()).isEqualTo(decodedDataSet.getNumOfColumns());\n\t\t\t\t// Test Columns\n\t\t\t\tfor (int i = 0; i < columnNames.size(); i++) {\n\t\t\t\t\tassertThat(columnNames.get(i)).isEqualTo(decodedColumnNames.get(i));\n\t\t\t\t}\n\t\t\t\t// Test Types\n\t\t\t\tfor (int i = 0; i < types.size(); i++) {\n\t\t\t\t\tassertThat(types.get(i)).isEqualTo(decodedTypes.get(i));\n\t\t\t\t}\n\t\t\t\t// Test Rows\n\t\t\t\tfor (int i = 0; i < rows.size(); i++) {\n\t\t\t\t\tList<Value<?>> values = rows.get(i).getValues();\n\t\t\t\t\tList<Value<?>> decodedValues = decodedRows.get(i).getValues();\n\t\t\t\t\tassertThat(values.size()).isEqualTo(decodedValues.size());\n\t\t\t\t\tfor (int j = 0; j < values.size(); j++) {\n\t\t\t\t\t\tValue<?> rowValue = values.get(j);\n\t\t\t\t\t\tValue<?> decodedValue = 
decodedValues.get(j);\n\t\t\t\t\t\tassertThat(rowValue.getType()).isEqualTo(decodedValue.getType());\n\t\t\t\t\t\tassertThat(rowValue.getValue()).isEqualTo(decodedValue.getValue());\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\tbreak;\n\t\t\tcase Template:\n\t\t\t\t// Tests for Templates\n\t\t\t\tTemplate template = (Template) value;\n\t\t\t\tTemplate decodedTemplate = (Template) decodedMetric.getValue();\n\t\t\t\tList<Parameter> parameters = template.getParameters();\n\t\t\t\tList<Parameter> decodedParameters = decodedTemplate.getParameters();\n\t\t\t\tList<Metric> metrics = template.getMetrics();\n\t\t\t\tList<Metric> decodedMetrics = decodedTemplate.getMetrics();\n\t\t\t\t// Test Parameters\n\t\t\t\tassertThat(parameters.size()).isEqualTo(decodedParameters.size());\n\t\t\t\tfor (int i = 0; i < parameters.size(); i++) {\n\t\t\t\t\tassertThat(parameters.get(i)).isEqualTo(decodedParameters.get(i));\n\t\t\t\t}\n\t\t\t\t// Test Metrics\n\t\t\t\tfor (int i = 0; i < metrics.size(); i++) {\n\t\t\t\t\tdoMetricTests(metrics.get(i), decodedMetrics.get(i));\n\t\t\t\t}\n\t\t\t\tbreak;\n\t\t\tdefault:\n\t\t\t\t// Tests for all other types\n\t\t\t\tassertThat(value).isEqualTo(decodedMetric.getValue());\n\t\t\t\tassertThat(decodedMetric.getMetaData()).isNull();\n\t\t}\n\t}\n\n\tprivate void compareBytes(byte[] bytes1, byte[] bytes2) {\n\t\tassertThat(bytes1.length).isEqualTo(bytes2.length);\n\t\tfor (int i = 0; i < bytes1.length; i++) {\n\t\t\tassertThat(bytes1[i]).isEqualTo(bytes2[i]);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/test/java/org/eclipse/tahu/util/MessageUtilTest.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.util;\n\nimport static org.assertj.core.api.Assertions.assertThat;\n\nimport java.util.Date;\n\nimport org.eclipse.tahu.message.model.Message;\nimport org.eclipse.tahu.message.model.Message.MessageBuilder;\nimport org.eclipse.tahu.message.model.MessageType;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.message.model.Topic;\nimport org.testng.annotations.DataProvider;\nimport org.testng.annotations.Test;\n\npublic class MessageUtilTest {\n\n\tprivate Date testTime;\n\n\tpublic MessageUtilTest() {\n\t\tthis.testTime = new Date();\n\t}\n\n\t@DataProvider\n\tpublic Object[][] messageData() throws Exception {\n\t\treturn new Object[][] { { new Topic(\"spBv1.0\", \"G1\", \"E1\", \"D1\", MessageType.DCMD),\n\t\t\t\tnew SparkplugBPayloadBuilder().setTimestamp(testTime)\n\t\t\t\t\t\t.addMetric(new MetricBuilder(\"T1\", MetricDataType.Int32, 12).timestamp(testTime).createMetric())\n\t\t\t\t\t\t.createPayload(),\n\t\t\t\t\"{\\\"topic\\\":{\\\"namespace\\\":\\\"spBv1.0\\\",\\\"edgeNodeDescriptor\\\":\\\"G1/E1\\\",\\\"groupId\\\":\\\"G1\\\",\\\"edgeNodeId\\\":\\\"E1\\\",\\\"deviceId\\\":\\\"D1\\\",\\\"type\\\":\\\"DCMD\\\"},\\\"payload\\\":{\\\"timestamp\\\":\"\n\t\t\t\t\t\t+ 
testTime.getTime() + \",\\\"metrics\\\":[{\\\"name\\\":\\\"T1\\\",\\\"timestamp\\\":\" + testTime.getTime()\n\t\t\t\t\t\t+ \",\\\"dataType\\\":\\\"Int32\\\",\\\"value\\\":12}]}}\" },\n\t\t\t\t{ new Topic(\"spBv1.0\", \"G1\", \"E1\", \"D2\", MessageType.DCMD),\n\t\t\t\t\t\tnew SparkplugBPayloadBuilder().setTimestamp(testTime)\n\t\t\t\t\t\t\t\t.addMetric(new MetricBuilder(\"T2\", MetricDataType.String, \"String Value\")\n\t\t\t\t\t\t\t\t\t\t.timestamp(testTime).createMetric())\n\t\t\t\t\t\t\t\t.createPayload(),\n\t\t\t\t\t\t\"{\\\"topic\\\":{\\\"namespace\\\":\\\"spBv1.0\\\",\\\"edgeNodeDescriptor\\\":\\\"G1/E1\\\",\\\"groupId\\\":\\\"G1\\\",\\\"edgeNodeId\\\":\\\"E1\\\",\\\"deviceId\\\":\\\"D2\\\",\\\"type\\\":\\\"DCMD\\\"},\\\"payload\\\":{\\\"timestamp\\\":\"\n\t\t\t\t\t\t\t\t+ testTime.getTime() + \",\\\"metrics\\\":[{\\\"name\\\":\\\"T2\\\",\\\"timestamp\\\":\"\n\t\t\t\t\t\t\t\t+ testTime.getTime() + \",\\\"dataType\\\":\\\"String\\\",\\\"value\\\":\\\"String Value\\\"}]}}\" } };\n\t}\n\n\t@Test(\n\t\t\tdataProvider = \"messageData\")\n\tpublic void testToJsonString(Topic topic, SparkplugBPayload payload, String expectedJson) throws Exception {\n\n\t\tMessage message = new MessageBuilder(topic, payload).build();\n\t\tString jsonString = MessageUtil.toJsonString(message);\n\t\tassertThat(jsonString).isEqualTo(expectedJson);\n\t}\n}\n"
  },
  {
    "path": "java/lib/core/src/test/java/org/eclipse/tahu/util/PayloadUtilTest.java",
    "content": "/********************************************************************************\n * Copyright (c) 2016-2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.util;\n\nimport static org.assertj.core.api.Assertions.assertThat;\n\nimport java.util.Arrays;\nimport java.util.Date;\nimport java.util.List;\n\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.testng.annotations.DataProvider;\nimport org.testng.annotations.Test;\n\n/**\n * Unit tests for PayloadUtil.\n */\npublic class PayloadUtilTest {\n\n\tprivate Date testTime;\n\n\tpublic PayloadUtilTest() {\n\t\tthis.testTime = new Date();\n\t}\n\n\t@DataProvider\n\tpublic Object[][] compressionData() throws Exception {\n\t\treturn new Object[][] {\n\t\t\t\t{ CompressionAlgorithm.DEFLATE,\n\t\t\t\t\t\tnew SparkplugBPayloadBuilder().setTimestamp(testTime).setSeq(0L).setUuid(\"123456789\")\n\t\t\t\t\t\t\t\t.setBody(\"Hello\".getBytes())\n\t\t\t\t\t\t\t\t.addMetric(\n\t\t\t\t\t\t\t\t\t\tnew MetricBuilder(\"TestInt\", MetricDataType.Int32, 1234567890).createMetric())\n\t\t\t\t\t\t\t\t.createPayload() },\n\t\t\t\t{ CompressionAlgorithm.GZIP,\n\t\t\t\t\t\tnew SparkplugBPayloadBuilder().setTimestamp(testTime).setSeq(0L).setUuid(\"123456789\")\n\t\t\t\t\t\t\t\t.setBody(\"Hello\".getBytes())\n\t\t\t\t\t\t\t\t.addMetric(\n\t\t\t\t\t\t\t\t\t\tnew 
MetricBuilder(\"TestInt\", MetricDataType.Int32, 1234567890).createMetric())\n\t\t\t\t\t\t\t\t.createPayload() } };\n\t}\n\n\t@Test(\n\t\t\tdataProvider = \"compressionData\")\n\tpublic void testCompression(CompressionAlgorithm algorithm, SparkplugBPayload payload) throws Exception {\n\n\t\t// Compress the payload\n\t\tSparkplugBPayload compressedPayload = PayloadUtil.compress(payload, algorithm, false);\n\n\t\t// Test that there is a body (the compressed bytes)\n\t\tassertThat(compressedPayload.getBody()).isNotNull();\n\n\t\t// Test that the sequence number is the same\n\t\tassertThat(compressedPayload.getSeq()).isEqualTo(payload.getSeq());\n\n\t\t// Test that the UUID is set correctly\n\t\tassertThat(compressedPayload.getUuid()).isEqualTo(PayloadUtil.UUID_COMPRESSED);\n\n\t\t// Decompress the payload\n\t\tSparkplugBPayload decompressedPayload = PayloadUtil.decompress(compressedPayload, null);\n\n\t\t// Test that the decompressed payload matches the original\n\t\tassertThat(decompressedPayload.getTimestamp()).isEqualTo(payload.getTimestamp());\n\t\tassertThat(decompressedPayload.getSeq()).isEqualTo(payload.getSeq());\n\t\tassertThat(decompressedPayload.getUuid()).isEqualTo(payload.getUuid());\n\t\tassertThat(Arrays.equals(decompressedPayload.getBody(), payload.getBody())).isTrue();\n\t\t// Test metrics\n\t\tList<Metric> decompressedMetrics = decompressedPayload.getMetrics();\n\t\tList<Metric> metrics = payload.getMetrics();\n\t\tassertThat(decompressedMetrics.size()).isEqualTo(metrics.size());\n\t\tfor (int i = 0; i < metrics.size(); i++) {\n\t\t\tMetric decompressedMetric = decompressedMetrics.get(i);\n\t\t\tMetric metric = metrics.get(i);\n\t\t\tassertThat(decompressedMetric.getName()).isEqualTo(metric.getName());\n\t\t\tassertThat(decompressedMetric.getValue()).isEqualTo(metric.getValue());\n\t\t\tassertThat(decompressedMetric.getDataType()).isEqualTo(metric.getDataType());\n\t\t}\n\n\t}\n}\n"
  },
  {
    "path": "java/lib/edge/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>tahu-edge</artifactId>\n  <packaging>bundle</packaging>\n  <name>Tahu Edge</name>\n\n  <dependencies>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-core</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n  </dependencies>\n\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          <skipNexusStagingDeployMojo>false</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            <id>add-third-party</id>\n            <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n     
       <configuration>\n              <useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups> (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n                  Version 2.0</licenseMerge>\n              </licenseMerges>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.apache.felix</groupId>\n        <artifactId>maven-bundle-plugin</artifactId>\n        <version>${maven.bundle.version}</version>\n        <extensions>true</extensions>\n        <configuration>\n          <instructions>\n            <Export-Package>org.eclipse.tahu.*</Export-Package>\n            <Import-Package>*;resolution:=optional</Import-Package>\n          </instructions>\n        </configuration>\n        <executions>\n          <execution>\n            <id>bundle-manifest</id>\n            <phase>process-classes</phase>\n            <goals>\n              <goal>manifest</goal>\n            </goals>\n          </execution>\n        </executions>\n      </plugin>\n    </plugins>\n  </build>\n</project>\n"
  },
  {
    "path": "java/lib/edge/src/main/java/org/eclipse/tahu/edge/EdgeClient.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.edge;\n\nimport java.util.ArrayList;\nimport java.util.Date;\nimport java.util.List;\nimport java.util.Map;\nimport java.util.Timer;\nimport java.util.TimerTask;\nimport java.util.concurrent.ConcurrentHashMap;\n\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\nimport org.eclipse.tahu.edge.api.MetricHandler;\nimport org.eclipse.tahu.exception.TahuException;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.DeviceDescriptor;\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.MessageType;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayloadMap;\nimport org.eclipse.tahu.message.model.SparkplugBPayloadMap.SparkplugBPayloadMapBuilder;\nimport org.eclipse.tahu.message.model.SparkplugMeta;\nimport org.eclipse.tahu.message.model.StatePayload;\nimport org.eclipse.tahu.message.model.Topic;\nimport org.eclipse.tahu.model.MetricMap;\nimport org.eclipse.tahu.model.MqttServerDefinition;\nimport org.eclipse.tahu.mqtt.ClientCallback;\nimport org.eclipse.tahu.mqtt.MqttClientId;\nimport org.eclipse.tahu.mqtt.MqttOperatorDefs;\nimport org.eclipse.tahu.mqtt.RandomStartupDelay;\nimport 
org.eclipse.tahu.mqtt.TahuClient;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class EdgeClient implements Runnable {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(EdgeClient.class.getName());\n\n\tprivate final List<MqttServerDefinition> mqttServerDefinitions;\n\tprivate final ClientCallback callback;\n\n\tprivate final MetricHandler metricHandler;\n\tprivate final EdgeNodeDescriptor edgeNodeDescriptor;\n\tprivate final Map<String, Boolean> deviceStatusMap;\n\tprivate final String primaryHostId;\n\tprivate final MetricMap metricMap;\n\tprivate final long rebirthDebounceDelay; // The user specified Rebirth Debounce Delay\n\tprivate final RandomStartupDelay randomStartupDelay;\n\n\tprivate TahuClient tahuClient;\n\n\tprivate final Object clientLock = new Object();\n\n\tprivate int seq;\n\n\tprivate int currentMqttClientIndex;\n\n\t// Tracking variables\n\tprivate volatile boolean stayRunning;\n\tprivate boolean connectedToPrimaryHost; // Whether or not this client is connected to Primary Host ID\n\tprivate Long lastStatePayloadTimestamp;\n\tprivate Timer primaryHostIdResponseTimer; // The Primary Host ID response timer\n\tprivate Timer rebirthDelayTimer; // A Timer used to prevent multiple rebirth requests while the timer is running\n\n\tpublic EdgeClient(MetricHandler metricHandler, EdgeNodeDescriptor edgeNodeDescriptor, List<String> deviceIds,\n\t\t\tString primaryHostId, boolean useAliases, Long rebirthDebounceDelay,\n\t\t\tList<MqttServerDefinition> mqttServerDefinitions, ClientCallback callback,\n\t\t\tRandomStartupDelay randomStartupDelay) {\n\n\t\tthis.mqttServerDefinitions = mqttServerDefinitions;\n\t\tthis.callback = callback;\n\n\t\tthis.metricHandler = metricHandler;\n\t\tthis.edgeNodeDescriptor = edgeNodeDescriptor;\n\t\tthis.deviceStatusMap = new ConcurrentHashMap<>();\n\t\tif (deviceIds != null) {\n\t\t\tfor (String deviceId : deviceIds) {\n\t\t\t\tdeviceStatusMap.put(deviceId, 
Boolean.FALSE);\n\t\t\t}\n\t\t}\n\t\tthis.primaryHostId = primaryHostId;\n\t\tthis.metricMap = useAliases ? new MetricMap() : null;\n\t\tthis.rebirthDebounceDelay = rebirthDebounceDelay;\n\t\tthis.randomStartupDelay = randomStartupDelay;\n\n\t\tstayRunning = true;\n\t\tconnectedToPrimaryHost = false;\n\t\tcurrentMqttClientIndex = -1;\n\t}\n\n\tpublic void shutdown() {\n\t\tdisconnect(true);\n\t\tstayRunning = false;\n\t\tconnectedToPrimaryHost = false;\n\t}\n\n\tpublic boolean isDisconnectedOrDisconnecting() {\n\t\treturn tahuClient.isDisconnectInProgress() || !tahuClient.isConnected();\n\t}\n\n\tpublic boolean isConnected() {\n\t\treturn tahuClient != null && tahuClient.isConnected();\n\t}\n\n\tpublic boolean isConnectedToPrimaryHost() {\n\t\treturn connectedToPrimaryHost;\n\t}\n\n\tpublic void disconnect(boolean publishLwt) {\n\t\tsynchronized (clientLock) {\n\t\t\tlogger.debug(\"{} Attempting to disconnect from target server\",\n\t\t\t\t\tmqttServerDefinitions.get(currentMqttClientIndex).getMqttClientId());\n\n\t\t\t// Cancel the primaryHostId response timer if it is running\n\t\t\tif (primaryHostIdResponseTimer != null) {\n\t\t\t\tlogger.debug(\"Cancelling the primary host ID timer\");\n\t\t\t\tprimaryHostIdResponseTimer.cancel();\n\t\t\t\tprimaryHostIdResponseTimer = null;\n\t\t\t}\n\t\t\tconnectedToPrimaryHost = false;\n\n\t\t\t// Attempt to close and clear the client if it is not already null\n\t\t\tif (tahuClient != null) {\n\t\t\t\tString connectionId = new StringBuilder().append(tahuClient.getMqttServerUrl()).append(\" :: \")\n\t\t\t\t\t\t.append(tahuClient.getClientId()).toString();\n\t\t\t\tlogger.info(\"Attempting disconnect {}\", connectionId);\n\t\t\t\ttry {\n\t\t\t\t\tif (publishLwt) {\n\t\t\t\t\t\tfor (String deviceId : deviceStatusMap.keySet()) {\n\t\t\t\t\t\t\t// Publish all of the DDEATHs since we're shutting down 
cleanly\n\t\t\t\t\t\t\tpublishDeviceDeath(deviceId);\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\ttahuClient.disconnect(50, 50, true, true, false);\n\t\t\t\t\t} else {\n\t\t\t\t\t\ttahuClient.disconnect(0, 1, false, false, false);\n\t\t\t\t\t}\n\t\t\t\t\tlogger.info(\"Successfully disconnected {}\", connectionId);\n\t\t\t\t} catch (Throwable t) {\n\t\t\t\t\tlogger.error(\"Error while attempting to close client: {}\", connectionId, t);\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\tpublic void publishNodeBirth(SparkplugBPayloadMap payload) throws SparkplugInvalidTypeException {\n\t\tif (metricMap != null) {\n\t\t\t// Aliasing is enabled so reinitialize the alias map and add the new NBIRTH metrics\n\t\t\tmetricMap.clear();\n\t\t\tfor (Metric metric : payload.getMetrics()) {\n\t\t\t\tmetric.setAlias(metricMap.addGeneratedAlias(metric.getName(), metric.getDataType()));\n\t\t\t}\n\t\t}\n\n\t\t// Ensure the 'Node Control/Rebirth' metric is present\n\t\tif (payload.getMetric(\"Node Control/Rebirth\") == null) {\n\t\t\tpayload.addMetric(new MetricBuilder(\"Node Control/Rebirth\", MetricDataType.Boolean, false).createMetric());\n\t\t}\n\n\t\tpublishSparkplugMessage(\n\t\t\t\tnew Topic(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX, edgeNodeDescriptor, MessageType.NBIRTH), payload, 0,\n\t\t\t\tfalse);\n\t}\n\n\tpublic void publishNodeData(SparkplugBPayload payload) {\n\t\tif (connectedToPrimaryHost) {\n\t\t\tif (metricMap != null) {\n\t\t\t\t// Aliasing is enabled so replace metric names with aliases\n\t\t\t\tfor (Metric metric : payload.getMetrics()) {\n\t\t\t\t\tmetric.setAlias(metricMap.getAlias(metric.getName()));\n\t\t\t\t\tmetric.setName(null);\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tpublishSparkplugMessage(\n\t\t\t\t\tnew Topic(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX, edgeNodeDescriptor, MessageType.NDATA), payload,\n\t\t\t\t\t0, false);\n\t\t}\n\t}\n\n\tpublic void publishDeviceBirth(String deviceId, SparkplugBPayload payload) {\n\t\tif (metricMap != null) {\n\t\t\t// Aliasing is enabled so add the new DBIRTH 
metrics\n\t\t\tfor (Metric metric : payload.getMetrics()) {\n\t\t\t\tmetric.setAlias(metricMap.addGeneratedAlias(metric.getName(), metric.getDataType()));\n\t\t\t}\n\t\t}\n\n\t\tpublishSparkplugMessage(new Topic(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX,\n\t\t\t\tnew DeviceDescriptor(edgeNodeDescriptor, deviceId), MessageType.DBIRTH), payload, 0, false);\n\t\tdeviceStatusMap.put(deviceId, Boolean.TRUE);\n\t}\n\n\tpublic void publishDeviceData(String deviceId, SparkplugBPayload payload) {\n\t\tif (connectedToPrimaryHost) {\n\t\t\tif (metricMap != null && deviceStatusMap.get(deviceId) != null\n\t\t\t\t\t&& deviceStatusMap.get(deviceId).booleanValue()) {\n\t\t\t\t// Aliasing is enabled so replace metric names with aliases\n\t\t\t\tfor (Metric metric : payload.getMetrics()) {\n\t\t\t\t\tmetric.setAlias(metricMap.getAlias(metric.getName()));\n\t\t\t\t\tmetric.setName(null);\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tpublishSparkplugMessage(new Topic(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX,\n\t\t\t\t\tnew DeviceDescriptor(edgeNodeDescriptor, deviceId), MessageType.DDATA), payload, 0, false);\n\t\t}\n\t}\n\n\tpublic void publishDeviceDeath(String deviceId) {\n\t\tSparkplugBPayloadMapBuilder payloadBuilder = new SparkplugBPayloadMapBuilder();\n\t\tpayloadBuilder.setTimestamp(new Date());\n\t\tpublishSparkplugMessage(new Topic(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX,\n\t\t\t\tnew DeviceDescriptor(edgeNodeDescriptor, deviceId), MessageType.DDEATH), payloadBuilder.createPayload(),\n\t\t\t\t0, false);\n\t\tdeviceStatusMap.put(deviceId, Boolean.FALSE);\n\t}\n\n\tprivate void publishSparkplugMessage(Topic topic, SparkplugBPayload payload, int qos, boolean retained) {\n\t\tsynchronized (clientLock) {\n\t\t\ttry {\n\t\t\t\tpayload.setSeq(getNextSeqNum());\n\t\t\t\tif (topic.isType(MessageType.DCMD) || topic.isType(MessageType.DDATA) || topic.isType(MessageType.NCMD)\n\t\t\t\t\t\t|| topic.isType(MessageType.NDATA)) {\n\t\t\t\t\ttahuClient.publish(topic.toString(), new 
SparkplugBPayloadEncoder().getBytes(payload, true), qos,\n\t\t\t\t\t\t\tretained);\n\t\t\t\t} else {\n\t\t\t\t\ttahuClient.publish(topic.toString(), new SparkplugBPayloadEncoder().getBytes(payload, false), qos,\n\t\t\t\t\t\t\tretained);\n\t\t\t\t}\n\t\t\t} catch (Exception e) {\n\t\t\t\tlogger.error(\"Failed to publish message on topic={}\", topic, e);\n\t\t\t}\n\t\t}\n\t}\n\n\tpublic long getNextSeqNum() {\n\t\tsynchronized (clientLock) {\n\t\t\tif (seq == 256) {\n\t\t\t\tseq = 0;\n\t\t\t}\n\t\t\tlogger.trace(\"INC: SEQ number is: {}\", seq);\n\t\t\treturn seq++;\n\t\t}\n\t}\n\n\t// Runnable API\n\t@Override\n\t/**\n\t * The main runtime thread that handles the life-cycle of MQTT sessions for Transmission Edge Nodes\n\t */\n\tpublic void run() {\n\t\tlogger.info(\"Running EdgeClient: {}\", edgeNodeDescriptor);\n\t\twhile (stayRunning) {\n\t\t\tsynchronized (clientLock) {\n\t\t\t\ttry {\n\t\t\t\t\tboolean tryToConnect = false;\n\t\t\t\t\tboolean transitionToOnline = false;\n\t\t\t\t\tif (tahuClient == null || !tahuClient.isConnected()) {\n\t\t\t\t\t\tif (!stayRunning) {\n\t\t\t\t\t\t\treturn;\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tlogger.warn(\"{} Not connected - attempting connect with isStayRunning={}\",\n\t\t\t\t\t\t\t\t\tedgeNodeDescriptor, stayRunning);\n\t\t\t\t\t\t\ttryToConnect = true;\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\n\t\t\t\t\tif (stayRunning && tryToConnect) {\n\t\t\t\t\t\tboolean connectedToServer = connectToTargetServer();\n\n\t\t\t\t\t\t// Subscribe to our data feeds... 
and publish required BIRTH Certs\n\t\t\t\t\t\tif (connectedToServer) {\n\t\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\t\t// Set transitionToOnline true\n\t\t\t\t\t\t\t\ttransitionToOnline = true;\n\n\t\t\t\t\t\t\t\t// Subscribe to all of the topics\n\t\t\t\t\t\t\t\tList<String> subTopics = new ArrayList<>();\n\t\t\t\t\t\t\t\tList<Integer> subQos = new ArrayList<>();\n\n\t\t\t\t\t\t\t\t// Subscribe to NCMD messages\n\t\t\t\t\t\t\t\tsubTopics.add(\n\t\t\t\t\t\t\t\t\t\tSparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX + \"/\" + edgeNodeDescriptor.getGroupId()\n\t\t\t\t\t\t\t\t\t\t\t\t+ \"/NCMD/\" + edgeNodeDescriptor.getEdgeNodeId());\n\t\t\t\t\t\t\t\tsubQos.add(1);\n\n\t\t\t\t\t\t\t\t// Subscribe to DCMDs\n\t\t\t\t\t\t\t\tif (deviceStatusMap != null && !deviceStatusMap.isEmpty()) {\n\t\t\t\t\t\t\t\t\tfor (String deviceId : deviceStatusMap.keySet()) {\n\t\t\t\t\t\t\t\t\t\tsubTopics.add(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX + \"/\"\n\t\t\t\t\t\t\t\t\t\t\t\t+ edgeNodeDescriptor.getGroupId() + \"/DCMD/\"\n\t\t\t\t\t\t\t\t\t\t\t\t+ edgeNodeDescriptor.getEdgeNodeId() + \"/\" + deviceId);\n\t\t\t\t\t\t\t\t\t\tsubQos.add(1);\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t// Subscribe to our own LWT\n\t\t\t\t\t\t\t\tsubTopics.add(\n\t\t\t\t\t\t\t\t\t\tSparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX + \"/\" + edgeNodeDescriptor.getGroupId()\n\t\t\t\t\t\t\t\t\t\t\t\t+ \"/NDEATH/\" + edgeNodeDescriptor.getEdgeNodeId());\n\t\t\t\t\t\t\t\tsubQos.add(1);\n\n\t\t\t\t\t\t\t\tif (primaryHostId != null && !primaryHostId.isEmpty()) {\n\t\t\t\t\t\t\t\t\tsubTopics\n\t\t\t\t\t\t\t\t\t\t\t.add(SparkplugMeta.SPARKPLUG_TOPIC_HOST_STATE_PREFIX + \"/\" + primaryHostId);\n\t\t\t\t\t\t\t\t\tsubQos.add(1);\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\tint[] grantedQos = tahuClient.subscribe(subTopics.toArray(new String[0]),\n\t\t\t\t\t\t\t\t\t\tsubQos.stream().mapToInt(i -> i).toArray());\n\t\t\t\t\t\t\t\tif (grantedQos == null || grantedQos.length == 0) {\n\t\t\t\t\t\t\t\t\tlogger.error(\"Failed to subscribe to: {}\", 
subTopics);\n\t\t\t\t\t\t\t\t\ttransitionToOnline = false;\n\t\t\t\t\t\t\t\t\tdisconnect(true);\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t} catch (TahuException e) {\n\t\t\t\t\t\t\t\tlogger.error(\"Failed to subscribe to TARGET elements\", e);\n\t\t\t\t\t\t\t\tconnectedToServer = false;\n\t\t\t\t\t\t\t\ttransitionToOnline = false;\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tdisconnect(true);\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\n\t\t\t\t\tif (transitionToOnline) {\n\t\t\t\t\t\t// In a transition to an MQTT session, publish the NBIRTH and DBIRTH messages.\n\t\t\t\t\t\ttransitionToOnline = false;\n\n\t\t\t\t\t\t// Check if a primary host ID has been specified\n\t\t\t\t\t\tif (primaryHostId != null && !primaryHostId.isEmpty()) {\n\t\t\t\t\t\t\t// Set up the primary host ID Timer\n\t\t\t\t\t\t\tlogger.info(\"Waiting for primary host {} to be online\", primaryHostId);\n\t\t\t\t\t\t\tconnectedToPrimaryHost = false;\n\t\t\t\t\t\t\t// Start a timer to run while we wait for a response\n\t\t\t\t\t\t\tif (primaryHostIdResponseTimer != null) {\n\t\t\t\t\t\t\t\tprimaryHostIdResponseTimer.cancel();\n\t\t\t\t\t\t\t\tprimaryHostIdResponseTimer = null;\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\tprimaryHostIdResponseTimer = new Timer(\n\t\t\t\t\t\t\t\t\tString.format(\"PrimaryHostIdResponseTimer-%s\", edgeNodeDescriptor.toString()));\n\t\t\t\t\t\t\tprimaryHostIdResponseTimer.schedule(new PrimaryHostIdResponseTask(), 30000);\n\n\t\t\t\t\t\t\t// Subscribe to the STATE topic for primary host ID notifications\n\t\t\t\t\t\t\tString subHostTopic = SparkplugMeta.SPARKPLUG_TOPIC_HOST_STATE_PREFIX + \"/\" + primaryHostId;\n\t\t\t\t\t\t\tint grantedQos = tahuClient.subscribe(subHostTopic, MqttOperatorDefs.QOS1);\n\t\t\t\t\t\t\tif (grantedQos != 1) {\n\t\t\t\t\t\t\t\tlogger.error(\"Failed to subscribe to '{}'\", subHostTopic);\n\t\t\t\t\t\t\t\t// Cancel the timer and disconnect\n\t\t\t\t\t\t\t\tif (primaryHostIdResponseTimer != null) 
{\n\t\t\t\t\t\t\t\t\tprimaryHostIdResponseTimer.cancel();\n\t\t\t\t\t\t\t\t\tprimaryHostIdResponseTimer = null;\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\tdisconnect(true);\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\thandleOnlineTransition(\"MAIN THREAD\");\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t} catch (Exception e) {\n\t\t\t\t\tlogger.error(\"Stay Running Exception\", e);\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\t/*\n\t * Connects to an MQTT Server\n\t *\n\t * @return true if the attempt succeeded and client is not stale\n\t */\n\tprivate boolean connectToTargetServer() {\n\t\tsynchronized (clientLock) {\n\t\t\tif (tahuClient != null && tahuClient.isConnected()) {\n\t\t\t\tlogger.debug(\"Not connecting to server, client is already connected\");\n\t\t\t\treturn false;\n\t\t\t}\n\n\t\t\tMqttClientId mqttClientId = null;\n\t\t\ttry {\n\t\t\t\tTopic deathTopic = metricHandler.getDeathTopic();\n\t\t\t\tbyte[] deathPayloadBytes = null;\n\t\t\t\ttry {\n\t\t\t\t\tdeathPayloadBytes = metricHandler.getDeathPayloadBytes();\n\t\t\t\t} catch (TahuException te) {\n\t\t\t\t\tlogger.error(\"Failed to get the NDEATH message deathTopic={} - disconnecting and BAILING\",\n\t\t\t\t\t\t\tdeathTopic);\n\t\t\t\t\tstayRunning = false;\n\t\t\t\t\tdisconnect(true);\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\t\t\t\tif (deathTopic == null || deathPayloadBytes == null) {\n\t\t\t\t\tlogger.error(\"Failed to get the NDEATH message deathTopic={} and deathPayloadBytes={}\", deathTopic,\n\t\t\t\t\t\t\tdeathPayloadBytes);\n\t\t\t\t\treturn false;\n\t\t\t\t}\n\n\t\t\t\tcurrentMqttClientIndex++;\n\t\t\t\tif (currentMqttClientIndex >= mqttServerDefinitions.size()) {\n\t\t\t\t\tcurrentMqttClientIndex = 0;\n\t\t\t\t}\n\t\t\t\tMqttServerDefinition mqttServerDefinition = mqttServerDefinitions.get(currentMqttClientIndex);\n\t\t\t\tmqttClientId = mqttServerDefinition.getMqttClientId();\n\t\t\t\ttahuClient = new TahuClient(mqttClientId, 
mqttServerDefinition.getMqttServerName(),\n\t\t\t\t\t\tmqttServerDefinition.getMqttServerUrl(), mqttServerDefinition.getUsername(),\n\t\t\t\t\t\tmqttServerDefinition.getPassword(), true, mqttServerDefinition.getKeepAliveTimeout(), callback,\n\t\t\t\t\t\trandomStartupDelay, false, null, null, false, deathTopic.toString(), deathPayloadBytes, 1,\n\t\t\t\t\t\tfalse);\n\t\t\t\ttahuClient.setTrackFirstConnection(true);\n\t\t\t\ttahuClient.setAutoReconnect(false);\n\n\t\t\t\tlogger.info(\"{} Attempting to connect\", mqttClientId);\n\t\t\t\ttahuClient.connect();\n\n\t\t\t\t// Loop for 1.5 times the keep-alive timeout + rebirthDebounceDelay, waiting for\n\t\t\t\t// the client to connect or finish attempting to connect.\n\t\t\t\tint totalTimeout =\n\t\t\t\t\t\t(int) (((int) tahuClient.getKeepAlive() * 1.5) + ((int) rebirthDebounceDelay / 1000));\n\t\t\t\tlogger.debug(\"Total timeout to connect is {} seconds\", totalTimeout);\n\t\t\t\tfor (int i = 0; i < totalTimeout; i++) {\n\t\t\t\t\tif (tahuClient.isAttemptingConnect()) {\n\t\t\t\t\t\tlogger.info(\"{} is attempting to connect\", mqttClientId);\n\t\t\t\t\t} else {\n\t\t\t\t\t\tlogger.info(\"{} is not attempting to connect\", mqttClientId);\n\t\t\t\t\t}\n\n\t\t\t\t\tif (!stayRunning) {\n\t\t\t\t\t\t// Attempt to disconnect from the target server\n\t\t\t\t\t\tlogger.debug(\"{} Shutting down\", mqttServerDefinition.getMqttClientId());\n\t\t\t\t\t\tdisconnect(true);\n\t\t\t\t\t\treturn false;\n\t\t\t\t\t} else if (tahuClient.isAttemptingConnect()) {\n\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\tThread.sleep(1000);\n\t\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\t\tlogger.error(\"Error occurred while sleeping\", e);\n\t\t\t\t\t\t}\n\t\t\t\t\t} else if (tahuClient.isConnected()) {\n\t\t\t\t\t\tlogger.info(\"{} Connected to the MQTT Server\", mqttClientId);\n\t\t\t\t\t\treturn true;\n\t\t\t\t\t} else {\n\t\t\t\t\t\tlogger.info(\"{} No longer attempting to connect\", 
mqttClientId);\n\t\t\t\t\t\tbreak;\n\t\t\t\t\t}\n\t\t\t\t}\n\n\t\t\t\t// Attempt to disconnect from the target server\n\t\t\t\tlogger.error(\"{} Failed to achieve connected state\", mqttClientId);\n\t\t\t\tdisconnect(true);\n\n\t\t\t\t// Return false to indicate a failed connect attempt\n\t\t\t\treturn false;\n\t\t\t} catch (Throwable t) {\n\t\t\t\tlogger.error(\"{} Error while attempting to connect to target server for {}\", mqttClientId,\n\t\t\t\t\t\tedgeNodeDescriptor, t);\n\t\t\t\tlogger.info(\"\\ttahuClient: {}\", tahuClient);\n\n\t\t\t\t// Attempt to disconnect from the target server\n\t\t\t\tdisconnect(true);\n\n\t\t\t\t// Return false to indicate a failed connect attempt\n\t\t\t\treturn false;\n\t\t\t}\n\t\t}\n\t}\n\n\t/*\n\t * Handles the transition to online\n\t */\n\tprivate void handleOnlineTransition(String source) {\n\t\tif (!stayRunning) {\n\t\t\tlogger.debug(\"EdgeClient is shutting down - not publishing BIRTH messages\");\n\t\t\tdisconnect(true);\n\t\t\treturn;\n\t\t} else {\n\t\t\tlogger.info(\"[{}] Handling transition to online\", source);\n\t\t}\n\n\t\ttry {\n\t\t\tlogger.debug(\"Publishing BIRTH for {}\", edgeNodeDescriptor);\n\t\t\tseq = 0;\n\t\t\tmetricHandler.publishBirthSequence();\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to publish birth - BAILING\", e);\n\t\t\tstayRunning = false;\n\t\t\tdisconnect(true);\n\t\t\treturn;\n\t\t}\n\n\t\t// This should happen after the birth sequence so DATA messages can't be published before the BIRTHs\n\t\tconnectedToPrimaryHost = true;\n\t}\n\n\t/**\n\t * Handles state messages received by the Edge Node.\n\t *\n\t * @param primaryHostId the primary host ID\n\t * @param state the state\n\t */\n\tpublic void handleStateMessage(String primaryHostId, StatePayload statePayload) {\n\t\tsynchronized (clientLock) {\n\t\t\tif (this.primaryHostId != null && this.primaryHostId.equals(primaryHostId)) {\n\t\t\t\tLong payloadTimestamp = statePayload.getTimestamp();\n\t\t\t\tif (lastStatePayloadTimestamp 
!= null && payloadTimestamp != null && payloadTimestamp.compareTo(lastStatePayloadTimestamp) < 0) {\n\t\t\t\t\tlogger.info(\"Received a stale STATE message - ignoring hostId={} and payload={}\", primaryHostId,\n\t\t\t\t\t\t\tstatePayload);\n\t\t\t\t\treturn;\n\t\t\t\t} else {\n\t\t\t\t\tlastStatePayloadTimestamp = payloadTimestamp;\n\t\t\t\t}\n\n\t\t\t\tif (statePayload.isOnline() && !connectedToPrimaryHost) {\n\t\t\t\t\tlogger.info(\"Critical/Primary app is online - cancelling disconnect timer\");\n\t\t\t\t\tif (primaryHostIdResponseTimer != null) {\n\t\t\t\t\t\tprimaryHostIdResponseTimer.cancel();\n\t\t\t\t\t\tprimaryHostIdResponseTimer = null;\n\t\t\t\t\t}\n\t\t\t\t\thandleOnlineTransition(\"STATE CHANGE\");\n\t\t\t\t} else if (!statePayload.isOnline()) {\n\t\t\t\t\tlogger.error(\"Critical/Primary app went offline - disconnecting from this server\");\n\t\t\t\t\t// Check if currently connected to primary host\n\t\t\t\t\tif (connectedToPrimaryHost) {\n\t\t\t\t\t\tconnectedToPrimaryHost = false;\n\t\t\t\t\t\tdisconnect(true);\n\t\t\t\t\t} else {\n\t\t\t\t\t\t// Disconnect cleanly and don't publish LWT\n\t\t\t\t\t\tdisconnect(false);\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\t/**\n\t * Processes an Edge Node \"Rebirth\" request.\n\t */\n\tpublic void handleRebirthRequest(boolean isRebirth) {\n\t\tsynchronized (clientLock) {\n\t\t\tif (tahuClient == null) {\n\t\t\t\tlogger.warn(\"Not processing {} request, client is null\", isRebirth ? \"Rebirth\" : \"Birth\");\n\t\t\t} else if (!stayRunning) {\n\t\t\t\tlogger.warn(\"Not processing {} request, client is shutting down\", isRebirth ? \"Rebirth\" : \"Birth\");\n\t\t\t} else if (rebirthDelayTimer == null) {\n\t\t\t\tlogger.info(\"Processing {} request\", isRebirth ? \"Rebirth\" : \"Birth\");\n\t\t\t\tseq = 0;\n\t\t\t\tmetricHandler.publishBirthSequence();\n\t\t\t\tlong randomDelay = randomStartupDelay != null ? 
randomStartupDelay.getRandomDelay() : 0L;\n\t\t\t\trebirthDelayTimer = new Timer(String.format(\"RebirthDelayTimer-%s\", edgeNodeDescriptor.toString()));\n\t\t\t\tlogger.debug(\"Setting RebirthDelayTimer to {}ms\", randomDelay + rebirthDebounceDelay);\n\t\t\t\trebirthDelayTimer.schedule(new RebirthDelayTask(), randomDelay + rebirthDebounceDelay);\n\t\t\t} else {\n\t\t\t\tlogger.info(\"Rebirth request but just issued a rebirth - ignoring\");\n\t\t\t}\n\t\t}\n\t}\n\n\tprivate class PrimaryHostIdResponseTask extends TimerTask {\n\t\tpublic void run() {\n\t\t\tlogger.error(\"Failed to validate the Primary Host is online\");\n\t\t\tdisconnect(true);\n\t\t}\n\t}\n\n\tprivate class RebirthDelayTask extends TimerTask {\n\t\tpublic void run() {\n\t\t\trebirthDelayTimer.cancel();\n\t\t\trebirthDelayTimer = null;\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/edge/src/main/java/org/eclipse/tahu/edge/EdgeNodeMetricMaps.java",
    "content": "/********************************************************************************\n * Copyright (c) 2023 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.edge;\n\nimport java.util.Map;\nimport java.util.concurrent.ConcurrentHashMap;\n\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.SparkplugDescriptor;\nimport org.eclipse.tahu.message.model.Template;\nimport org.eclipse.tahu.model.MetricDataTypeMap;\nimport org.eclipse.tahu.model.MetricMap;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class EdgeNodeMetricMaps {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(EdgeNodeMetricMaps.class.getName());\n\n\tprivate static Map<String, EdgeNodeMetricMaps> instances;\n\n\tprivate final Map<EdgeNodeDescriptor, Map<SparkplugDescriptor, MetricMap>> allEdgeNodeMetricMaps;\n\n\tprivate final Object mapLock = new Object();\n\n\tpublic static EdgeNodeMetricMaps getInstance(String agentName) {\n\t\tif (instances == null) {\n\t\t\tinstances = new ConcurrentHashMap<>();\n\t\t}\n\t\tif (instances.get(agentName) == null) {\n\t\t\tinstances.put(agentName, new EdgeNodeMetricMaps());\n\t\t}\n\t\treturn instances.get(agentName);\n\t}\n\n\tprivate EdgeNodeMetricMaps() {\n\t\tallEdgeNodeMetricMaps = new ConcurrentHashMap<>();\n\t}\n\n\tpublic void addMetric(EdgeNodeDescriptor edgeNodeDescriptor, SparkplugDescriptor sparkplugDescriptor,\n\t\t\tString metricName, Metric metric) {\n\t\tsynchronized 
(mapLock) {\n\t\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps =\n\t\t\t\t\tallEdgeNodeMetricMaps.computeIfAbsent(edgeNodeDescriptor, (k) -> new ConcurrentHashMap<>());\n\t\t\tMetricMap metricMap = edgeNodeMetricMaps.computeIfAbsent(sparkplugDescriptor, (k) -> new MetricMap());\n\t\t\tmetricMap.addAlias(metricName, metric.getAlias(), metric.getDataType());\n\n\t\t\tif (metric.getDataType() == MetricDataType.Template && metric.getValue() != null\n\t\t\t\t\t&& Template.class.isAssignableFrom(metric.getValue().getClass())) {\n\t\t\t\tTemplate template = (Template) metric.getValue();\n\t\t\t\tfor (Metric childMetric : template.getMetrics()) {\n\t\t\t\t\taddMetric(edgeNodeDescriptor, sparkplugDescriptor, metricName + \"/\" + childMetric.getName(),\n\t\t\t\t\t\t\tchildMetric);\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\tpublic void clear() {\n\t\tsynchronized (mapLock) {\n\t\t\tallEdgeNodeMetricMaps.clear();\n\t\t}\n\t}\n\n\tpublic Long getAlias(EdgeNodeDescriptor edgeNodeDescriptor, SparkplugDescriptor sparkplugDescriptor,\n\t\t\tString metricName) {\n\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps = allEdgeNodeMetricMaps.get(edgeNodeDescriptor);\n\t\tif (edgeNodeMetricMaps != null) {\n\t\t\tMetricMap metricMap = edgeNodeMetricMaps.get(sparkplugDescriptor);\n\t\t\tif (metricMap != null) {\n\t\t\t\treturn metricMap.getAlias(metricName);\n\t\t\t} else {\n\t\t\t\treturn null;\n\t\t\t}\n\t\t} else {\n\t\t\treturn null;\n\t\t}\n\t}\n\n\tpublic String getMetricName(EdgeNodeDescriptor edgeNodeDescriptor, SparkplugDescriptor sparkplugDescriptor,\n\t\t\tlong alias) {\n\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps = allEdgeNodeMetricMaps.get(edgeNodeDescriptor);\n\t\tif (edgeNodeMetricMaps != null) {\n\t\t\tMetricMap metricMap = edgeNodeMetricMaps.get(sparkplugDescriptor);\n\t\t\tif (metricMap != null) {\n\t\t\t\treturn metricMap.getMetricName(alias);\n\t\t\t} else {\n\t\t\t\treturn null;\n\t\t\t}\n\t\t} else {\n\t\t\treturn 
null;\n\t\t}\n\t}\n\n\tpublic boolean aliasExists(EdgeNodeDescriptor edgeNodeDescriptor, SparkplugDescriptor sparkplugDescriptor,\n\t\t\tlong alias) {\n\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps = allEdgeNodeMetricMaps.get(edgeNodeDescriptor);\n\t\tif (edgeNodeMetricMaps != null && edgeNodeMetricMaps.get(sparkplugDescriptor) != null) {\n\t\t\tMetricMap metricMap = edgeNodeMetricMaps.get(sparkplugDescriptor);\n\t\t\tif (metricMap != null && metricMap.getMetricName(alias) != null) {\n\t\t\t\treturn true;\n\t\t\t} else {\n\t\t\t\treturn false;\n\t\t\t}\n\t\t} else {\n\t\t\treturn false;\n\t\t}\n\t}\n\n\tpublic MetricDataTypeMap getMetricDataTypeMap(EdgeNodeDescriptor edgeNodeDescriptor,\n\t\t\tSparkplugDescriptor sparkplugDescriptor) {\n\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps =\n\t\t\t\tallEdgeNodeMetricMaps.computeIfAbsent(edgeNodeDescriptor, (k) -> new ConcurrentHashMap<>());\n\t\tedgeNodeMetricMaps.computeIfAbsent(sparkplugDescriptor, (k) -> new MetricMap());\n\t\treturn edgeNodeMetricMaps.get(sparkplugDescriptor).getMetricDataTypeMap();\n\t}\n\n\tpublic MetricDataType getDataType(EdgeNodeDescriptor edgeNodeDescriptor, SparkplugDescriptor sparkplugDescriptor,\n\t\t\tString metricName) {\n\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps = allEdgeNodeMetricMaps.get(edgeNodeDescriptor);\n\t\tif (edgeNodeMetricMaps != null && edgeNodeMetricMaps.get(sparkplugDescriptor) != null) {\n\t\t\treturn edgeNodeMetricMaps.get(sparkplugDescriptor).getMetricDataType(metricName);\n\t\t} else {\n\t\t\treturn null;\n\t\t}\n\t}\n\n\tpublic MetricDataType getDataType(EdgeNodeDescriptor edgeNodeDescriptor, SparkplugDescriptor sparkplugDescriptor,\n\t\t\tLong alias) {\n\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps = allEdgeNodeMetricMaps.get(edgeNodeDescriptor);\n\t\tif (edgeNodeMetricMaps != null && edgeNodeMetricMaps.get(sparkplugDescriptor) != null) {\n\t\t\treturn 
edgeNodeMetricMaps.get(sparkplugDescriptor).getMetricDataType(alias);\n\t\t} else {\n\t\t\treturn null;\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/edge/src/main/java/org/eclipse/tahu/edge/api/MetricHandler.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.edge.api;\n\nimport org.eclipse.tahu.message.model.SparkplugDescriptor;\nimport org.eclipse.tahu.message.model.Topic;\n\npublic interface MetricHandler {\n\n\t/**\n\t * Returns the {@link String} representing the LWT topic to register in the MQTT CONNECT packet\n\t * \n\t * @return the {@link String} representing the LWT topic to register in the MQTT CONNECT packet\n\t */\n\tpublic Topic getDeathTopic();\n\n\t/**\n\t * The {@link byte[]} representing the LWT bytes to register in the MQTT CONNECT packet\n\t * \n\t * @return a {@link byte[]} representing the LWT bytes to register in the MQTT CONNECT packet\n\t * @throws Exception\n\t */\n\tpublic byte[] getDeathPayloadBytes() throws Exception;\n\n\t/**\n\t * Publishes the required birth message(s) for an Edge Node\n\t */\n\tpublic void publishBirthSequence();\n\n\t/**\n\t * Checks whether or not this MetricHandler has a Metric by a given name for a given {@link SparkplugDesciptor}\n\t * \n\t * @param sparkplugDescriptor the {@link SparkplugDescriptor} to scope the metricName search to\n\t * @param metricName the metricName to look for\n\t * @return\n\t */\n\tpublic boolean hasMetric(SparkplugDescriptor sparkplugDescriptor, String metricName);\n}\n"
  },
  {
    "path": "java/lib/host/pom.xml",
    "content": "<!--/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <parent>\n    <groupId>org.eclipse.tahu</groupId>\n    <artifactId>tahu</artifactId>\n    <version>1.0.7</version>\n    <relativePath>../../pom.xml</relativePath>\n  </parent>\n\n  <artifactId>tahu-host</artifactId>\n  <packaging>bundle</packaging>\n  <name>Tahu Host</name>\n\n  <dependencies>\n    <dependency>\n      <groupId>org.eclipse.tahu</groupId>\n      <artifactId>tahu-core</artifactId>\n      <version>${project.version}</version>\n    </dependency>\n  </dependencies>\n\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <configuration>\n          <skipNexusStagingDeployMojo>false</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n      <plugin>\n        <groupId>org.codehaus.mojo</groupId>\n        <artifactId>license-maven-plugin</artifactId>\n        <version>1.8</version>\n        <executions>\n          <execution>\n            <id>add-third-party</id>\n            <phase>package</phase>\n            <goals>\n              <goal>add-third-party</goal>\n              <goal>download-licenses</goal>\n            </goals>\n     
       <configuration>\n              <useMissingFile>true</useMissingFile>\n              <excludedScopes>test</excludedScopes>\n              <excludedGroups> (org.eclipse.tahu*)\n              </excludedGroups>\n              <licenseMerges>\n                <licenseMerge>The Apache Software License, Version\n                  2.0|Apache License, Version 2.0|Apache Public License\n                  2.0|Apache License 2.0|Apache Software License -\n                  Version 2.0</licenseMerge>\n              </licenseMerges>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <groupId>org.apache.felix</groupId>\n        <artifactId>maven-bundle-plugin</artifactId>\n        <version>${maven.bundle.version}</version>\n        <extensions>true</extensions>\n        <configuration>\n          <instructions>\n            <Export-Package>org.eclipse.tahu.*</Export-Package>\n            <Import-Package>*;resolution:=optional</Import-Package>\n          </instructions>\n        </configuration>\n        <executions>\n          <execution>\n            <id>bundle-manifest</id>\n            <phase>process-classes</phase>\n            <goals>\n              <goal>manifest</goal>\n            </goals>\n          </execution>\n        </executions>\n      </plugin>\n    </plugins>\n  </build>\n</project>\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/CommandPublisher.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host;\n\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.Topic;\nimport org.eclipse.tahu.mqtt.MqttServerName;\n\npublic interface CommandPublisher {\n\n\tpublic void publishCommand(Topic topic, SparkplugBPayload payload) throws Exception;\n\n\tpublic void publishCommand(MqttServerName mqttServerName, Topic topic, SparkplugBPayload payload) throws Exception;\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/HostApplication.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host;\n\nimport java.util.HashMap;\nimport java.util.List;\nimport java.util.Map;\n\nimport org.eclipse.tahu.exception.TahuErrorCode;\nimport org.eclipse.tahu.exception.TahuException;\nimport org.eclipse.tahu.host.api.HostApplicationEventHandler;\nimport org.eclipse.tahu.host.seq.SequenceReorderManager;\nimport org.eclipse.tahu.message.PayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadEncoder;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugMeta;\nimport org.eclipse.tahu.message.model.Topic;\nimport org.eclipse.tahu.model.MqttServerDefinition;\nimport org.eclipse.tahu.mqtt.MqttOperatorDefs;\nimport org.eclipse.tahu.mqtt.MqttServerName;\nimport org.eclipse.tahu.mqtt.RandomStartupDelay;\nimport org.eclipse.tahu.mqtt.TahuClient;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class HostApplication implements CommandPublisher {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(HostApplication.class.getName());\n\n\tprivate static int MAX_INFLIGHT_MESSAGES = 500;\n\n\tprivate final String hostId;\n\tprivate final RandomStartupDelay randomStartupDelay;\n\tprivate final String stateTopic;\n\tprivate final List<String> sparkplugSubscriptons;\n\tprivate final TahuHostCallback tahuHostCallback;\n\tprivate final List<MqttServerDefinition> mqttServerDefinitions;\n\tprivate final Map<MqttServerName, TahuClient> 
tahuClients = new HashMap<>();\n\n\tpublic HostApplication(HostApplicationEventHandler eventHandler, String hostId, List<String> sparkplugSubscriptons,\n\t\t\tList<MqttServerDefinition> mqttServerDefinitions, RandomStartupDelay randomStartupDelay,\n\t\t\tPayloadDecoder<SparkplugBPayload> payloadDecoder) {\n\t\tlogger.info(\"Creating the Host Application\");\n\n\t\tif (hostId != null) {\n\t\t\tthis.hostId = hostId;\n\t\t\tthis.stateTopic = SparkplugMeta.SPARKPLUG_TOPIC_HOST_STATE_PREFIX + \"/\" + hostId;\n\t\t} else {\n\t\t\tthis.hostId = null;\n\t\t\tthis.stateTopic = null;\n\t\t}\n\t\tthis.sparkplugSubscriptons = sparkplugSubscriptons;\n\t\tthis.mqttServerDefinitions = mqttServerDefinitions;\n\t\tthis.randomStartupDelay = randomStartupDelay;\n\n\t\tSequenceReorderManager sequenceReorderManager = SequenceReorderManager.getInstance();\n\t\tsequenceReorderManager.init(eventHandler, this, payloadDecoder, 5000L);\n\t\tthis.tahuHostCallback =\n\t\t\t\tnew TahuHostCallback(eventHandler, this, sequenceReorderManager, payloadDecoder, hostId);\n\t}\n\n\tpublic HostApplication(HostApplicationEventHandler eventHandler, String hostId, List<String> sparkplugSubscriptons,\n\t\t\tTahuHostCallback tahuHostCallback, Map<MqttServerName, TahuClient> tahuClients,\n\t\t\tRandomStartupDelay randomStartupDelay) {\n\t\tlogger.info(\"Creating the Host Application\");\n\n\t\tif (hostId != null && !hostId.trim().isEmpty()) {\n\t\t\tthis.hostId = hostId;\n\t\t\tthis.stateTopic = SparkplugMeta.SPARKPLUG_TOPIC_HOST_STATE_PREFIX + \"/\" + hostId;\n\t\t} else {\n\t\t\tthis.hostId = null;\n\t\t\tthis.stateTopic = null;\n\t\t}\n\n\t\tthis.sparkplugSubscriptons = sparkplugSubscriptons;\n\t\tthis.tahuHostCallback = tahuHostCallback;\n\t\tthis.mqttServerDefinitions = null;\n\t\tthis.tahuClients.putAll(tahuClients);\n\t\tthis.randomStartupDelay = randomStartupDelay;\n\t}\n\n\tpublic void start() {\n\t\tif (mqttServerDefinitions != null) {\n\t\t\tfor (MqttServerDefinition mqttServerDefinition : 
mqttServerDefinitions) {\n\t\t\t\tlogger.debug(\"Starting up the MQTT Client to {}\", mqttServerDefinition.getMqttServerName());\n\t\t\t\tTahuClient tahuClient = tahuClients.get(mqttServerDefinition.getMqttServerName());\n\t\t\t\tif (tahuClient == null) {\n\t\t\t\t\ttahuClient = new TahuClient(mqttServerDefinition.getMqttClientId(),\n\t\t\t\t\t\t\tmqttServerDefinition.getMqttServerName(), mqttServerDefinition.getMqttServerUrl(),\n\t\t\t\t\t\t\tmqttServerDefinition.getUsername(), mqttServerDefinition.getPassword(), true,\n\t\t\t\t\t\t\tmqttServerDefinition.getKeepAliveTimeout(), tahuHostCallback, randomStartupDelay, true,\n\t\t\t\t\t\t\tstateTopic, null, true, stateTopic, null, MqttOperatorDefs.QOS1, true);\n\t\t\t\t}\n\n\t\t\t\t// Add it to the Map\n\t\t\t\ttahuClients.put(mqttServerDefinition.getMqttServerName(), tahuClient);\n\t\t\t}\n\t\t}\n\n\t\t// Start the clients\n\t\tfor (TahuClient client : tahuClients.values()) {\n\t\t\tstartClient(client);\n\t\t}\n\n\t\tlogger.debug(\"MQTT Clients Started. Connection and subscriptions not verified yet\");\n\t}\n\n\tprivate void startClient(TahuClient tahuClient) {\n\t\ttahuClient.setMaxInflightMessages(MAX_INFLIGHT_MESSAGES);\n\t\ttahuHostCallback.setMqttClients(tahuClients);\n\n\t\ttry {\n\t\t\ttahuClient.setAutoReconnect(true);\n\t\t\ttahuClient.connect();\n\n\t\t\t// Subscribe to our own STATE topic\n\t\t\tif (stateTopic != null) {\n\t\t\t\tlogger.debug(\"PrimaryHostId is set. 
Subscribing on {}\", stateTopic);\n\t\t\t\tint grantedQos = tahuClient.subscribe(stateTopic, MqttOperatorDefs.QOS1);\n\t\t\t\tif (grantedQos != 1) {\n\t\t\t\t\tlogger.error(\"Failed to subscribe to '{}'\", stateTopic);\n\t\t\t\t\treturn;\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tfor (String subscriptionTopic : sparkplugSubscriptons) {\n\t\t\t\t// Subscribe to the Sparkplug namespace(s)\n\t\t\t\tlogger.debug(\"Subscribing on {}\", subscriptionTopic);\n\t\t\t\tint grantedQos = tahuClient.subscribe(subscriptionTopic, MqttOperatorDefs.QOS0);\n\t\t\t\tif (grantedQos != 0) {\n\t\t\t\t\tlogger.error(\"Failed to subscribe to '{}'\", subscriptionTopic);\n\t\t\t\t\treturn;\n\t\t\t\t}\n\t\t\t}\n\n\t\t\t// Pub\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to start client {} connecting to {}\", tahuClient.getClientId(),\n\t\t\t\t\ttahuClient.getMqttServerUrl(), e);\n\t\t\treturn;\n\t\t}\n\t}\n\n\tpublic void shutdown() {\n\t\tfor (TahuClient tahuClient : tahuClients.values()) {\n\t\t\tif (tahuClient != null) {\n\t\t\t\tString connectionId = new StringBuilder().append(tahuClient.getMqttServerUrl()).append(\" :: \")\n\t\t\t\t\t\t.append(tahuClient.getClientId()).toString();\n\t\t\t\ttry {\n\t\t\t\t\t// Unsubscribe\n\t\t\t\t\t// removeMqttClientSubscriptions(tahuClient, unsubscribe);\n\n\t\t\t\t\tif (stateTopic != null) {\n\t\t\t\t\t\t// Clean up STATE subscriptions\n\t\t\t\t\t\tlogger.debug(\"Unsubscribing from {}\", stateTopic);\n\t\t\t\t\t\ttahuClient.unsubscribe(stateTopic);\n\t\t\t\t\t}\n\n\t\t\t\t\tfor (String subscriptionTopic : sparkplugSubscriptons) {\n\t\t\t\t\t\t// Clean up the Sparkplug subscription(s)\n\t\t\t\t\t\tlogger.debug(\"Unsubscribing from {}\", subscriptionTopic);\n\t\t\t\t\t\ttahuClient.unsubscribe(subscriptionTopic);\n\t\t\t\t\t}\n\n\t\t\t\t\t// Shut down the client after the MQTT client is disconnected to prevent RejectedExecutionExceptions\n\t\t\t\t\ttahuHostCallback.shutdown();\n\n\t\t\t\t\t// Shut down the MQTT 
client\n\t\t\t\t\ttahuClient.setAutoReconnect(false);\n\t\t\t\t\tlogger.info(\"Attempting disconnect {}\", connectionId);\n\t\t\t\t\ttahuClient.disconnect(100, 100, true, true);\n\t\t\t\t\tlogger.info(\"Successfully disconnected {}\", connectionId);\n\n\t\t\t\t\t// Set the Edge Nodes associated with this client offline\n//\t\t\t\t\tedgeNodeManager.setAllEdgeNodesOffline(mqttServerName);\n\t\t\t\t} catch (Exception e) {\n\t\t\t\t\tlogger.error(\"Error shutting down {}\", connectionId, e);\n\t\t\t\t} finally {\n\t\t\t\t\ttahuClient = null;\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tlogger.trace(\"Cannot shutdown null client\");\n\t\t\t}\n\t\t}\n\t}\n\n\tpublic String getHostId() {\n\t\treturn hostId;\n\t}\n\n\t@Override\n\tpublic void publishCommand(Topic topic, SparkplugBPayload payload) throws Exception {\n\t\tfor (MqttServerName mqttServerName : tahuClients.keySet()) {\n\t\t\tpublishCommand(mqttServerName, topic, payload);\n\t\t}\n\t}\n\n\t@Override\n\tpublic void publishCommand(MqttServerName mqttServerName, Topic topic, SparkplugBPayload payload) throws Exception {\n\t\tTahuClient tahuClient = tahuClients.get(mqttServerName);\n\t\tif (tahuClient != null && tahuClient.isConnected()) {\n\t\t\tSparkplugBPayloadEncoder encoder = new SparkplugBPayloadEncoder();\n\t\t\tbyte[] bytes = encoder.getBytes(payload, true);\n\t\t\ttahuClient.publish(topic.toString(), bytes, MqttOperatorDefs.QOS0, false);\n\t\t} else {\n\t\t\tthrow new TahuException(TahuErrorCode.INITIALIZATION_ERROR,\n\t\t\t\t\t\"The Tahu Client is not connected - not publishing command on topic=\" + topic);\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/TahuHostCallback.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host;\n\nimport java.util.Map;\nimport java.util.Map.Entry;\nimport java.util.UUID;\nimport java.util.concurrent.LinkedBlockingQueue;\nimport java.util.concurrent.ThreadFactory;\nimport java.util.concurrent.ThreadPoolExecutor;\nimport java.util.concurrent.TimeUnit;\n\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.tahu.host.api.HostApplicationEventHandler;\nimport org.eclipse.tahu.host.seq.SequenceReorderManager;\nimport org.eclipse.tahu.message.PayloadDecoder;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugMeta;\nimport org.eclipse.tahu.message.model.StatePayload;\nimport org.eclipse.tahu.mqtt.ClientCallback;\nimport org.eclipse.tahu.mqtt.MqttClientId;\nimport org.eclipse.tahu.mqtt.MqttServerName;\nimport org.eclipse.tahu.mqtt.MqttServerUrl;\nimport org.eclipse.tahu.mqtt.TahuClient;\nimport org.eclipse.tahu.util.TopicUtil;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\nimport com.fasterxml.jackson.databind.ObjectMapper;\n\npublic class TahuHostCallback implements ClientCallback {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(TahuHostCallback.class.getName());\n\n\tprivate static final int DEFAULT_NUM_OF_THREADS = 100;\n\n\tprivate final ThreadPoolExecutor[] sparkplugBExecutors;\n\n\tprivate Map<MqttServerName, TahuClient> tahuClients;\n\n\tprivate final boolean 
enableSequenceReordering;\n\n\tprivate final HostApplicationEventHandler eventHandler;\n\n\tprivate final CommandPublisher commandPublisher;\n\n\tprivate final SequenceReorderManager sequenceReorderManager;\n\n\tprivate final PayloadDecoder<SparkplugBPayload> payloadDecoder;\n\n\tprivate final String hostId;\n\n\tpublic TahuHostCallback(HostApplicationEventHandler eventHandler, CommandPublisher commandPublisher,\n\t\t\tSequenceReorderManager sequenceReorderManager, PayloadDecoder<SparkplugBPayload> payloadDecoder,\n\t\t\tString hostId) {\n\t\tthis.eventHandler = eventHandler;\n\t\tthis.commandPublisher = commandPublisher;\n\t\tif (sequenceReorderManager != null) {\n\t\t\tthis.enableSequenceReordering = true;\n\t\t\tthis.sequenceReorderManager = sequenceReorderManager;\n\t\t\tthis.sequenceReorderManager.start();\n\t\t} else {\n\t\t\tthis.enableSequenceReordering = false;\n\t\t\tthis.sequenceReorderManager = null;\n\t\t}\n\t\tthis.payloadDecoder = payloadDecoder;\n\t\tthis.hostId = hostId;\n\n\t\tthis.sparkplugBExecutors = new ThreadPoolExecutor[DEFAULT_NUM_OF_THREADS];\n\t\tfor (int i = 0; i < DEFAULT_NUM_OF_THREADS; i++) {\n\t\t\tfinal String uuid = UUID.randomUUID().toString().substring(0, 8);\n\t\t\tthis.sparkplugBExecutors[i] = new ThreadPoolExecutor(1, 1, 0L, TimeUnit.MILLISECONDS,\n\t\t\t\t\tnew LinkedBlockingQueue<Runnable>(), new ThreadFactory() {\n\t\t\t\t\t\t@Override\n\t\t\t\t\t\tpublic Thread newThread(Runnable r) {\n\t\t\t\t\t\t\tfinal String threadName = String.format(\"%s-%s\", \"TahuHostCallback\", uuid);\n\t\t\t\t\t\t\treturn new Thread(r, threadName);\n\t\t\t\t\t\t}\n\t\t\t\t\t});\n\t\t}\n\t}\n\n\t@Override\n\tpublic void shutdown() {\n\t\tlogger.info(\"Shutting down TahuHostCallback\");\n\t\tfor (int i = 0; i < DEFAULT_NUM_OF_THREADS; i++) {\n\t\t\ttry {\n\t\t\t\tsparkplugBExecutors[i].shutdownNow();\n\t\t\t} catch (Exception e) {\n\t\t\t\tlogger.error(\"Failed to shutdown executor\", e);\n\t\t\t}\n\t\t}\n\t}\n\n\tpublic void 
setMqttClients(Map<MqttServerName, TahuClient> tahuClients) {\n\t\tthis.tahuClients = tahuClients;\n\t}\n\n\t@Override\n\tpublic void messageArrived(MqttServerName server, MqttServerUrl url, MqttClientId clientId, String topic,\n\t\t\tMqttMessage message) {\n\t\ttry {\n\t\t\t// What sent the message - and what type?\n\n\t\t\tTahuClient client = tahuClients.get(server);\n\t\t\tif (client == null) {\n\t\t\t\tlogger.error(\"Message arrived on topic {} from unknown client {} on {}\", topic, clientId, server);\n\n\t\t\t\t// Debug messages\n\t\t\t\tfor (Entry<MqttServerName, TahuClient> entry : tahuClients.entrySet()) {\n\t\t\t\t\tlogger.error(\"Failed - but found: {}\", entry.getKey());\n\t\t\t\t}\n\n\t\t\t\treturn;\n\t\t\t} else {\n\t\t\t\tlogger.trace(\"Message arrived on topic {} from client {}\", topic, clientId);\n\t\t\t}\n\n\t\t\tif (topic == null) {\n\t\t\t\t// Should never get here since we should only get messages on topics we subscribe to\n\t\t\t\tlogger.error(\"Invalid null topic\");\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\tfinal String[] splitTopic = TopicUtil.getSplitTopic(topic);\n\n\t\t\tfinal long arrivedTime = System.nanoTime();\n\t\t\tif (topic.startsWith(SparkplugMeta.SPARKPLUG_B_TOPIC_PREFIX)) {\n\t\t\t\tif (splitTopic.length == 3 && splitTopic[1].equals(\"STATE\")) {\n\t\t\t\t\t// This is a STATE message - handle as needed\n\t\t\t\t\tObjectMapper mapper = new ObjectMapper();\n\t\t\t\t\tStatePayload statePayload = mapper.readValue(new String(message.getPayload()), StatePayload.class);\n\t\t\t\t\tif (hostId != null && !hostId.trim().isEmpty() && splitTopic[2].equals(hostId)\n\t\t\t\t\t\t\t&& !statePayload.isOnline()) {\n\t\t\t\t\t\t// Make sure this isn't an OFFLINE message\n\t\t\t\t\t\tlogger.info(\n\t\t\t\t\t\t\t\t\"This is an offline STATE message from {} - correcting with new online STATE message\",\n\t\t\t\t\t\t\t\tsplitTopic[2]);\n\t\t\t\t\t\tclient.publishBirthMessage();\n\t\t\t\t\t}\n\t\t\t\t} else {\n\t\t\t\t\t// Get the proper 
executor\n\t\t\t\t\tString key = splitTopic[1] + \"/\" + splitTopic[3];\n\t\t\t\t\tint index = getThreadPoolExecutorIndex(key, DEFAULT_NUM_OF_THREADS);\n\t\t\t\t\tlogger.debug(\"Adding Sparkplug B message to ThreadPoolExecutor {} :: {}\", index,\n\t\t\t\t\t\t\tsparkplugBExecutors[index].getQueue().size());\n\t\t\t\t\tThreadPoolExecutor executor = sparkplugBExecutors[index];\n\n\t\t\t\t\tif (enableSequenceReordering) {\n\t\t\t\t\t\t// Sequence reordering is required\n\t\t\t\t\t\tlogger.trace(\"Sending the message on {} to the SequenceReorderManager\", topic);\n\t\t\t\t\t\tsequenceReorderManager.handlePayload(this, executor, topic, splitTopic, message, server,\n\t\t\t\t\t\t\t\tclientId, arrivedTime);\n\t\t\t\t\t} else {\n\t\t\t\t\t\texecutor.execute(() -> {\n\t\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\t\t// No sequence reordering required - just push the message through and handle the\n\t\t\t\t\t\t\t\t// Sparkplug B Payload\n\t\t\t\t\t\t\t\tlogger.trace(\"Sending the message on {} directly to the TahuPayloadHandler\", topic);\n\t\t\t\t\t\t\t\tnew TahuPayloadHandler(eventHandler, commandPublisher, payloadDecoder)\n\t\t\t\t\t\t\t\t\t\t.handlePayload(topic, splitTopic, message, server, clientId);\n\t\t\t\t\t\t\t} catch (Throwable t) {\n\t\t\t\t\t\t\t\tlogger.error(\"Failed to handle Sparkplug B message on topic {}\", topic, t);\n\t\t\t\t\t\t\t} finally {\n\t\t\t\t\t\t\t\t// Update the message latency\n\t\t\t\t\t\t\t\tlong latency = System.nanoTime() - arrivedTime;\n\t\t\t\t\t\t\t\tif (logger.isTraceEnabled()) {\n\t\t\t\t\t\t\t\t\tlogger.trace(\"Updating message processing latency {}\", latency);\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t});\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tlogger.debug(\"Received non-Sparkplug message on topic {}\", topic);\n\t\t\t}\n\t\t} catch (Throwable t) {\n\t\t\tlogger.error(\"Failed to handle message on topic {}\", topic, t);\n\t\t}\n\t}\n\n\t/*\n\t * Returns an index for the supplied key and number of ThreadPoolExecutors.\n\t 
*/\n\tprivate int getThreadPoolExecutorIndex(String key, int numOfThreadPoolExecutors) {\n\t\treturn Math.abs(key.hashCode() % numOfThreadPoolExecutors);\n\t}\n\n\t@Override\n\tpublic void connectionLost(MqttServerName mqttServerName, MqttServerUrl url, MqttClientId clientId,\n\t\t\tThrowable cause) {\n\t\tlogger.warn(\"Connection Lost to - {} :: {} :: {}\", mqttServerName, url, clientId);\n\t\teventHandler.onDisconnect();\n\n\t\tif (cause != null) {\n\t\t\t// We don't need to see all of the connection lost callbacks for clients\n\t\t\tlogger.error(\"Connection lost due to - {}\", cause.getMessage(), cause);\n\t\t}\n\n\t\tlogger.info(\"Clearing all connection counts for this MQTT Server\");\n\t\tTahuClient tahuClient = tahuClients.get(mqttServerName);\n\t\ttahuClient.clearConnectionCount();\n//\t\tedgeNodeManager.disconnectAllEdgeNodes(tahuClient);\n\n\t\t// Update the OFFLINE Engine Info tag for the client\n//\t\tupdateEngineInfoDateTag(mqttServerName, DATE_OFFLINE);\n\n\t\t// Update the Primary Host State tag to OFFLINE\n\t\tString lwtTopic = tahuClient.getLwtTopic();\n\t\tif (lwtTopic != null && lwtTopic.startsWith(SparkplugMeta.SPARKPLUG_TOPIC_HOST_STATE_PREFIX)) {\n\t\t\tString primaryHostId =\n\t\t\t\t\tlwtTopic.substring(SparkplugMeta.SPARKPLUG_TOPIC_HOST_STATE_PREFIX.length() + 1);\n\t\t\tlogger.debug(\"Setting Primary Host ID info tag for {} to offline\", primaryHostId);\n//\t\t\tString clientTagPath = join(EngineGwHook.MQTT_CLIENTS_PATH, mqttServerName, \"/\",\n//\t\t\t\t\tEngineTag.PRIMARY_HOST_STATE, \"/\", primaryHostId, \"/\");\n//\t\t\tModuleTagUtils.updateModuleTagValue(EngineSettings.getInstance().getContext(),\n//\t\t\t\t\tEngineSettings.getInstance().getManagedTagProvider(),\n//\t\t\t\t\tEngineSettings.getInstance().getManagedTagProviderName(), join(clientTagPath, \"Payload\"),\n//\t\t\t\t\tDataType.String, \"OFFLINE\");\n\t\t}\n\n\t\tif (tahuClient.getAutoReconnect()) 
{\n\t\t\ttahuClient.connect();\n\t\t}\n\t}\n\n\t@Override\n\tpublic void connectComplete(boolean reconnect, MqttServerName server, MqttServerUrl url, MqttClientId clientId) {\n//\t\t// Update the ONLINE Engine Info tag for the client\n//\t\tupdateEngineInfoDateTag(server, DATE_ONLINE);\n\t\teventHandler.onConnect();\n\t}\n\n\tprivate void updateEngineInfoDateTag(MqttServerName server, String tagName) {\n//\t\tModuleTagUtils.updateModuleTagValue(EngineSettings.getInstance().getContext(),\n//\t\t\t\tEngineSettings.getInstance().getManagedTagProvider(),\n//\t\t\t\tEngineSettings.getInstance().getManagedTagProviderName(),\n//\t\t\t\tjoin(EngineGwHook.MQTT_CLIENTS_PATH, server, \"/\", tagName), DataType.DateTime, new Date());\n\t}\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/TahuPayloadHandler.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host;\n\nimport java.util.Date;\nimport java.util.Iterator;\nimport java.util.Map;\nimport java.util.Set;\nimport java.util.Timer;\nimport java.util.TimerTask;\nimport java.util.concurrent.ConcurrentHashMap;\n\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.tahu.exception.TahuErrorCode;\nimport org.eclipse.tahu.exception.TahuException;\nimport org.eclipse.tahu.host.api.HostApplicationEventHandler;\nimport org.eclipse.tahu.host.manager.EdgeNodeManager;\nimport org.eclipse.tahu.host.manager.MetricManager;\nimport org.eclipse.tahu.host.manager.SparkplugDevice;\nimport org.eclipse.tahu.host.manager.SparkplugEdgeNode;\nimport org.eclipse.tahu.host.model.HostApplicationMetricMap;\nimport org.eclipse.tahu.host.model.HostMetric;\nimport org.eclipse.tahu.host.model.MessageContext;\nimport org.eclipse.tahu.message.PayloadDecoder;\nimport org.eclipse.tahu.message.model.DeviceDescriptor;\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.MessageType;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.Metric.MetricBuilder;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.SparkplugBPayload.SparkplugBPayloadBuilder;\nimport org.eclipse.tahu.message.model.SparkplugDescriptor;\nimport 
org.eclipse.tahu.message.model.Topic;\nimport org.eclipse.tahu.mqtt.MqttClientId;\nimport org.eclipse.tahu.mqtt.MqttServerName;\nimport org.eclipse.tahu.util.SparkplugUtil;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class TahuPayloadHandler {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(TahuPayloadHandler.class.getName());\n\n\tprivate static Map<EdgeNodeDescriptor, Timer> rebirthTimers = new ConcurrentHashMap<>();\n\n\tprivate final HostApplicationEventHandler eventHandler;\n\n\tprivate final CommandPublisher commandPublisher;\n\n\tprivate final PayloadDecoder<SparkplugBPayload> payloadDecoder;\n\n\tpublic TahuPayloadHandler(HostApplicationEventHandler eventHandler, CommandPublisher commandPublisher,\n\t\t\tPayloadDecoder<SparkplugBPayload> payloadDecoder) {\n\t\tthis.eventHandler = eventHandler;\n\t\tthis.commandPublisher = commandPublisher;\n\t\tthis.payloadDecoder = payloadDecoder;\n\t}\n\n\tpublic void handlePayload(String topicString, String[] splitTopic, MqttMessage message,\n\t\t\tMqttServerName mqttServerName, MqttClientId hostAppMqttClientId) {\n\t\tlogger.trace(\"Handling payload on {}\", topicString);\n\n\t\tTopic topic = null;\n\t\ttry {\n\t\t\tif (splitTopic.length == 4) {\n\t\t\t\ttopic = new Topic(splitTopic[0], splitTopic[1], splitTopic[3], MessageType.valueOf(splitTopic[2]));\n\t\t\t} else if (splitTopic.length == 5) {\n\t\t\t\ttopic = new Topic(splitTopic[0], splitTopic[1], splitTopic[3], splitTopic[4],\n\t\t\t\t\t\tMessageType.valueOf(splitTopic[2]));\n\t\t\t} else {\n\t\t\t\tlogger.error(\"Failed to handle the topic '{}'\", topicString);\n\t\t\t\treturn;\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Error parsing topic\", e);\n\t\t\treturn;\n\t\t}\n\t\tMessageType type = topic.getType();\n\n\t\tSparkplugBPayload payload = null;\n\t\ttry {\n\t\t\t// Parse the payload\n\t\t\tpayload = payloadDecoder.buildFromByteArray(message.getPayload(), 
HostApplicationMetricMap.getInstance()\n\t\t\t\t\t.getMetricDataTypeMap(topic.getEdgeNodeDescriptor(), topic.getSparkplugDescriptor()));\n\t\t\tlogger.trace(\"On topic={}: Incoming payload: {}\", topic, payload);\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to decode the payload\", e);\n\t\t\treturn;\n\t\t}\n\n\t\tif (type.isCommand()) {\n\t\t\t// This was an outbound command - ignore it\n\t\t\tlogger.debug(\"Ignoring outbound command: {}\", topicString);\n\t\t\treturn;\n\t\t}\n\n\t\t// Extract the sequence number unless it is a node death certificate\n\t\tLong seqNum = null;\n\t\tif (!type.equals(MessageType.NDEATH)) {\n\t\t\tif (payload == null || payload.getSeq() == null) {\n\t\t\t\tlogger.error(\"Invalid payload with topic={}: {}\", topicString,\n\t\t\t\t\t\tpayload == null ? \"payload is null\" : \"sequence number is null\");\n\t\t\t\treturn;\n\t\t\t}\n\t\t\tseqNum = payload.getSeq();\n\t\t}\n\n\t\ttry {\n\t\t\tMessageContext messageContext = new MessageContext(mqttServerName, hostAppMqttClientId, topic, payload,\n\t\t\t\t\tmessage.getPayload() == null ? 0 : message.getPayload().length, seqNum == null ? 
-1 : seqNum);\n\n\t\t\tswitch (type) {\n\t\t\t\tcase NBIRTH:\n\t\t\t\t\tlogger.info(\"Handling NBIRTH from {}\", topic.getSparkplugDescriptor());\n\t\t\t\t\thandleNodeBirth(messageContext);\n\t\t\t\t\tbreak;\n\t\t\t\tcase DBIRTH:\n\t\t\t\t\tlogger.info(\"Handling DBIRTH from {}\", topic.getSparkplugDescriptor());\n\t\t\t\t\thandleDeviceBirth(messageContext);\n\t\t\t\t\tbreak;\n\t\t\t\tcase NDATA:\n\t\t\t\t\tlogger.info(\"Handling NDATA from {}\", topic.getSparkplugDescriptor());\n\t\t\t\t\thandleNodeData(messageContext);\n\t\t\t\t\tbreak;\n\t\t\t\tcase DDATA:\n\t\t\t\t\tlogger.info(\"Handling DDATA from {}\", topic.getSparkplugDescriptor());\n\t\t\t\t\thandleDeviceData(messageContext);\n\t\t\t\t\tbreak;\n\t\t\t\tcase NDEATH:\n\t\t\t\t\tlogger.info(\"Handling NDEATH from {}\", topic.getSparkplugDescriptor());\n\t\t\t\t\thandleNodeDeath(messageContext);\n\t\t\t\t\tbreak;\n\t\t\t\tcase DDEATH:\n\t\t\t\t\tlogger.info(\"Handling DDEATH from {}\", topic.getSparkplugDescriptor());\n\t\t\t\t\thandleDeviceDeath(messageContext);\n\t\t\t\t\tbreak;\n\n\t\t\t\tdefault:\n\t\t\t\t\tlogger.info(\"Unknown message with type={} on topic={}\", type, topic);\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to handle payload on topic: {} with payload={}\", topic, payload, e);\n\t\t\treturn;\n\t\t}\n\t}\n\n\tprotected void handleNodeBirth(MessageContext messageContext) throws Exception {\n\t\tlogger.debug(\"Processing NBIRTH from Edge Node {} with Seq# {}\",\n\t\t\t\tmessageContext.getTopic().getEdgeNodeDescriptor(), messageContext.getSeqNum());\n\t\tEdgeNodeDescriptor edgeNodeDescriptor = messageContext.getTopic().getEdgeNodeDescriptor();\n\t\tSparkplugEdgeNode sparkplugEdgeNode =\n\t\t\t\tEdgeNodeManager.getInstance().getSparkplugEdgeNode(messageContext.getTopic().getEdgeNodeDescriptor());\n\t\tif (sparkplugEdgeNode == null) {\n\t\t\tsparkplugEdgeNode = 
EdgeNodeManager.getInstance().addSparkplugEdgeNode(edgeNodeDescriptor,\n\t\t\t\t\tmessageContext.getMqttServerName(), messageContext.getHostAppMqttClientId());\n\t\t} else {\n\t\t\t// Reset the metrics\n\t\t\tsparkplugEdgeNode.clearMetrics();\n\t\t}\n\n\t\t// Reset the alias map\n\t\tHostApplicationMetricMap hostApplicationMetricMap = HostApplicationMetricMap.getInstance();\n\t\thostApplicationMetricMap.clear(sparkplugEdgeNode.getEdgeNodeDescriptor());\n\n\t\t// Set online\n\t\tsparkplugEdgeNode.setOnline(true, messageContext.getPayload().getTimestamp(),\n\t\t\t\tSparkplugUtil.getBdSequenceNumber(messageContext.getPayload()), messageContext.getSeqNum());\n\n\t\teventHandler.onNodeBirthArrived(edgeNodeDescriptor, messageContext.getMessage());\n\t\teventHandler.onMessage(edgeNodeDescriptor, messageContext.getMessage());\n\t\tfor (Metric metric : messageContext.getPayload().getMetrics()) {\n\t\t\tif (metric.hasAlias()) {\n\t\t\t\t// Make sure the alias doesn't already exist\n\t\t\t\tif (hostApplicationMetricMap.aliasExists(edgeNodeDescriptor,\n\t\t\t\t\t\tmessageContext.getTopic().getSparkplugDescriptor(), metric.getAlias())) {\n\t\t\t\t\tString errorMessage = \"Not adding duplicated alias for edgeNode=\" + edgeNodeDescriptor + \" - alias=\"\n\t\t\t\t\t\t\t+ metric.getAlias() + \" and metric name=\" + metric.getName() + \" - with existing alias for \"\n\t\t\t\t\t\t\t+ hostApplicationMetricMap.getMetricName(edgeNodeDescriptor,\n\t\t\t\t\t\t\t\t\tmessageContext.getTopic().getSparkplugDescriptor(), metric.getAlias());\n\t\t\t\t\tlogger.error(errorMessage);\n\n\t\t\t\t\trequestRebirth(messageContext.getMqttServerName(), messageContext.getHostAppMqttClientId(),\n\t\t\t\t\t\t\tmessageContext.getTopic().getEdgeNodeDescriptor());\n\t\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT, errorMessage);\n\t\t\t\t}\n\t\t\t}\n\n\t\t\thostApplicationMetricMap.addMetric(edgeNodeDescriptor, edgeNodeDescriptor, metric.getName(), metric);\n\n\t\t\t// Update the cache and 
notify\n\t\t\tsparkplugEdgeNode.putMetric(metric.getName(), new HostMetric(metric, false));\n\t\t\teventHandler.onBirthMetric(edgeNodeDescriptor, metric);\n\t\t}\n\t\teventHandler.onNodeBirthComplete(edgeNodeDescriptor);\n\t}\n\n\tprotected void handleDeviceBirth(MessageContext messageContext) throws Exception {\n\t\tlogger.debug(\"Processing DBIRTH from Device {} with Seq# {}\",\n\t\t\t\tmessageContext.getTopic().getSparkplugDescriptor(), messageContext.getSeqNum());\n\t\tEdgeNodeDescriptor edgeNodeDescriptor = messageContext.getTopic().getEdgeNodeDescriptor();\n\t\tDeviceDescriptor deviceDescriptor = (DeviceDescriptor) messageContext.getTopic().getSparkplugDescriptor();\n\t\tSparkplugEdgeNode sparkplugEdgeNode = EdgeNodeManager.getInstance().getSparkplugEdgeNode(edgeNodeDescriptor);\n\t\tSparkplugDevice sparkplugDevice =\n\t\t\t\tEdgeNodeManager.getInstance().getSparkplugDevice(edgeNodeDescriptor, deviceDescriptor);\n\t\tif (sparkplugDevice == null) {\n\t\t\tsparkplugDevice = EdgeNodeManager.getInstance().addSparkplugDevice(edgeNodeDescriptor, deviceDescriptor,\n\t\t\t\t\tmessageContext.getPayload().getTimestamp());\n\t\t} else {\n\t\t\tsparkplugDevice.clearMetrics();\n\t\t}\n\n\t\tsparkplugEdgeNode.handleSeq(messageContext.getPayload().getSeq());\n\n\t\t// Set online\n\t\tsparkplugDevice.setOnline(true, messageContext.getPayload().getTimestamp());\n\n\t\teventHandler.onDeviceBirthArrived(deviceDescriptor, messageContext.getMessage());\n\t\teventHandler.onMessage(deviceDescriptor, messageContext.getMessage());\n\t\tHostApplicationMetricMap hostApplicationMetricMap = HostApplicationMetricMap.getInstance();\n\t\tfor (Metric metric : messageContext.getPayload().getMetrics()) {\n\t\t\tif (metric.hasAlias()) {\n\t\t\t\tif (hostApplicationMetricMap.aliasExists(edgeNodeDescriptor, deviceDescriptor, metric.getAlias())) {\n\t\t\t\t\tString errorMessage = \"Not adding duplicated alias for device=\" + deviceDescriptor + \" - alias=\"\n\t\t\t\t\t\t\t+ metric.getAlias() + \" 
and metric name=\" + metric.getName() + \" - with existing alias for \"\n\t\t\t\t\t\t\t+ hostApplicationMetricMap.getMetricName(edgeNodeDescriptor, deviceDescriptor,\n\t\t\t\t\t\t\t\t\tmetric.getAlias());\n\t\t\t\t\tlogger.error(errorMessage);\n\n\t\t\t\t\trequestRebirth(messageContext.getMqttServerName(), messageContext.getHostAppMqttClientId(),\n\t\t\t\t\t\t\tmessageContext.getTopic().getEdgeNodeDescriptor());\n\t\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT, errorMessage);\n\t\t\t\t}\n\t\t\t}\n\n\t\t\thostApplicationMetricMap.addMetric(edgeNodeDescriptor, deviceDescriptor, metric.getName(), metric);\n\n\t\t\t// Update the cache and notify\n\t\t\tsparkplugDevice.putMetric(metric.getName(), new HostMetric(metric, false));\n\t\t\teventHandler.onBirthMetric(deviceDescriptor, metric);\n\t\t}\n\t\teventHandler.onDeviceBirthComplete(deviceDescriptor);\n\t}\n\n\tprotected void handleNodeData(MessageContext messageContext) throws Exception {\n\t\tlogger.debug(\"Processing NDATA from Edge Node {} with Seq# {}\",\n\t\t\t\tmessageContext.getTopic().getEdgeNodeDescriptor(), messageContext.getSeqNum());\n\t\tEdgeNodeDescriptor edgeNodeDescriptor = messageContext.getTopic().getEdgeNodeDescriptor();\n\t\tSparkplugEdgeNode sparkplugEdgeNode =\n\t\t\t\tEdgeNodeManager.getInstance().getSparkplugEdgeNode(messageContext.getTopic().getEdgeNodeDescriptor());\n\t\tif (sparkplugEdgeNode == null || !sparkplugEdgeNode.isOnline()) {\n\t\t\trequestRebirth(messageContext.getMqttServerName(), messageContext.getHostAppMqttClientId(),\n\t\t\t\t\tmessageContext.getTopic().getEdgeNodeDescriptor());\n\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\"Invalid state of the Sparkplug Edge Node when receiving a NDATA - \"\n\t\t\t\t\t\t\t+ messageContext.getTopic().getSparkplugDescriptor() + \" is offline\");\n\t\t}\n\n\t\tsparkplugEdgeNode.handleSeq(messageContext.getPayload().getSeq());\n\n\t\teventHandler.onNodeDataArrived(edgeNodeDescriptor, 
messageContext.getMessage());\n\t\teventHandler.onMessage(edgeNodeDescriptor, messageContext.getMessage());\n\t\tfor (Metric metric : messageContext.getPayload().getMetrics()) {\n\t\t\tif (!metric.hasName() && metric.hasAlias()) {\n\t\t\t\tmetric.setName(HostApplicationMetricMap.getInstance().getMetricName(edgeNodeDescriptor,\n\t\t\t\t\t\tedgeNodeDescriptor, metric.getAlias()));\n\t\t\t}\n\n\t\t\t// Update the metric in the cache and notify\n\t\t\tsparkplugEdgeNode.updateValue(metric.getName(), metric.getValue());\n\t\t\teventHandler.onDataMetric(edgeNodeDescriptor, metric);\n\t\t}\n\t\teventHandler.onNodeDataComplete(edgeNodeDescriptor);\n\t}\n\n\tprotected void handleDeviceData(MessageContext messageContext) throws Exception {\n\t\tlogger.debug(\"Processing DDATA from Device {} with Seq# {}\", messageContext.getTopic().getSparkplugDescriptor(),\n\t\t\t\tmessageContext.getSeqNum());\n\t\tEdgeNodeDescriptor edgeNodeDescriptor = messageContext.getTopic().getEdgeNodeDescriptor();\n\t\tDeviceDescriptor deviceDescriptor = (DeviceDescriptor) messageContext.getTopic().getSparkplugDescriptor();\n\t\tSparkplugEdgeNode sparkplugEdgeNode = EdgeNodeManager.getInstance().getSparkplugEdgeNode(edgeNodeDescriptor);\n\t\tSparkplugDevice sparkplugDevice =\n\t\t\t\tEdgeNodeManager.getInstance().getSparkplugDevice(edgeNodeDescriptor, deviceDescriptor);\n\t\tif (sparkplugEdgeNode == null || sparkplugDevice == null || !sparkplugEdgeNode.isOnline()) {\n\t\t\trequestRebirth(messageContext.getMqttServerName(), messageContext.getHostAppMqttClientId(),\n\t\t\t\t\tmessageContext.getTopic().getEdgeNodeDescriptor());\n\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\"Invalid state of the Sparkplug Device when receiving a DDATA - \"\n\t\t\t\t\t\t\t+ messageContext.getTopic().getSparkplugDescriptor() + \" is offline\");\n\t\t}\n\n\t\tsparkplugEdgeNode.handleSeq(messageContext.getPayload().getSeq());\n\n\t\teventHandler.onDeviceDataArrived(deviceDescriptor, 
messageContext.getMessage());\n\t\teventHandler.onMessage(deviceDescriptor, messageContext.getMessage());\n\t\tfor (Metric metric : messageContext.getPayload().getMetrics()) {\n\t\t\tif (!metric.hasName() && metric.hasAlias()) {\n\t\t\t\tmetric.setName(HostApplicationMetricMap.getInstance().getMetricName(edgeNodeDescriptor,\n\t\t\t\t\t\tdeviceDescriptor, metric.getAlias()));\n\t\t\t}\n\n\t\t\t// Update the metric in the cache and notify\n\t\t\tsparkplugDevice.updateValue(metric.getName(), metric.getValue());\n\t\t\teventHandler.onDataMetric(deviceDescriptor, metric);\n\t\t}\n\t\teventHandler.onDeviceDataComplete(deviceDescriptor);\n\t}\n\n\tprotected void handleNodeDeath(MessageContext messageContext) {\n\t\tLong incomingBdSeqNum = -1L;\n\t\tEdgeNodeDescriptor edgeNodeDescriptor = messageContext.getTopic().getEdgeNodeDescriptor();\n\t\ttry {\n\t\t\tSparkplugEdgeNode sparkplugEdgeNode =\n\t\t\t\t\tEdgeNodeManager.getInstance().getSparkplugEdgeNode(edgeNodeDescriptor);\n\t\t\tincomingBdSeqNum = SparkplugUtil.getBdSequenceNumber(messageContext.getPayload());\n\t\t\tif (sparkplugEdgeNode != null && incomingBdSeqNum != null) {\n\t\t\t\tif (sparkplugEdgeNode.isOnline()) {\n\t\t\t\t\tlong birthBdSeqNum = sparkplugEdgeNode.getBirthBdSeqNum();\n\t\t\t\t\tif (birthBdSeqNum == incomingBdSeqNum) {\n\t\t\t\t\t\teventHandler.onNodeDeath(edgeNodeDescriptor, messageContext.getMessage());\n\t\t\t\t\t\teventHandler.onMessage(edgeNodeDescriptor, messageContext.getMessage());\n\t\t\t\t\t\tstaleTags(edgeNodeDescriptor, sparkplugEdgeNode);\n\t\t\t\t\t\tsparkplugEdgeNode.setOnline(false, messageContext.getPayload().getTimestamp(), incomingBdSeqNum,\n\t\t\t\t\t\t\t\tnull);\n\t\t\t\t\t\tfor (SparkplugDevice sparkplugDevice : sparkplugEdgeNode.getSparkplugDevices().values()) {\n\t\t\t\t\t\t\tstaleTags(sparkplugDevice.getDeviceDescrptor(), sparkplugDevice);\n\t\t\t\t\t\t\tsparkplugDevice.setOnline(false, 
messageContext.getPayload().getTimestamp());\n\t\t\t\t\t\t}\n\t\t\t\t\t\teventHandler.onNodeDeathComplete(edgeNodeDescriptor);\n\t\t\t\t\t} else {\n\t\t\t\t\t\tlogger.error(\n\t\t\t\t\t\t\t\t\"Edge Node bdSeq number mismatch on incoming NDEATH from {} - received {}, expected {} - ignoring NDEATH\",\n\t\t\t\t\t\t\t\tedgeNodeDescriptor, incomingBdSeqNum, birthBdSeqNum);\n\t\t\t\t\t}\n\t\t\t\t} else {\n\t\t\t\t\tlogger.error(\"Edge Node '{}' is not online - ignoring NDEATH\", edgeNodeDescriptor);\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tlogger.error(\"Unable to find Edge Node or current bdSeq number for NDEATH from {} - ignoring NDEATH\",\n\t\t\t\t\t\tmessageContext.getTopic().getEdgeNodeDescriptor());\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Sparkplug BD sequence number from {} is missing - ignoring NDEATH\", edgeNodeDescriptor, e);\n\t\t}\n\t}\n\n\tprotected void handleDeviceDeath(MessageContext messageContext) throws TahuException {\n\t\tEdgeNodeDescriptor edgeNodeDescriptor = messageContext.getTopic().getEdgeNodeDescriptor();\n\t\tDeviceDescriptor deviceDescriptor = (DeviceDescriptor) messageContext.getTopic().getSparkplugDescriptor();\n\t\tSparkplugEdgeNode sparkplugEdgeNode = EdgeNodeManager.getInstance().getSparkplugEdgeNode(edgeNodeDescriptor);\n\t\tSparkplugDevice sparkplugDevice =\n\t\t\t\tEdgeNodeManager.getInstance().getSparkplugDevice(edgeNodeDescriptor, deviceDescriptor);\n\t\tif (sparkplugEdgeNode == null || sparkplugDevice == null || !sparkplugEdgeNode.isOnline()\n\t\t\t\t|| !sparkplugDevice.isOnline()) {\n\t\t\tlogger.error(\"Invalid state of the Sparkplug Device when receiving a DDEATH - \"\n\t\t\t\t\t+ messageContext.getTopic().getSparkplugDescriptor() + \" is offline - ignoring DDEATH\");\n\t\t\treturn;\n\t\t}\n\n\t\tsparkplugEdgeNode.handleSeq(messageContext.getPayload().getSeq());\n\n\t\tif (sparkplugEdgeNode.isOnline() && sparkplugDevice.isOnline()) {\n\t\t\teventHandler.onDeviceDeath(deviceDescriptor, 
messageContext.getMessage());\n\t\t\teventHandler.onMessage(deviceDescriptor, messageContext.getMessage());\n\t\t\tstaleTags(deviceDescriptor, sparkplugDevice);\n\t\t\tsparkplugDevice.setOnline(false, messageContext.getPayload().getTimestamp());\n\t\t\teventHandler.onDeviceDeathComplete(deviceDescriptor);\n\t\t} else {\n\t\t\tlogger.error(\"Online requirements not met for {} - edgeNode={} and device={} - ignoring DDEATH\",\n\t\t\t\t\tdeviceDescriptor, sparkplugEdgeNode.isOnline() ? \"online\" : \"offline\",\n\t\t\t\t\tsparkplugDevice.isOnline() ? \"online\" : \"offline\");\n\t\t}\n\t}\n\n\tprivate void staleTags(SparkplugDescriptor sparkplugDescriptor, MetricManager metricManager) {\n\t\t// Stale all tags associated with this Edge Node\n\t\tSet<String> metricNames = metricManager.getMetricNames();\n\t\tIterator<String> it = metricNames.iterator();\n\t\twhile (it.hasNext()) {\n\t\t\tString metricName = it.next();\n\n\t\t\t// Update the cache and notify\n\t\t\tmetricManager.setStale(metricName, true);\n\t\t\teventHandler.onStale(sparkplugDescriptor, metricManager.getMetric(metricName));\n\t\t}\n\t}\n\n\tpublic void requestRebirth(MqttServerName mqttServerName, MqttClientId hostAppMqttClientId,\n\t\t\tEdgeNodeDescriptor edgeNodeDescriptor) {\n\t\trequestRebirth(mqttServerName, hostAppMqttClientId, edgeNodeDescriptor, null);\n\t}\n\n\tpublic void requestRebirth(MqttServerName mqttServerName, MqttClientId hostAppMqttClientId,\n\t\t\tEdgeNodeDescriptor edgeNodeDescriptor, SparkplugEdgeNode sparkplugEdgeNode) {\n\t\ttry {\n\t\t\tTimer rebirthDelayTimer = rebirthTimers.get(edgeNodeDescriptor);\n\t\t\tif (rebirthDelayTimer == null) {\n\t\t\t\tlogger.info(\"Requesting Rebirth from {}\", edgeNodeDescriptor);\n\t\t\t\trebirthDelayTimer = new Timer();\n\t\t\t\trebirthTimers.put(edgeNodeDescriptor, rebirthDelayTimer);\n\t\t\t\trebirthDelayTimer.schedule(new RebirthDelayTask(edgeNodeDescriptor), 5000);\n\n\t\t\t\t// Request a rebirth\n\t\t\t\tSparkplugBPayload cmdPayload = new 
SparkplugBPayloadBuilder().setTimestamp(new Date())\n\t\t\t\t\t\t.addMetric(\n\t\t\t\t\t\t\t\tnew MetricBuilder(\"Node Control/Rebirth\", MetricDataType.Boolean, true).createMetric())\n\t\t\t\t\t\t.createPayload();\n\n\t\t\t\tTopic cmdTopic = new Topic(\"spBv1.0\", edgeNodeDescriptor, MessageType.NCMD);\n\t\t\t\tif (sparkplugEdgeNode != null) {\n\t\t\t\t\t// Set the Edge Node offline\n\t\t\t\t\tsparkplugEdgeNode.forceOffline(new Date());\n\n\t\t\t\t\tif (mqttServerName != null && sparkplugEdgeNode.getMqttServerName() != null\n\t\t\t\t\t\t\t&& mqttServerName.equals(sparkplugEdgeNode.getMqttServerName())) {\n\t\t\t\t\t\tlogger.debug(\"On Rebirth request - Current Engine MQTT Server is unchanged: {}\",\n\t\t\t\t\t\t\t\tmqttServerName);\n\t\t\t\t\t} else {\n\t\t\t\t\t\tlogger.info(\"On Rebirth request - MQTT Server has changed: new={}, old={}\", mqttServerName,\n\t\t\t\t\t\t\t\tsparkplugEdgeNode.getMqttServerName());\n\t\t\t\t\t}\n\t\t\t\t\tif (hostAppMqttClientId != null && sparkplugEdgeNode.getHostAppMqttClientId() != null\n\t\t\t\t\t\t\t&& hostAppMqttClientId.equals(sparkplugEdgeNode.getHostAppMqttClientId())) {\n\t\t\t\t\t\tlogger.debug(\"On Rebirth request - Current Engine MQTT Client ID is unchanged: {}\",\n\t\t\t\t\t\t\t\thostAppMqttClientId);\n\t\t\t\t\t} else {\n\t\t\t\t\t\tlogger.info(\"On Rebirth request - MQTT Client ID has changed: new={}, old={}\",\n\t\t\t\t\t\t\t\thostAppMqttClientId, sparkplugEdgeNode.getHostAppMqttClientId());\n\t\t\t\t\t}\n\n\t\t\t\t\t// Update the current Engine MQTT Server name and Client ID\n\t\t\t\t\tsparkplugEdgeNode.setMqttServerName(mqttServerName);\n\t\t\t\t\tsparkplugEdgeNode.setHostAppMqttClientId(hostAppMqttClientId);\n\n\t\t\t\t\tpublishCommand(mqttServerName, hostAppMqttClientId, cmdTopic, cmdPayload);\n\t\t\t\t} else {\n\t\t\t\t\tlogger.debug(\"Current Engine MQTT Server Name for unknown Edge Node: {}\", mqttServerName);\n\t\t\t\t\tlogger.debug(\"Current Engine MQTT Client ID for unknown Edge Node: {}\", 
hostAppMqttClientId);\n\t\t\t\t\tpublishCommand(mqttServerName, hostAppMqttClientId, cmdTopic, cmdPayload);\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tlogger.debug(\"Not requesting Rebirth since one was already requested within the last 5 seconds\");\n\t\t\t}\n\t\t} catch (Exception e) {\n\t\t\tlogger.error(\"Failed to create Rebirth request\", e);\n\t\t\treturn;\n\t\t}\n\t}\n\n\t/**\n\t * A TimerTask subclass for timers on issued rebirth requests.\n\t */\n\tprivate class RebirthDelayTask extends TimerTask {\n\t\tprivate EdgeNodeDescriptor edgeNodeDescriptor;\n\n\t\tpublic RebirthDelayTask(EdgeNodeDescriptor edgeNodeDescriptor) {\n\t\t\tthis.edgeNodeDescriptor = edgeNodeDescriptor;\n\t\t}\n\n\t\t@Override\n\t\tpublic void run() {\n\t\t\tif (rebirthTimers.get(edgeNodeDescriptor) != null) {\n\t\t\t\trebirthTimers.get(edgeNodeDescriptor).cancel();\n\t\t\t\trebirthTimers.remove(edgeNodeDescriptor);\n\t\t\t}\n\t\t}\n\t}\n\n\tprivate void publishCommand(MqttServerName mqttServerName, MqttClientId hostAppMqttClientId, Topic topic,\n\t\t\tSparkplugBPayload payload) throws Exception {\n\t\tcommandPublisher.publishCommand(topic, payload);\n\t}\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/api/HostApplicationEventHandler.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host.api;\n\nimport org.eclipse.tahu.message.model.DeviceDescriptor;\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.Message;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.SparkplugDescriptor;\n\npublic interface HostApplicationEventHandler {\n\n\tpublic void onConnect();\n\n\tpublic void onDisconnect();\n\n\tpublic void onMessage(SparkplugDescriptor sparkplugDescriptor, Message message);\n\n\tpublic void onNodeBirthArrived(EdgeNodeDescriptor edgeNodeDescriptor, Message message);\n\n\tpublic void onNodeBirthComplete(EdgeNodeDescriptor edgeNodeDescriptor);\n\n\tpublic void onNodeDataArrived(EdgeNodeDescriptor edgeNodeDescriptor, Message message);\n\n\tpublic void onNodeDataComplete(EdgeNodeDescriptor edgeNodeDescriptor);\n\n\tpublic void onNodeDeath(EdgeNodeDescriptor edgeNodeDescriptor, Message message);\n\n\tpublic void onNodeDeathComplete(EdgeNodeDescriptor edgeNodeDescriptor);\n\n\tpublic void onDeviceBirthArrived(DeviceDescriptor deviceDescriptor, Message message);\n\n\tpublic void onDeviceBirthComplete(DeviceDescriptor deviceDescriptor);\n\n\tpublic void onDeviceDataArrived(DeviceDescriptor deviceDescriptor, Message message);\n\n\tpublic void onDeviceDataComplete(DeviceDescriptor deviceDescriptor);\n\n\tpublic void onDeviceDeath(DeviceDescriptor deviceDescriptor, Message message);\n\n\tpublic void 
onDeviceDeathComplete(DeviceDescriptor deviceDescriptor);\n\n\tpublic void onBirthMetric(SparkplugDescriptor sparkplugDescriptor, Metric metric);\n\n\tpublic void onDataMetric(SparkplugDescriptor sparkplugDescriptor, Metric metric);\n\n\tpublic void onStale(SparkplugDescriptor sparkplugDescriptor, Metric metric);\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/manager/EdgeNodeManager.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host.manager;\n\nimport java.util.Date;\nimport java.util.Map;\nimport java.util.concurrent.ConcurrentHashMap;\n\nimport org.eclipse.tahu.exception.TahuErrorCode;\nimport org.eclipse.tahu.exception.TahuException;\nimport org.eclipse.tahu.message.model.DeviceDescriptor;\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.mqtt.MqttClientId;\nimport org.eclipse.tahu.mqtt.MqttServerName;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class EdgeNodeManager {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(EdgeNodeManager.class.getName());\n\n\tprivate static EdgeNodeManager instance;\n\n\tprivate Map<EdgeNodeDescriptor, SparkplugEdgeNode> edgeNodeMap;\n\n\tprivate final Object lock = new Object();\n\n\tprivate EdgeNodeManager() {\n\t\tedgeNodeMap = new ConcurrentHashMap<>();\n\t}\n\n\tpublic static synchronized EdgeNodeManager getInstance() {\n\t\tif (instance == null) {\n\t\t\tinstance = new EdgeNodeManager();\n\t\t}\n\n\t\treturn instance;\n\t}\n\n\tpublic SparkplugEdgeNode getSparkplugEdgeNode(EdgeNodeDescriptor edgeNodeDescriptor) {\n\t\tsynchronized (lock) {\n\t\t\treturn edgeNodeMap.get(edgeNodeDescriptor);\n\t\t}\n\t}\n\n\tpublic SparkplugEdgeNode addSparkplugEdgeNode(EdgeNodeDescriptor edgeNodeDescriptor, MqttServerName mqttServerName,\n\t\t\tMqttClientId hostAppMqttClientId) {\n\t\tsynchronized (lock) {\n\t\t\tSparkplugEdgeNode sparkplugEdgeNode 
=\n\t\t\t\t\tnew SparkplugEdgeNode(edgeNodeDescriptor, mqttServerName, hostAppMqttClientId);\n\t\t\tedgeNodeMap.put(edgeNodeDescriptor, sparkplugEdgeNode);\n\t\t\treturn sparkplugEdgeNode;\n\t\t}\n\t}\n\n\tpublic SparkplugDevice getSparkplugDevice(EdgeNodeDescriptor edgeNodeDescriptor,\n\t\t\tDeviceDescriptor deviceDescriptor) {\n\t\tsynchronized (lock) {\n\t\t\tSparkplugEdgeNode sparkplugEdgeNode = edgeNodeMap.get(edgeNodeDescriptor);\n\t\t\tif (sparkplugEdgeNode != null) {\n\t\t\t\treturn sparkplugEdgeNode.getSparkplugDevice(deviceDescriptor);\n\t\t\t} else {\n\t\t\t\treturn null;\n\t\t\t}\n\t\t}\n\t}\n\n\tpublic SparkplugDevice addSparkplugDevice(EdgeNodeDescriptor edgeNodeDescriptor, DeviceDescriptor deviceDescriptor,\n\t\t\tDate onlineTimestamp) throws TahuException {\n\t\tsynchronized (lock) {\n\t\t\t// Make sure there is a SparkplugEdgeNode already\n\t\t\tSparkplugEdgeNode sparkplugEdgeNode = edgeNodeMap.get(edgeNodeDescriptor);\n\t\t\tif (sparkplugEdgeNode == null) {\n\t\t\t\tthrow new TahuException(TahuErrorCode.INITIALIZATION_ERROR,\n\t\t\t\t\t\t\"The SparkplugEdgeNode must already exist before adding a device\");\n\t\t\t} else {\n\t\t\t\tSparkplugDevice sparkplugDevice =\n\t\t\t\t\t\tnew SparkplugDevice(sparkplugEdgeNode, deviceDescriptor, onlineTimestamp);\n\t\t\t\tsparkplugEdgeNode.addDevice(deviceDescriptor, sparkplugDevice);\n\t\t\t\treturn sparkplugDevice;\n\t\t\t}\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/manager/MetricManager.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host.manager;\n\nimport java.util.Collections;\nimport java.util.Map;\nimport java.util.Set;\nimport java.util.concurrent.ConcurrentHashMap;\n\nimport org.eclipse.tahu.host.model.HostMetric;\nimport org.eclipse.tahu.message.model.SparkplugDescriptor;\n\npublic abstract class MetricManager {\n\n\tprivate final Map<String, HostMetric> metricMap;\n\n\tpublic MetricManager() {\n\t\tmetricMap = new ConcurrentHashMap<>();\n\t}\n\n\tpublic abstract SparkplugDescriptor getSparkplugDescriptor();\n\n\tpublic Map<String, HostMetric> getMetricMap() {\n\t\treturn Collections.unmodifiableMap(metricMap);\n\t}\n\n\tpublic Set<String> getMetricNames() {\n\t\treturn metricMap.keySet();\n\t}\n\n\tpublic HostMetric getMetric(String metricName) {\n\t\treturn metricMap.get(metricName);\n\t}\n\n\tpublic void putMetric(String metricName, HostMetric metric) {\n\t\tmetricMap.put(metricName, metric);\n\t}\n\n\tpublic void updateValue(String metricName, Object value) {\n\t\tHostMetric hostMetric = metricMap.get(metricName);\n\t\tif (hostMetric != null) {\n\t\t\thostMetric.setValue(value);\n\t\t}\n\t}\n\n\tpublic void setStale(String metricName, boolean stale) {\n\t\tHostMetric hostMetric = metricMap.get(metricName);\n\t\tif (hostMetric != null) {\n\t\t\thostMetric.setStale(stale);\n\t\t}\n\t}\n\n\tpublic void clearMetrics() {\n\t\tmetricMap.clear();\n\t}\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/manager/SparkplugDevice.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host.manager;\n\nimport java.util.Date;\n\nimport org.eclipse.tahu.message.model.DeviceDescriptor;\nimport org.eclipse.tahu.message.model.SparkplugDescriptor;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class SparkplugDevice extends MetricManager {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(SparkplugDevice.class.getName());\n\n\t// Static variables\n\tprivate final SparkplugEdgeNode sparkplugEdgeNode;\n\tprivate final DeviceDescriptor deviceDescriptor;\n\tprivate final String groupId;\n\tprivate final String edgeNodeId;\n\tprivate final String deviceId;\n\n\t// Dynamic variables\n\tprivate boolean online;\n\tprivate Date onlineTimestamp;\n\tprivate Date offlineTimestamp;\n\n\tSparkplugDevice(SparkplugEdgeNode sparkplugEdgeNode, String groupId, String edgeNodeId, String deviceId,\n\t\t\tDate onlineTimestamp) {\n\t\tthis(sparkplugEdgeNode, new DeviceDescriptor(groupId, edgeNodeId, deviceId), onlineTimestamp);\n\t}\n\n\tSparkplugDevice(SparkplugEdgeNode sparkplugEdgeNode, DeviceDescriptor deviceDescriptor, Date onlineTimestamp) {\n\t\tthis.sparkplugEdgeNode = sparkplugEdgeNode;\n\t\tthis.deviceDescriptor = deviceDescriptor;\n\t\tthis.groupId = deviceDescriptor.getGroupId();\n\t\tthis.edgeNodeId = deviceDescriptor.getEdgeNodeId();\n\t\tthis.deviceId = deviceDescriptor.getDeviceId();\n\n\t\tthis.online = true;\n\t\tthis.onlineTimestamp = 
onlineTimestamp;\n\t}\n\n\t@Override\n\tpublic SparkplugDescriptor getSparkplugDescriptor() {\n\t\treturn deviceDescriptor;\n\t}\n\n\tpublic SparkplugEdgeNode getSparkplugEdgeNode() {\n\t\treturn sparkplugEdgeNode;\n\t}\n\n\tpublic DeviceDescriptor getDeviceDescriptor() {\n\t\treturn deviceDescriptor;\n\t}\n\n\tpublic String getGroupId() {\n\t\treturn groupId;\n\t}\n\n\tpublic String getEdgeNodeId() {\n\t\treturn edgeNodeId;\n\t}\n\n\tpublic String getDeviceId() {\n\t\treturn deviceId;\n\t}\n\n\tpublic boolean isOnline() {\n\t\treturn online;\n\t}\n\n\tpublic void setOnline(boolean online, Date timestamp) {\n\t\tthis.online = online;\n\t\tif (online) {\n\t\t\tlogger.info(\"Device {} set online at {}\", deviceDescriptor, timestamp);\n\t\t\tthis.onlineTimestamp = timestamp;\n\t\t} else {\n\t\t\tlogger.info(\"Device {} set offline at {}\", deviceDescriptor, timestamp);\n\t\t\tthis.offlineTimestamp = timestamp;\n\t\t}\n\t}\n\n\tpublic Date getOnlineTimestamp() {\n\t\treturn onlineTimestamp;\n\t}\n\n\tpublic Date getOfflineTimestamp() {\n\t\treturn offlineTimestamp;\n\t}\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/manager/SparkplugEdgeNode.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host.manager;\n\nimport java.util.Collections;\nimport java.util.Date;\nimport java.util.Map;\nimport java.util.concurrent.ConcurrentHashMap;\n\nimport org.eclipse.tahu.exception.TahuErrorCode;\nimport org.eclipse.tahu.exception.TahuException;\nimport org.eclipse.tahu.message.model.DeviceDescriptor;\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.SparkplugDescriptor;\nimport org.eclipse.tahu.mqtt.MqttClientId;\nimport org.eclipse.tahu.mqtt.MqttServerName;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class SparkplugEdgeNode extends MetricManager {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(SparkplugEdgeNode.class.getName());\n\n\t// Static variables\n\tprivate final EdgeNodeDescriptor edgeNodeDescriptor;\n\tprivate final String groupId;\n\tprivate final String edgeNodeId;\n\tprivate final Map<DeviceDescriptor, SparkplugDevice> sparkplugDevices;\n\n\t// Dynamic variables\n\tprivate MqttServerName mqttServerName;\n\tprivate MqttClientId hostAppMqttClientId;\n\tprivate boolean online;\n\tprivate Date onlineTimestamp;\n\tprivate Date offlineTimestamp;\n\n\t// Sequence number tracking\n\tprivate Long birthBdSeqNum;\n\tprivate Long lastSeqNum;\n\n\tprivate final Object lock = new Object();\n\n\tSparkplugEdgeNode(String groupId, String edgeNodeId, MqttServerName mqttServerName,\n\t\t\tMqttClientId hostAppMqttClientId) 
{\n\t\tthis(new EdgeNodeDescriptor(groupId, edgeNodeId), mqttServerName, hostAppMqttClientId);\n\t}\n\n\tSparkplugEdgeNode(EdgeNodeDescriptor edgeNodeDescriptor, MqttServerName mqttServerName,\n\t\t\tMqttClientId hostAppMqttClientId) {\n\t\tthis.edgeNodeDescriptor = edgeNodeDescriptor;\n\t\tthis.groupId = edgeNodeDescriptor.getGroupId();\n\t\tthis.edgeNodeId = edgeNodeDescriptor.getEdgeNodeId();\n\t\tthis.sparkplugDevices = new ConcurrentHashMap<>();\n\n\t\tthis.mqttServerName = mqttServerName;\n\t\tthis.hostAppMqttClientId = hostAppMqttClientId;\n\t}\n\n\t@Override\n\tpublic SparkplugDescriptor getSparkplugDescriptor() {\n\t\treturn edgeNodeDescriptor;\n\t}\n\n\tpublic EdgeNodeDescriptor getEdgeNodeDescriptor() {\n\t\treturn edgeNodeDescriptor;\n\t}\n\n\tpublic String getGroupId() {\n\t\treturn groupId;\n\t}\n\n\tpublic String getEdgeNodeId() {\n\t\treturn edgeNodeId;\n\t}\n\n\tpublic void addDevice(DeviceDescriptor deviceDescriptor, SparkplugDevice sparkplugDevice) {\n\t\tsparkplugDevices.put(deviceDescriptor, sparkplugDevice);\n\t}\n\n\tpublic Map<DeviceDescriptor, SparkplugDevice> getSparkplugDevices() {\n\t\treturn Collections.unmodifiableMap(sparkplugDevices);\n\t}\n\n\tpublic SparkplugDevice getSparkplugDevice(DeviceDescriptor deviceDescriptor) {\n\t\treturn sparkplugDevices.get(deviceDescriptor);\n\t}\n\n\tpublic MqttServerName getMqttServerName() {\n\t\treturn mqttServerName;\n\t}\n\n\tpublic void setMqttServerName(MqttServerName mqttServerName) {\n\t\tthis.mqttServerName = mqttServerName;\n\t}\n\n\tpublic MqttClientId getHostAppMqttClientId() {\n\t\treturn hostAppMqttClientId;\n\t}\n\n\tpublic void setHostAppMqttClientId(MqttClientId hostAppMqttClientId) {\n\t\tthis.hostAppMqttClientId = hostAppMqttClientId;\n\t}\n\n\tpublic boolean isOnline() {\n\t\treturn online;\n\t}\n\n\tpublic void setOnline(boolean online, Date timestamp, Long incomingBdSeq, Long incomingSeq) throws TahuException {\n\t\tsynchronized (lock) {\n\t\t\tif (online) {\n\t\t\t\tif 
(timestamp == null) {\n\t\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\t\t\"The timestamp cannot be missing from an NBIRTH message\");\n\t\t\t\t}\n\t\t\t\tif (incomingBdSeq == null) {\n\t\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\t\t\"The bdSeq cannot be missing from an NBIRTH message\");\n\t\t\t\t}\n\n\t\t\t\tthis.online = online;\n\t\t\t\tthis.onlineTimestamp = timestamp;\n\t\t\t\tthis.birthBdSeqNum = incomingBdSeq;\n\t\t\t\tthis.lastSeqNum = incomingSeq;\n\t\t\t} else {\n\t\t\t\tif (incomingBdSeq == null) {\n\t\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\t\t\"The bdSeq cannot be missing from an NDEATH message\");\n\t\t\t\t}\n\n\t\t\t\t// Check the bdSeq - compare with equals() since Long reference comparison fails outside the autoboxing cache range\n\t\t\t\tif (!incomingBdSeq.equals(birthBdSeqNum)) {\n\t\t\t\t\tlogger.debug(\"Mismatched bdSeq number - got {} expected {} - ignoring\", incomingBdSeq,\n\t\t\t\t\t\t\tbirthBdSeqNum);\n\t\t\t\t\treturn;\n\t\t\t\t} else {\n\t\t\t\t\tthis.online = online;\n\t\t\t\t\tthis.offlineTimestamp = timestamp;\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tlogger.info(\"Edge Node {} set {} at {}\", edgeNodeDescriptor, online ? 
\"online\" : \"offline\", timestamp);\n\t\t}\n\t}\n\n\tpublic void forceOffline(Date timestamp) {\n\t\tsynchronized (lock) {\n\t\t\tthis.online = false;\n\t\t\tthis.offlineTimestamp = timestamp;\n\t\t}\n\t}\n\n\tpublic Date getOnlineTimestamp() {\n\t\treturn onlineTimestamp;\n\t}\n\n\tpublic Date getOfflineTimestamp() {\n\t\treturn offlineTimestamp;\n\t}\n\n\tpublic Long getBirthBdSeqNum() {\n\t\treturn birthBdSeqNum;\n\t}\n\n\tpublic void handleSeq(Long incomingSeq) throws TahuException {\n\t\tsynchronized (lock) {\n\t\t\tif (lastSeqNum != null) {\n\t\t\t\tlastSeqNum++;\n\t\t\t\tif (lastSeqNum.equals(256L)) {\n\t\t\t\t\tlastSeqNum = 0L;\n\t\t\t\t}\n\n\t\t\t\tif (!lastSeqNum.equals(incomingSeq)) {\n\t\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\t\t\"The sequence number check did not pass - expected \" + lastSeqNum + \" but received \"\n\t\t\t\t\t\t\t\t\t+ incomingSeq);\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tthrow new TahuException(TahuErrorCode.INVALID_ARGUMENT,\n\t\t\t\t\t\t\"The sequence number check did not pass - no prior sequence number was recorded but received \"\n\t\t\t\t\t\t\t\t+ incomingSeq);\n\t\t\t}\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/model/HostApplicationMetricMap.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host.model;\n\nimport java.util.Map;\nimport java.util.concurrent.ConcurrentHashMap;\n\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.SparkplugDescriptor;\nimport org.eclipse.tahu.message.model.Template;\nimport org.eclipse.tahu.model.MetricDataTypeMap;\nimport org.eclipse.tahu.model.MetricMap;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class HostApplicationMetricMap {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(HostApplicationMetricMap.class.getName());\n\n\tprivate static HostApplicationMetricMap instance;\n\n\tprivate final Map<EdgeNodeDescriptor, Map<SparkplugDescriptor, MetricMap>> allEdgeNodeMetricMaps;\n\n\tprivate final Object mapLock = new Object();\n\n\tpublic static HostApplicationMetricMap getInstance() {\n\t\tif (instance == null) {\n\t\t\tinstance = new HostApplicationMetricMap();\n\t\t}\n\t\treturn instance;\n\t}\n\n\tprivate HostApplicationMetricMap() {\n\t\tallEdgeNodeMetricMaps = new ConcurrentHashMap<>();\n\t}\n\n\tpublic void addMetric(EdgeNodeDescriptor edgeNodeDescriptor, SparkplugDescriptor sparkplugDescriptor,\n\t\t\tString metricName, Metric metric) {\n\t\tsynchronized (mapLock) {\n\t\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps 
=\n\t\t\t\t\tallEdgeNodeMetricMaps.computeIfAbsent(edgeNodeDescriptor, (k) -> new ConcurrentHashMap<>());\n\t\t\tMetricMap metricMap = edgeNodeMetricMaps.computeIfAbsent(sparkplugDescriptor, (k) -> new MetricMap());\n\t\t\tmetricMap.addAlias(metricName, metric.getAlias(), metric.getDataType());\n\n\t\t\tif (metric.getDataType() == MetricDataType.Template && metric.getValue() != null\n\t\t\t\t\t&& Template.class.isAssignableFrom(metric.getValue().getClass())) {\n\t\t\t\tTemplate template = (Template) metric.getValue();\n\t\t\t\tfor (Metric childMetric : template.getMetrics()) {\n\t\t\t\t\taddMetric(edgeNodeDescriptor, sparkplugDescriptor, metricName + \"/\" + childMetric.getName(),\n\t\t\t\t\t\t\tchildMetric);\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\tpublic void clear(EdgeNodeDescriptor edgeNodeDescriptor) {\n\t\tsynchronized (mapLock) {\n\t\t\tallEdgeNodeMetricMaps.remove(edgeNodeDescriptor);\n\t\t}\n\t}\n\n\tpublic Long getAlias(EdgeNodeDescriptor edgeNodeDescriptor, SparkplugDescriptor sparkplugDescriptor,\n\t\t\tString metricName) {\n\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps = allEdgeNodeMetricMaps.get(edgeNodeDescriptor);\n\t\tif (edgeNodeMetricMaps != null) {\n\t\t\tMetricMap metricMap = edgeNodeMetricMaps.get(sparkplugDescriptor);\n\t\t\tif (metricMap != null) {\n\t\t\t\treturn metricMap.getAlias(metricName);\n\t\t\t} else {\n\t\t\t\treturn null;\n\t\t\t}\n\t\t} else {\n\t\t\treturn null;\n\t\t}\n\t}\n\n\tpublic String getMetricName(EdgeNodeDescriptor edgeNodeDescriptor, SparkplugDescriptor sparkplugDescriptor,\n\t\t\tlong alias) {\n\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps = allEdgeNodeMetricMaps.get(edgeNodeDescriptor);\n\t\tif (edgeNodeMetricMaps != null) {\n\t\t\tMetricMap metricMap = edgeNodeMetricMaps.get(sparkplugDescriptor);\n\t\t\tif (metricMap != null) {\n\t\t\t\treturn metricMap.getMetricName(alias);\n\t\t\t} else {\n\t\t\t\treturn null;\n\t\t\t}\n\t\t} else {\n\t\t\treturn null;\n\t\t}\n\t}\n\n\tpublic boolean 
aliasExists(EdgeNodeDescriptor edgeNodeDescriptor, SparkplugDescriptor sparkplugDescriptor,\n\t\t\tlong alias) {\n\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps = allEdgeNodeMetricMaps.get(edgeNodeDescriptor);\n\t\tif (edgeNodeMetricMaps != null && edgeNodeMetricMaps.get(sparkplugDescriptor) != null) {\n\t\t\tMetricMap metricMap = edgeNodeMetricMaps.get(sparkplugDescriptor);\n\t\t\tif (metricMap != null && metricMap.getMetricName(alias) != null) {\n\t\t\t\treturn true;\n\t\t\t} else {\n\t\t\t\treturn false;\n\t\t\t}\n\t\t} else {\n\t\t\treturn false;\n\t\t}\n\t}\n\n\tpublic MetricDataTypeMap getMetricDataTypeMap(EdgeNodeDescriptor edgeNodeDescriptor,\n\t\t\tSparkplugDescriptor sparkplugDescriptor) {\n\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps = allEdgeNodeMetricMaps.get(edgeNodeDescriptor);\n\t\tif (edgeNodeMetricMaps != null && edgeNodeMetricMaps.get(sparkplugDescriptor) != null) {\n\t\t\treturn edgeNodeMetricMaps.get(sparkplugDescriptor).getMetricDataTypeMap();\n\t\t} else {\n\t\t\treturn null;\n\t\t}\n\t}\n\n\tpublic MetricDataType getDataType(EdgeNodeDescriptor edgeNodeDescriptor, SparkplugDescriptor sparkplugDescriptor,\n\t\t\tString metricName) {\n\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps = allEdgeNodeMetricMaps.get(edgeNodeDescriptor);\n\t\tif (edgeNodeMetricMaps != null && edgeNodeMetricMaps.get(sparkplugDescriptor) != null) {\n\t\t\treturn edgeNodeMetricMaps.get(sparkplugDescriptor).getMetricDataType(metricName);\n\t\t} else {\n\t\t\treturn null;\n\t\t}\n\t}\n\n\tpublic MetricDataType getDataType(EdgeNodeDescriptor edgeNodeDescriptor, SparkplugDescriptor sparkplugDescriptor,\n\t\t\tLong alias) {\n\t\tMap<SparkplugDescriptor, MetricMap> edgeNodeMetricMaps = allEdgeNodeMetricMaps.get(edgeNodeDescriptor);\n\t\tif (edgeNodeMetricMaps != null && edgeNodeMetricMaps.get(sparkplugDescriptor) != null) {\n\t\t\treturn edgeNodeMetricMaps.get(sparkplugDescriptor).getMetricDataType(alias);\n\t\t} else {\n\t\t\treturn 
null;\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/model/HostMetric.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host.model;\n\nimport java.util.Date;\n\nimport org.eclipse.tahu.SparkplugInvalidTypeException;\nimport org.eclipse.tahu.message.model.MetaData;\nimport org.eclipse.tahu.message.model.Metric;\nimport org.eclipse.tahu.message.model.MetricDataType;\nimport org.eclipse.tahu.message.model.PropertySet;\n\npublic class HostMetric extends Metric {\n\n\tprivate boolean stale;\n\n\tpublic HostMetric(boolean stale) {\n\t\tsuper();\n\t\tthis.stale = stale;\n\t}\n\n\tpublic HostMetric(String name, Long alias, Date timestamp, MetricDataType dataType, Boolean isHistorical,\n\t\t\tBoolean isTransient, MetaData metaData, PropertySet properties, Object value, boolean stale)\n\t\t\tthrows SparkplugInvalidTypeException {\n\t\tsuper(name, alias, timestamp, dataType, isHistorical, isTransient, metaData, properties, value);\n\t\tthis.stale = stale;\n\t}\n\n\tpublic HostMetric(Metric metric, boolean stale) throws SparkplugInvalidTypeException {\n\t\tthis(metric.getName(), metric.getAlias(), metric.getTimestamp(), metric.getDataType(), metric.isHistorical(),\n\t\t\t\tmetric.isTransient(), metric.getMetaData(), metric.getProperties(), metric.getValue(), stale);\n\t}\n\n\tpublic boolean isStale() {\n\t\treturn stale;\n\t}\n\n\tpublic void setStale(boolean stale) {\n\t\tthis.stale = stale;\n\t}\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/model/MessageContext.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host.model;\n\nimport org.eclipse.tahu.message.model.Message;\nimport org.eclipse.tahu.message.model.Message.MessageBuilder;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.Topic;\nimport org.eclipse.tahu.mqtt.MqttClientId;\nimport org.eclipse.tahu.mqtt.MqttServerName;\n\n/**\n * A container class to carry fields and objects associated with an MQTT message context.\n */\npublic class MessageContext {\n\n\tprivate final MqttServerName mqttServerName;\n\tprivate final MqttClientId hostAppMqttClientId;\n\tprivate final Message message;\n\tprivate final int payloadLength;\n\tprivate final long seqNum;\n\n\tpublic MessageContext(MqttServerName mqttServerName, MqttClientId hostAppMqttClientId, Topic topic,\n\t\t\tSparkplugBPayload payload, int payloadLength, long seqNum) {\n\t\tthis.mqttServerName = mqttServerName;\n\t\tthis.hostAppMqttClientId = hostAppMqttClientId;\n\t\tthis.message = new MessageBuilder(topic, payload).build();\n\t\tthis.payloadLength = payloadLength;\n\t\tthis.seqNum = seqNum;\n\t}\n\n\tpublic MqttServerName getMqttServerName() {\n\t\treturn mqttServerName;\n\t}\n\n\tpublic MqttClientId getHostAppMqttClientId() {\n\t\treturn hostAppMqttClientId;\n\t}\n\n\tpublic Message getMessage() {\n\t\treturn message;\n\t}\n\n\tpublic Topic getTopic() {\n\t\treturn message.getTopic();\n\t}\n\n\tpublic SparkplugBPayload getPayload() {\n\t\treturn 
message.getPayload();\n\t}\n\n\tpublic int getPayloadLength() {\n\t\treturn payloadLength;\n\t}\n\n\tpublic long getSeqNum() {\n\t\treturn seqNum;\n\t}\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/seq/SequenceReorderContext.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host.seq;\n\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.tahu.message.model.MessageType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.Topic;\nimport org.eclipse.tahu.mqtt.MqttClientId;\nimport org.eclipse.tahu.mqtt.MqttServerName;\n\npublic class SequenceReorderContext {\n\n\tprivate final String topicString;\n\tprivate final String[] splitTopic;\n\tprivate final Topic topic;\n\tprivate final MqttMessage message;\n\tprivate final SparkplugBPayload payload;\n\tprivate final MessageType messageType;\n\tprivate final MqttServerName mqttServerName;\n\tprivate final MqttClientId hostAppMqttClientId;\n\tprivate final long arrivedTime;\n\n\tpublic SequenceReorderContext(String topicString, Topic topic, MqttMessage message, SparkplugBPayload payload,\n\t\t\tMessageType messageType, MqttServerName mqttServerName, MqttClientId hostAppMqttClientId,\n\t\t\tlong arrivedTime) {\n\t\tthis.topicString = topicString;\n\t\tthis.splitTopic = topicString.split(\"/\");\n\t\tthis.topic = topic;\n\t\tthis.message = message;\n\t\tthis.payload = payload;\n\t\tthis.messageType = messageType;\n\t\tthis.mqttServerName = mqttServerName;\n\t\tthis.hostAppMqttClientId = hostAppMqttClientId;\n\t\tthis.arrivedTime = arrivedTime;\n\t}\n\n\tpublic String getTopicString() {\n\t\treturn topicString;\n\t}\n\n\tpublic String[] getSplitTopic() {\n\t\treturn 
splitTopic;\n\t}\n\n\tpublic Topic getTopic() {\n\t\treturn topic;\n\t}\n\n\tpublic MqttMessage getMessage() {\n\t\treturn message;\n\t}\n\n\tpublic SparkplugBPayload getPayload() {\n\t\treturn payload;\n\t}\n\n\tpublic MessageType getMessageType() {\n\t\treturn messageType;\n\t}\n\n\tpublic MqttServerName getMqttServerName() {\n\t\treturn mqttServerName;\n\t}\n\n\tpublic MqttClientId getHostAppMqttClientId() {\n\t\treturn hostAppMqttClientId;\n\t}\n\n\tpublic long getArrivedTime() {\n\t\treturn arrivedTime;\n\t}\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/seq/SequenceReorderManager.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host.seq;\n\nimport java.util.Calendar;\nimport java.util.Map;\nimport java.util.Timer;\nimport java.util.TimerTask;\nimport java.util.concurrent.ConcurrentHashMap;\nimport java.util.concurrent.ThreadPoolExecutor;\n\nimport org.eclipse.paho.client.mqttv3.MqttMessage;\nimport org.eclipse.tahu.SparkplugParsingException;\nimport org.eclipse.tahu.host.CommandPublisher;\nimport org.eclipse.tahu.host.TahuHostCallback;\nimport org.eclipse.tahu.host.TahuPayloadHandler;\nimport org.eclipse.tahu.host.api.HostApplicationEventHandler;\nimport org.eclipse.tahu.host.manager.EdgeNodeManager;\nimport org.eclipse.tahu.host.manager.SparkplugEdgeNode;\nimport org.eclipse.tahu.host.model.HostApplicationMetricMap;\nimport org.eclipse.tahu.message.PayloadDecoder;\nimport org.eclipse.tahu.message.SparkplugBPayloadDecoder;\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.MessageType;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.eclipse.tahu.message.model.Topic;\nimport org.eclipse.tahu.mqtt.MqttClientId;\nimport org.eclipse.tahu.mqtt.MqttServerName;\nimport org.eclipse.tahu.util.TopicUtil;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class SequenceReorderManager {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(SequenceReorderManager.class.getName());\n\n\tprivate static SequenceReorderManager instance;\n\n\tprivate 
static final long SEQUENCE_MONITOR_TIMER = 1000L;\n\n\tprivate final Map<EdgeNodeDescriptor, SequenceReorderMap> edgeNodeMap;\n\n\tprivate final Object edgeNodeMapLock = new Object();\n\n\tprivate Timer timer;\n\n\tprivate HostApplicationEventHandler eventHandler;\n\n\tprivate CommandPublisher commandPublisher;\n\n\tprivate PayloadDecoder<SparkplugBPayload> payloadDecoder;\n\n\tprivate Long timeout;\n\n\tprivate SequenceReorderManager() {\n\t\tthis.edgeNodeMap = new ConcurrentHashMap<>();\n\t}\n\n\tpublic static SequenceReorderManager getInstance() {\n\t\tif (instance == null) {\n\t\t\tinstance = new SequenceReorderManager();\n\t\t}\n\t\treturn instance;\n\t}\n\n\tpublic void init(HostApplicationEventHandler eventHandler, CommandPublisher commandPublisher,\n\t\t\tPayloadDecoder<SparkplugBPayload> payloadDecoder, Long timeout) {\n\t\tif (eventHandler != null && timeout != null) {\n\t\t\tthis.eventHandler = eventHandler;\n\t\t\tthis.commandPublisher = commandPublisher;\n\t\t\tthis.payloadDecoder = payloadDecoder;\n\t\t\tthis.timeout = timeout;\n\t\t} else {\n\t\t\tlogger.error(\"Not re-initializing the SequenceReorderManager\");\n\t\t}\n\t}\n\n\tpublic void start() {\n\t\tTimerTask monitorTask = new TimerTask() {\n\t\t\tpublic void run() {\n\t\t\t\tsynchronized (edgeNodeMapLock) {\n\t\t\t\t\tedgeNodeMap.values().forEach(sequenceReorderMap -> {\n\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\tif (!sequenceReorderMap.isEmpty()) {\n\t\t\t\t\t\t\t\tCalendar calendar = Calendar.getInstance();\n\t\t\t\t\t\t\t\tcalendar.add(Calendar.MILLISECOND, (int) (timeout * -1));\n\t\t\t\t\t\t\t\tif (sequenceReorderMap.getLastUpdateTime().before(calendar.getTime())) {\n\t\t\t\t\t\t\t\t\t// Timed out\n\t\t\t\t\t\t\t\t\tlogger.info(\"Timeout while reordering sequence numbers on {} with {} in queue\",\n\t\t\t\t\t\t\t\t\t\t\tsequenceReorderMap.getEdgeNodeDescriptor(), sequenceReorderMap.size());\n\t\t\t\t\t\t\t\t\tSequenceReorderContext sequenceReorderContext 
=\n\t\t\t\t\t\t\t\t\t\t\tsequenceReorderMap.getExpiredSequenceReorderContext(timeout);\n\t\t\t\t\t\t\t\t\tif (sequenceReorderContext != null) {\n\t\t\t\t\t\t\t\t\t\tTahuPayloadHandler handler =\n\t\t\t\t\t\t\t\t\t\t\t\tnew TahuPayloadHandler(eventHandler, commandPublisher, payloadDecoder);\n\t\t\t\t\t\t\t\t\t\tSparkplugEdgeNode edgeNode = EdgeNodeManager.getInstance()\n\t\t\t\t\t\t\t\t\t\t\t\t.getSparkplugEdgeNode(sequenceReorderMap.getEdgeNodeDescriptor());\n\n\t\t\t\t\t\t\t\t\t\t// Reset the map as all values are now invalid\n\t\t\t\t\t\t\t\t\t\tsequenceReorderMap.reset();\n\n\t\t\t\t\t\t\t\t\t\tif (edgeNode != null) {\n\t\t\t\t\t\t\t\t\t\t\tlogger.info(\"Requesting a rebirth from known edge node {}\",\n\t\t\t\t\t\t\t\t\t\t\t\t\tsequenceReorderMap.getEdgeNodeDescriptor());\n\t\t\t\t\t\t\t\t\t\t\tedgeNode.setHostAppMqttClientId(\n\t\t\t\t\t\t\t\t\t\t\t\t\tsequenceReorderContext.getHostAppMqttClientId());\n\t\t\t\t\t\t\t\t\t\t\tedgeNode.setMqttServerName(sequenceReorderContext.getMqttServerName());\n\t\t\t\t\t\t\t\t\t\t\thandler.requestRebirth(sequenceReorderContext.getMqttServerName(),\n\t\t\t\t\t\t\t\t\t\t\t\t\tsequenceReorderContext.getHostAppMqttClientId(),\n\t\t\t\t\t\t\t\t\t\t\t\t\tsequenceReorderMap.getEdgeNodeDescriptor(), edgeNode);\n\t\t\t\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\t\t\t\tlogger.info(\"Requesting a rebirth from unknown edge node {}\",\n\t\t\t\t\t\t\t\t\t\t\t\t\tsequenceReorderMap.getEdgeNodeDescriptor());\n\t\t\t\t\t\t\t\t\t\t\thandler.requestRebirth(sequenceReorderContext.getMqttServerName(),\n\t\t\t\t\t\t\t\t\t\t\t\t\tsequenceReorderContext.getHostAppMqttClientId(),\n\t\t\t\t\t\t\t\t\t\t\t\t\tsequenceReorderMap.getEdgeNodeDescriptor());\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t} catch (Exception e) {\n\t\t\t\t\t\t\tlogger.error(\"Failed to handle reorder entry in monitor\", e);\n\t\t\t\t\t\t}\n\t\t\t\t\t});\n\t\t\t\t}\n\t\t\t}\n\t\t};\n\t\ttimer = new 
Timer(\"SequenceMonitorTimer\");\n\t\ttimer.scheduleAtFixedRate(monitorTask, SEQUENCE_MONITOR_TIMER, SEQUENCE_MONITOR_TIMER);\n\t}\n\n\tpublic void stop() {\n\t\tif (timer != null) {\n\t\t\ttimer.cancel();\n\t\t\ttimer = null;\n\t\t}\n\t}\n\n\t/**\n\t * Handles a {@link SparkplugBPayload} when sequence number reordering is enabled. This method buffers\n\t * messages as they arrive and reorders them based on sequence numbers within a given timeout period.\n\t *\n\t * @param tahuHostCallback the {@link TahuHostCallback} used to process messages\n\t * @param executor the {@link ThreadPoolExecutor} used to dispatch message handling\n\t * @param topicString the raw MQTT topic string\n\t * @param splitTopic the topic string split on the '/' delimiter\n\t * @param message the incoming {@link MqttMessage}\n\t * @param mqttServerName the {@link MqttServerName} of the MQTT server the message arrived on\n\t * @param mqttClientId the {@link MqttClientId} of the host application's MQTT client\n\t * @param arrivedTime the arrival time of the message in milliseconds since epoch\n\t * @throws Exception if the payload cannot be decoded or the message cannot be handled\n\t */\n\tpublic void handlePayload(TahuHostCallback tahuHostCallback, ThreadPoolExecutor executor, final String topicString,\n\t\t\tfinal String[] splitTopic, final MqttMessage message, final MqttServerName mqttServerName,\n\t\t\tfinal MqttClientId mqttClientId, final long arrivedTime) throws Exception {\n\n\t\t// Get the Topic and MessageType\n\t\tTopic topic;\n\t\ttry {\n\t\t\ttopic = TopicUtil.parseTopic(splitTopic);\n\t\t} catch (SparkplugParsingException e) {\n\t\t\tlogger.error(\"Error parsing topic\", e);\n\t\t\treturn;\n\t\t}\n\t\tMessageType messageType = topic.getType();\n\n\t\t// Early return for commands\n\t\tif (messageType == MessageType.NCMD || messageType == MessageType.DCMD) {\n\t\t\treturn;\n\t\t}\n\n\t\t// Parse the payload\n\t\tPayloadDecoder<SparkplugBPayload> decoder = new SparkplugBPayloadDecoder();\n\t\tSparkplugBPayload payload = decoder.buildFromByteArray(message.getPayload(), HostApplicationMetricMap\n\t\t\t\t.getInstance().getMetricDataTypeMap(topic.getEdgeNodeDescriptor(), topic.getSparkplugDescriptor()));\n\t\tlogger.trace(\"Incoming payload: {}\", payload);\n\n\t\tsynchronized (edgeNodeMapLock) {\n\t\t\t// See if the Edge Node is known and add if not\n\t\t\tEdgeNodeDescriptor 
edgeNodeDescriptor = new EdgeNodeDescriptor(topic.getGroupId(), topic.getEdgeNodeId());\n\t\t\tSequenceReorderMap sequenceReorderMap =\n\t\t\t\t\tedgeNodeMap.computeIfAbsent(edgeNodeDescriptor, (k) -> new SequenceReorderMap(edgeNodeDescriptor));\n\n\t\t\tif (topic.isType(MessageType.NBIRTH)) {\n\t\t\t\t// Reset the expected sequence number to zero\n\t\t\t\tlogger.debug(\"Resetting sequenceReorderMap on NBIRTH for {}\", edgeNodeDescriptor);\n\t\t\t\tsequenceReorderMap.resetSeqNum();\n\t\t\t} else if (topic.isType(MessageType.NDEATH)) {\n\t\t\t\t// Handle NDEATH immediately and return\n\t\t\t\thandleMessage(tahuHostCallback, executor, new SequenceReorderContext(topicString, topic, message,\n\t\t\t\t\t\tpayload, messageType, mqttServerName, mqttClientId, arrivedTime));\n\t\t\t\treturn;\n\t\t\t} else if (topic.isType(MessageType.NCMD) || topic.isType(MessageType.DCMD)) {\n\t\t\t\t// Ignore NCMD and DCMD\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\t// See if this is the next expected sequence number\n\t\t\tboolean passedSeqNumCheck = false;\n\t\t\tif (payload == null || payload.getSeq() == null) {\n\t\t\t\tlogger.warn(\"Invalid payload arrived on topic={} with {}\", topic,\n\t\t\t\t\t\tpayload == null\n\t\t\t\t\t\t\t\t? \"'payload is null'\"\n\t\t\t\t\t\t\t\t: payload.getSeq() == null\n\t\t\t\t\t\t\t\t\t\t? 
\"'payload sequence number is null'\"\n\t\t\t\t\t\t\t\t\t\t: \"sequence number is present - shouldn't have gotten here\");\n\t\t\t} else {\n\t\t\t\tpassedSeqNumCheck = sequenceReorderMap.liveSeqNumCheck(payload.getSeq());\n\t\t\t}\n\n\t\t\tif (passedSeqNumCheck) {\n\t\t\t\t// Set the session state\n\t\t\t\tif (topic.isType(MessageType.NBIRTH)) {\n\t\t\t\t\tsequenceReorderMap.prune(payload.getTimestamp());\n\t\t\t\t}\n\n\t\t\t\t// This is the next expected message - process it\n\t\t\t\tlogger.debug(\"Handling real time message on {} with seqNum={}\", topicString, payload.getSeq());\n\t\t\t\thandleMessage(tahuHostCallback, executor, new SequenceReorderContext(topicString, topic, message,\n\t\t\t\t\t\tpayload, messageType, mqttServerName, mqttClientId, arrivedTime));\n\n\t\t\t\t// Now check to see if there are other messages to process\n\t\t\t\tif (!sequenceReorderMap.isEmpty()) {\n\t\t\t\t\tboolean done = false;\n\t\t\t\t\tlong nextSeqNum = getNextSeqNum(payload.getSeq());\n\t\t\t\t\twhile (!done && !sequenceReorderMap.isEmpty()) {\n\t\t\t\t\t\tSequenceReorderContext sequenceReorderContext =\n\t\t\t\t\t\t\t\tsequenceReorderMap.storedSeqNumCheck(nextSeqNum);\n\t\t\t\t\t\tif (sequenceReorderContext != null) {\n\t\t\t\t\t\t\t// This is the next expected message - publish it\n\t\t\t\t\t\t\tlogger.debug(\"Handling stored message on {} with seqNum={}\", topicString, nextSeqNum);\n\t\t\t\t\t\t\thandleMessage(tahuHostCallback, executor, new SequenceReorderContext(\n//\t\t\t\t\t\t\t\t\t\t\tsequenceReorderContext.getSettingsAccessor(),\n//\t\t\t\t\t\t\t\t\t\t\tsequenceReorderContext.getEdgeNodesAccessor(),\n\t\t\t\t\t\t\t\t\tsequenceReorderContext.getTopicString(), sequenceReorderContext.getTopic(),\n\t\t\t\t\t\t\t\t\tsequenceReorderContext.getMessage(), sequenceReorderContext.getPayload(),\n\t\t\t\t\t\t\t\t\tsequenceReorderContext.getMessageType(), 
sequenceReorderContext.getMqttServerName(),\n\t\t\t\t\t\t\t\t\tsequenceReorderContext.getHostAppMqttClientId(),\n\t\t\t\t\t\t\t\t\tsequenceReorderContext.getArrivedTime()));\n\t\t\t\t\t\t\tnextSeqNum = getNextSeqNum(nextSeqNum);\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tlogger.debug(\"Failed to find SequenceReorderContext for {} - moving on\", nextSeqNum);\n\t\t\t\t\t\t\tdone = true;\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\t// This is not the next expected message - store it after handling session state\n\t\t\t\tlogger.debug(\"Storing message on {} due to out of sequence message with seqNum={} - was expecting {}\",\n\t\t\t\t\t\ttopicString, payload.getSeq(), sequenceReorderMap.getNextExpectedSeqNum());\n\t\t\t\tSequenceReorderContext sequenceReorderContext = new SequenceReorderContext(topicString, topic, message,\n\t\t\t\t\t\tpayload, messageType, mqttServerName, mqttClientId, arrivedTime);\n\t\t\t\tsequenceReorderMap.put(payload.getSeq(), sequenceReorderContext);\n\t\t\t}\n\t\t}\n\t}\n\n\t/**\n\t * Removes an Edge Node from the {@link SequenceReorderManager}. 
This should be used any time an Edge Node goes\n\t * offline.\n\t * \n\t * @param edgeNodeDescriptor the {@link EdgeNodeDescriptor} of the Edge Node to remove\n\t */\n\tpublic void removeEdgeNode(EdgeNodeDescriptor edgeNodeDescriptor) {\n\t\tsynchronized (edgeNodeMapLock) {\n\t\t\tedgeNodeMap.remove(edgeNodeDescriptor);\n\t\t}\n\t}\n\n\tprivate long getNextSeqNum(long currentSeqNum) {\n\t\tlong nextSeqNum = currentSeqNum + 1;\n\t\tif (nextSeqNum == 256) {\n\t\t\tnextSeqNum = 0;\n\t\t}\n\t\treturn nextSeqNum;\n\t}\n\n\tprivate void handleMessage(TahuHostCallback tahuHostCallback, ThreadPoolExecutor executor,\n\t\t\tSequenceReorderContext sequenceReorderContext) {\n\t\texecutor.execute(() -> {\n\t\t\ttry {\n\t\t\t\t// Handle the SparkplugBPayload\n\t\t\t\tnew TahuPayloadHandler(eventHandler, commandPublisher, payloadDecoder).handlePayload(\n\t\t\t\t\t\tsequenceReorderContext.getTopicString(), sequenceReorderContext.getSplitTopic(),\n\t\t\t\t\t\tsequenceReorderContext.getMessage(), sequenceReorderContext.getMqttServerName(),\n\t\t\t\t\t\tsequenceReorderContext.getHostAppMqttClientId());\n\n\t\t\t} catch (Throwable t) {\n\t\t\t\tlogger.error(\"Failed to handle Sparkplug B message on topic {} - requesting rebirth\",\n\t\t\t\t\t\tsequenceReorderContext.getTopic(), t);\n\t\t\t\tnew TahuPayloadHandler(eventHandler, commandPublisher, payloadDecoder).requestRebirth(\n\t\t\t\t\t\tsequenceReorderContext.getMqttServerName(), sequenceReorderContext.getHostAppMqttClientId(),\n\t\t\t\t\t\tsequenceReorderContext.getTopic().getEdgeNodeDescriptor());\n\t\t\t} finally {\n\t\t\t\t// Update the message latency\n\t\t\t\tlong latency = System.nanoTime() - sequenceReorderContext.getArrivedTime();\n\t\t\t\tif (logger.isTraceEnabled()) {\n\t\t\t\t\tlogger.trace(\"Updating message processing latency {}\", latency);\n\t\t\t\t}\n\t\t\t}\n\t\t});\n\t}\n}\n"
  },
  {
    "path": "java/lib/host/src/main/java/org/eclipse/tahu/host/seq/SequenceReorderMap.java",
    "content": "/********************************************************************************\n * Copyright (c) 2022 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\npackage org.eclipse.tahu.host.seq;\n\nimport java.util.Calendar;\nimport java.util.Date;\nimport java.util.Iterator;\nimport java.util.Map;\nimport java.util.concurrent.ConcurrentSkipListMap;\n\nimport org.eclipse.tahu.message.model.EdgeNodeDescriptor;\nimport org.eclipse.tahu.message.model.SparkplugBPayload;\nimport org.slf4j.Logger;\nimport org.slf4j.LoggerFactory;\n\npublic class SequenceReorderMap {\n\n\tprivate static Logger logger = LoggerFactory.getLogger(SequenceReorderMap.class.getName());\n\n\tprivate final EdgeNodeDescriptor edgeNodeDescriptor;\n\n\tprivate final Map<Long, SequenceReorderContext> sequenceMap;\n\n\tprivate volatile long expectedSeqNum;\n\n\tprivate volatile Date lastUpdateTime;\n\n\tprivate final Object seqLock = new Object();\n\n\tpublic SequenceReorderMap(EdgeNodeDescriptor edgeNodeDescriptor) {\n\t\tthis.edgeNodeDescriptor = edgeNodeDescriptor;\n\t\texpectedSeqNum = 0;\n\t\tlastUpdateTime = new Date();\n\t\tsequenceMap = new ConcurrentSkipListMap<>();\n\t}\n\n\tpublic EdgeNodeDescriptor getEdgeNodeDescriptor() {\n\t\treturn edgeNodeDescriptor;\n\t}\n\n\tpublic long getNextExpectedSeqNum() {\n\t\treturn expectedSeqNum;\n\t}\n\n\tpublic boolean liveSeqNumCheck(long toMatch) {\n\t\tsynchronized (seqLock) {\n\t\t\tboolean match = (toMatch == expectedSeqNum);\n\t\t\tlogger.trace(\"{} in liveSeqNumCheck - expected={} to actual={}\", match ? 
\"MATCHED\" : \"NOT MATCHED\",\n\t\t\t\t\texpectedSeqNum, toMatch);\n\t\t\tif (match) {\n\t\t\t\tincrementExpectedSeqNum();\n\t\t\t}\n\t\t\treturn match;\n\t\t}\n\t}\n\n\t/**\n\t * Checks if a sequence number matches the current expected one. If true, it increments the expected sequence number\n\t * and then returns the SequenceReorderContext for the message to be handled. It also removes that\n\t * SequenceReorderContext from the Map.\n\t * \n\t * @param toMatch the sequence number to check against the next expected sequence number\n\t * @return the SequenceReorderContext associated with the payload if the sequence number check passed, otherwise\n\t *         null is returned\n\t */\n\tpublic SequenceReorderContext storedSeqNumCheck(long toMatch) {\n\t\tsynchronized (seqLock) {\n\t\t\tSequenceReorderContext sequenceReorderContext = sequenceMap.remove(toMatch);\n\t\t\tif (sequenceReorderContext != null) {\n\t\t\t\tlogger.trace(\"MATCHED in storedSeqNumCheck - Found stored message for {}\", toMatch);\n\t\t\t\tincrementExpectedSeqNum();\n\t\t\t}\n\t\t\treturn sequenceReorderContext;\n\t\t}\n\t}\n\n\tpublic void resetSeqNum() {\n\t\tsynchronized (seqLock) {\n\t\t\texpectedSeqNum = 0;\n\t\t}\n\t}\n\n\t/**\n\t * Increments the sequence number and wraps if required\n\t */\n\tprivate void incrementExpectedSeqNum() {\n\t\tsynchronized (seqLock) {\n\t\t\t// Update the last update time and increment\n\t\t\tlastUpdateTime = new Date();\n\t\t\texpectedSeqNum++;\n\t\t\tif (expectedSeqNum == 256) {\n\t\t\t\texpectedSeqNum = 0;\n\t\t\t}\n\t\t}\n\t}\n\n\t/**\n\t * Adds a new {@link SequenceReorderContext} to the list by its sequence number\n\t * \n\t * @param seqNum the sequence number key\n\t * @param sequenceReorderContext the {@link SequenceReorderContext} associated with the sequence number\n\t */\n\tpublic void put(long seqNum, SequenceReorderContext sequenceReorderContext) {\n\t\tsynchronized (seqLock) {\n\t\t\tsequenceMap.put(seqNum, sequenceReorderContext);\n\t\t}\n\t}\n\n\t/**\n\t * Removes all messages 
in the map that are older than the NBIRTH\n\t *\n\t * @param nBirthDate the {@link Date} associated with the incoming NBIRTH\n\t */\n\tpublic void prune(Date nBirthDate) {\n\t\tif (nBirthDate == null) {\n\t\t\tlogger.error(\"Attempting to prune messages from the SequenceReorderMap failed. NBIRTH timestamp is null\");\n\t\t\treturn;\n\t\t}\n\n\t\tsynchronized (seqLock) {\n\t\t\tlogger.debug(\"Pruning with date {}\", nBirthDate);\n\t\t\tIterator<SequenceReorderContext> it = sequenceMap.values().iterator();\n\t\t\twhile (it.hasNext()) {\n\t\t\t\tSequenceReorderContext sequenceReorderContext = it.next();\n\t\t\t\tif (sequenceReorderContext != null && sequenceReorderContext.getPayload() != null\n\t\t\t\t\t\t&& sequenceReorderContext.getPayload().getTimestamp() != null\n\t\t\t\t\t\t&& sequenceReorderContext.getPayload().getTimestamp().before(nBirthDate)) {\n\t\t\t\t\tlogger.debug(\"Removing old message {}\", sequenceReorderContext.getTopic());\n\t\t\t\t\tit.remove();\n\t\t\t\t} else {\n\t\t\t\t\tlogger.debug(\"Checked {} - not removing because {} is after {}\", sequenceReorderContext.getTopic(),\n\t\t\t\t\t\t\tsequenceReorderContext.getPayload().getTimestamp(), nBirthDate);\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\tpublic void reset() {\n\t\tsynchronized (seqLock) {\n\t\t\texpectedSeqNum = 0;\n\t\t\tlastUpdateTime = new Date();\n\t\t\tsequenceMap.clear();\n\t\t}\n\t}\n\n\tpublic SequenceReorderContext getExpiredSequenceReorderContext(long timeout) {\n\t\tsynchronized (seqLock) {\n\t\t\tif (!sequenceMap.isEmpty()) {\n\t\t\t\tCalendar calendar = Calendar.getInstance();\n\t\t\t\tcalendar.add(Calendar.MILLISECOND, (int) (timeout * -1));\n\t\t\t\tif (lastUpdateTime.before(calendar.getTime())) {\n\t\t\t\t\treturn sequenceMap.values().iterator().next();\n\t\t\t\t}\n\t\t\t}\n\n\t\t\t// Didn't find an expired entry\n\t\t\treturn null;\n\t\t}\n\t}\n\n\tpublic int size() {\n\t\tsynchronized (seqLock) {\n\t\t\treturn sequenceMap.size();\n\t\t}\n\t}\n\n\tpublic boolean isEmpty() 
{\n\t\tsynchronized (seqLock) {\n\t\t\treturn sequenceMap.isEmpty();\n\t\t}\n\t}\n\n\tpublic Date getLastUpdateTime() {\n\t\treturn lastUpdateTime;\n\t}\n}\n"
  },
  {
    "path": "java/pom.xml",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<!--/********************************************************************************\n * Copyright (c) 2014-2020 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n<project xmlns=\"http://maven.apache.org/POM/4.0.0\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\"\n  xsi:schemaLocation=\"http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd\">\n  <modelVersion>4.0.0</modelVersion>\n\n  <properties>\n    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>\n    <main.basedir>${project.basedir}</main.basedir>\n    <paho.version>1.2.5</paho.version>\n    <jackson.version>2.13.4</jackson.version>\n    <jackson.databind.version>2.13.4.2</jackson.databind.version>\n    <slf4j.version>1.7.32</slf4j.version>\n    <logback.version>1.2.11</logback.version>\n    <protobuf.version>3.16.3</protobuf.version>\n    <maven.compiler.version>3.8.1</maven.compiler.version>\n    <maven.bundle.version>5.1.1</maven.bundle.version>\n  </properties>\n\n  <groupId>org.eclipse.tahu</groupId>\n  <artifactId>tahu</artifactId>\n  <version>1.0.7</version>\n  <packaging>pom</packaging>\n\n  <name>Eclipse Tahu</name>\n  <url>http://www.eclipse.org/tahu</url>\n  <description>\n    The Tahu project provides open-source implementations of Eclipse Sparkplug\n  </description>\n\n  <organization>\n    <name>Eclipse Tahu</name>\n    <url>http://www.eclipse.org/tahu</url>\n  </organization>\n\n  <developers>\n    <developer>\n      <id>ibinshtok</id>\n      <name>Ilya Binshtok</name>\n      <organization>Cirrus Link 
Solutions</organization>\n      <organizationUrl>http://www.cirrus-link.com</organizationUrl>\n      <roles>\n        <role>Developer</role>\n      </roles>\n    </developer>\n    <developer>\n      <id>nathandavenport</id>\n      <name>Nathan Davenport</name>\n      <organization>Cirrus Link Solutions</organization>\n      <organizationUrl>http://www.cirrus-link.com</organizationUrl>\n      <roles>\n        <role>Developer</role>\n      </roles>\n    </developer>\n    <developer>\n      <id>wes-johnson</id>\n      <name>Wes Johnson</name>\n      <organization>Cirrus Link Solutions</organization>\n      <organizationUrl>http://www.cirrus-link.com</organizationUrl>\n      <roles>\n        <role>Developer</role>\n      </roles>\n    </developer>\n    <developer>\n      <id>ckienle</id>\n      <name>Chad Kienle</name>\n      <organization>Cirrus Link Solutions</organization>\n      <organizationUrl>http://www.cirrus-link.com</organizationUrl>\n      <roles>\n        <role>Developer</role>\n      </roles>\n    </developer>\n  </developers>\n  <licenses>\n    <license>\n      <name>Eclipse Public License - Version 2.0</name>\n      <url>https://www.eclipse.org/legal/epl-2.0</url>\n    </license>\n  </licenses>\n\n  <scm>\n    <url>https://github.com/eclipse/tahu.git</url>\n    <connection>scm:git:git@github.com:eclipse/tahu.git</connection>\n  </scm>\n\n  <distributionManagement>\n    <snapshotRepository>\n      <id>ossrh</id>\n      <url>https://oss.sonatype.org/content/repositories/snapshots</url>\n    </snapshotRepository>\n    <repository>\n      <id>ossrh</id>\n      <url>https://oss.sonatype.org/service/local/staging/deploy/maven2/</url>\n    </repository>\n  </distributionManagement>\n\n  <modules>\n    <module>lib/core/pom.xml</module>\n\n    <!--\n      Do not change the order of these modules below - the last module MUST be deployed to Maven Central per\n      https://issues.sonatype.org/browse/NEXUS-9138\n    -->\n    <module>lib/edge/pom.xml</module>\n   
 <module>lib/host/pom.xml</module>\n    <module>compat_impl/edge/pom.xml</module>\n    <module>compat_impl/host/pom.xml</module>\n  </modules>\n\n  <dependencies>\n    <dependency>\n      <groupId>junit</groupId>\n      <artifactId>junit</artifactId>\n      <version>4.13.2</version>\n      <scope>test</scope>\n    </dependency>\n    <dependency>\n      <groupId>org.testng</groupId>\n      <artifactId>testng</artifactId>\n      <version>6.9.10</version>\n      <scope>test</scope>\n    </dependency>\n    <dependency>\n      <groupId>org.assertj</groupId>\n      <artifactId>assertj-core</artifactId>\n      <version>3.5.1</version>\n      <scope>test</scope>\n    </dependency>\n    <dependency>\n      <groupId>org.eclipse.paho</groupId>\n      <artifactId>org.eclipse.paho.client.mqttv3</artifactId>\n      <version>${paho.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>com.google.protobuf</groupId>\n      <artifactId>protobuf-java</artifactId>\n      <version>${protobuf.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>commons-io</groupId>\n      <artifactId>commons-io</artifactId>\n      <version>2.11.0</version>\n    </dependency>\n    <dependency>\n      <groupId>com.fasterxml.jackson.core</groupId>\n      <artifactId>jackson-core</artifactId>\n      <version>${jackson.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>com.fasterxml.jackson.core</groupId>\n      <artifactId>jackson-annotations</artifactId>\n      <version>${jackson.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>com.fasterxml.jackson.core</groupId>\n      <artifactId>jackson-databind</artifactId>\n      <version>${jackson.databind.version}</version>\n    </dependency>\n\n    <!-- Logging -->\n    <dependency>\n      <groupId>org.slf4j</groupId>\n      <artifactId>slf4j-api</artifactId>\n      <version>${slf4j.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>ch.qos.logback</groupId>\n      
<artifactId>logback-classic</artifactId>\n      <version>${logback.version}</version>\n    </dependency>\n    <dependency>\n      <groupId>ch.qos.logback</groupId>\n      <artifactId>logback-core</artifactId>\n      <version>${logback.version}</version>\n    </dependency>\n  </dependencies>\n\n  <build>\n    <plugins>\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-compiler-plugin</artifactId>\n        <version>${maven.compiler.version}</version>\n        <configuration>\n          <source>1.8</source>\n          <target>1.8</target>\n          <encoding>UTF-8</encoding>\n        </configuration>\n      </plugin>\n      <plugin>\n        <groupId>org.apache.maven.plugins</groupId>\n        <artifactId>maven-gpg-plugin</artifactId>\n        <version>1.6</version>\n        <executions>\n          <execution>\n            <id>sign-artifacts</id>\n            <phase>deploy</phase>\n            <goals>\n              <goal>sign</goal>\n            </goals>\n            <configuration>\n                <gpgArguments>\n                    <arg>--pinentry-mode</arg>\n                    <arg>loopback</arg>\n                </gpgArguments>\n            </configuration>\n          </execution>\n        </executions>\n      </plugin>\n      <plugin>\n        <!--\n          Use the Nexus Staging plugin as a full replacement for the standard\n          Maven Deploy plugin.\n          See https://github.com/sonatype/nexus-maven-plugins/tree/master/staging/maven-plugin\n          for why this makes sense :-)\n          We can control whether we want to deploy to the Eclipse repo or Maven Central\n          by a combination of the version being a SNAPSHOT or release version and the property\n          skipStaging=true/false.\n          In any case we can take advantage of the plugin's \"deferred deploy\" feature which\n          makes sure that all artifacts of a multi-module project are deployed as a whole\n          at the end of the build 
process instead of deploying each module's artifacts\n          individually as part of building the module.\n        -->\n        <groupId>org.sonatype.plugins</groupId>\n        <artifactId>nexus-staging-maven-plugin</artifactId>\n        <version>1.6.8</version>\n        <extensions>true</extensions>\n        <configuration>\n          <serverId>ossrh</serverId>\n          <nexusUrl>https://oss.sonatype.org/</nexusUrl>\n          <autoReleaseAfterClose>true</autoReleaseAfterClose>\n          <skipNexusStagingDeployMojo>false</skipNexusStagingDeployMojo>\n        </configuration>\n      </plugin>\n    </plugins>\n    <extensions>\n      <extension>\n        <groupId>org.kuali.maven.wagons</groupId>\n        <artifactId>maven-s3-wagon</artifactId>\n        <version>1.2.1</version>\n      </extension>\n    </extensions>\n  </build>\n</project>\n"
  },
  {
    "path": "javascript/core/node-red-contrib-sparkplug/README.md",
    "content": "node-red-contrib-sparkplug\n=========\n\nA node for an MQTT Edge Node client for MQTT device communication using the\nSparkplug Specification from Cirrus Link Solutions.  \n\nhttps://s3.amazonaws.com/ignition-modules/Current/Sparkplug+Specification.pdf\n\nThe client will connect to an MQTT Server and act as an MQTT Edge of Network\n(EoN) Node.  It will publish birth certificates (NBIRTH), node data messages\n(NDATA), and process node command messages (NCMD) that have been sent from\nanother MQTT client.\n\nThe client also provides an interface for other nodes to publish device birth\ncertificates (DBIRTH), device data messages (DDATA), device death certificates\n(DDEATH), and receive device command messages (DCMD) that have been sent from\nanother MQTT client.\n\nAdditional details on the example payloads below can be found here:\n\nhttps://www.npmjs.com/package/sparkplug-client\n\n## Installation\n\n  npm install node-red-contrib-sparkplug\n\n## Usage\n\n### Configuring the Sparkplug Node\n\nWhen editing the Sparkplug Node the following properties are configurable:\n\n* ServerUrl: The URL of the MQTT server.\n* Port: The port of the MQTT server.\n* Username: The username for the MQTT server connection.\n* Password: The password for the MQTT server connection.\n* Client ID: A unique client ID for the MQTT server connection.\n* Group ID: An ID representing a logical grouping of MQTT EoN Nodes and Devices\n  into the infrastructure.\n* Edge Node: An ID that uniquely identifies the MQTT EoN Node within the\n  infrastructure.\n* Version: The Sparkplug version (currently: A or B).\n* Enable Cache: Whether to enable EoN node caching.\n* Publish Death: Whether to publish the edge node's death certificate when the \n  client cleanly disconnects.\n\nUpon deploying the flow, the Sparkplug Node will automatically connect to\nthe MQTT Server. 
When the flow is stopped, the Sparkplug Node will cleanly close\ndown the client connection by disconnecting from the MQTT Server.\n\n### Sparkplug Node Inputs\n\nThe Sparkplug Node expects input messages to be received on topics of the\nformat:  *deviceId*/*type*.\n\nAcceptable values for each token in the topic are:\n\n * *deviceId*: A unique device ID string that does not contain the following\n   reserved characters: '/', '#', '+'.\n * *type*: DDATA | DBIRTH | DDEATH\n\nThe payload of each message will depend on the message type.  The data types \nsupported for payload metrics differ based on the Sparkplug version. A full \ndescription of each version's format and data type support is beyond the scope of \nthis readme and can be found in the Sparkplug specification linked above. The \nexamples in this readme will use Sparkplug B.\n\n#### DBIRTH message\n\nTopic:  *deviceId*/DBIRTH  \nPayload:  An object with a \"timestamp\" (required), array of ALL \"metric\" objects\n         (required).  \nExample:\n\n```javascript\n{\n    \"timestamp\" : 1465577611580,\n    \"metrics\" : [\n        {\n            \"name\" : \"my_int\",\n            \"value\" : 456,\n            \"type\" : \"int32\"\n        },\n        {\n            \"name\" : \"my_float\",\n            \"value\" : 456,\n            \"type\" : \"float\"\n        }\n    ]\n}\n```\n\n#### DDATA message\n\nTopic: *deviceId*/DDATA  \nPayload: An object with a \"timestamp\" (required), array of one or more \"metric\"\n         objects (required), and \"position\" (optional).  \nExample:\n\n```javascript\n{\n    \"timestamp\" : 1465577611580,\n    \"metrics\" : [\n        {\n            \"name\" : \"my_int\",\n            \"value\" : 456,\n            \"type\" : \"int32\"\n        }\n    ]\n}\n```\n\n#### DDEATH message\n\nTopic: *deviceId*/DDEATH  \nPayload: An object with a \"timestamp\" (required).  
\nExample:\n\n```javascript\n{\n    \"timestamp\" : 1465577611580\n}\n```\n\nFor each metric included in the payloads, the following types are supported:\nint, long, float, double, boolean, string, bytes.\n\n### Sparkplug Node Outputs\n\nThe Sparkplug Node sends output messages in order to notify other nodes of a\n'rebirth' request or to send a device command (DCMD) request.\n\n#### 'rebirth' message\n\nTopic: rebirth  \nPayload: {}\n\nThe Sparkplug Node sends a 'rebirth' message in order to force all device nodes to\nresend DBIRTH messages. This message is sent once upon the deployment of the\nflow, after the Sparkplug Node has connected with the MQTT Server, and also\nevery time the Sparkplug Node receives a node command (NCMD) message requesting\na rebirth from itself and all devices.\n\n#### command message\n\nTopic: *deviceId*  \nPayload: An object with an array of one or more \"metric\" objects (required).  \nExample:\n\n```javascript\n{\n    \"metrics\" : [\n        {\n            \"name\" : \"my_int\",\n            \"value\" : 456,\n            \"type\" : \"int32\"\n        },\n        {\n            \"name\" : \"my_float\",\n            \"value\" : 456,\n            \"type\" : \"float\"\n        }\n    ]\n}\n```\n\nA Sparkplug Node sends a command message every time it receives a device command\n(DCMD) message requesting write operations to the metrics of a specific device.\nThe message will contain a single device ID in the topic and the payload will\nspecify the metrics/values to write to the device. 
The device node specified by\nthe device ID should process the command message and then send a DDATA message\ncontaining any metric values that have changed or been successfully written to.\n\n## Release History\n\n* 1.0.0 Initial release\n* 1.0.2 Bug Fixes\n* 1.1.0 Added connection status indicator, changed category\n* 1.2.0 Added 'Publish Death' config option, and mouseover config descriptions\n* 1.2.1 Added \"node-red\" keyword\n* 2.0.0 Added support for Sparkplug B and made version configurable\n* 2.1.0 Updated sparkplug-client version to 3.0.0\n* 2.1.1 Updated License and repo links\n\n## License\n\nCopyright (c) 2016 Cirrus Link Solutions\n\nAll rights reserved. This program and the accompanying materials\nare made available under the terms of the Eclipse Public License v1.0\nwhich accompanies this distribution, and is available at\nhttp://www.eclipse.org/legal/epl-v10.html\n\nContributors: Cirrus Link Solutions\n"
  },
  {
    "path": "javascript/core/node-red-contrib-sparkplug/package.json",
    "content": "{\n  \"name\": \"node-red-contrib-sparkplug\",\n  \"version\": \"2.1.1\",\n  \"description\": \"A Sparkplug node for Node-RED\",\n  \"license\": \"EPL-2.0\",\n  \"repository\": {\n    \"type\": \"git\",\n    \"url\": \"git+https://github.com/eclipse/tahu.git\"\n  },\n  \"node-red\": {\n    \"nodes\": {\n      \"sparkplug\": \"sparkplug/sparkplug.js\"\n    }\n  },\n  \"bugs\": {\n    \"url\": \"https://github.com/eclipse/tahu/issues\"\n  },\n  \"homepage\": \"https://github.com/eclipse/tahu\",\n  \"main\": \"sparkplug.js\",\n  \"keywords\": [\n    \"tahu\",\n    \"mqtt\",\n    \"sparkplug\",\n    \"node-red\"\n  ],\n  \"author\": \"Chad Kienle <chad.kienle@cirrus-link.com> (http://www.cirrus-link.com)\",\n  \"dependencies\": {\n    \"sparkplug-client\": \"^3.0.0\"\n  },\n  \"files\": [\n    \"sparkplug/sparkplug.html\",\n    \"sparkplug/sparkplug.js\",\n    \"LICENSE\",\n    \"README.md\"\n  ]\n}\n"
  },
  {
    "path": "javascript/core/node-red-contrib-sparkplug/sparkplug/sparkplug.html",
    "content": "<!--\n/********************************************************************************\n * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n-->\n\n<script type=\"text/javascript\">\n    RED.nodes.registerType('sparkplug',{\n        category: 'function',\n        defaults: {\n            name: {value:\"\"},\n            broker: {value:\"tcp://localhost\", required:true},\n            port: {value:\"1883\", required:true},\n            clientid: {value:'NodeREDSimpleEdgeNode', required:true},\n            groupid: {value:'Sparkplug Devices', required:true},\n            edgenode: {value:'Node-RED Edge Node', required:true},\n            version:{value: \"spBv1.0\"},\n            enablecache: {value: \"false\"},\n            publishdeath: {value: \"true\"},\n            user: {value: \"admin\"},\n            password: {value: \"changeme\"}\n        },\n        credentials: {\n            user: {type:\"text\"},\n            password: {type: \"password\"}\n        },\n        color:\"#d8bfd8\",\n        inputs:1,\n        outputs:1,\n        icon: \"bridge.png\",\n        align: \"left\",\n        label: function() {\n            return this.name||\"sparkplug\";\n        },\n        labelStyle: function() {\n            return this.name?\"node_label_italic\":\"\";\n        }\n    });\n</script>\n\n<script type=\"text/x-red\" data-template-name=\"sparkplug\">\n    <div class=\"form-row\">\n        <label for=\"node-input-broker\" title=\"The URL of the MQTT server\"><i class=\"fa fa-globe\"></i> Server</label>\n        <input type=\"text\" 
id=\"node-input-broker\" placeholder=\"Server URL\">\n    </div>\n    <div class=\"form-row\">\n        <label for=\"node-input-port\" title=\"The port of the MQTT server\"><i class=\"fa fa-globe\"></i> Port</label>\n        <input type=\"text\" id=\"node-input-port\" placeholder=\"Server Port\" style=\"width:45px\">\n    </div>\n    <div class=\"form-row\">\n        <label for=\"node-input-user\" title=\"The username for the MQTT server connection\"><i class=\"fa fa-user\"></i> Username</label>\n        <input type=\"text\" id=\"node-input-user\" placeholder=\"Username\">\n    </div>\n    <div class=\"form-row\">\n        <label for=\"node-input-password\" title=\"The password for the MQTT server connection\"><i class=\"fa fa-lock\"></i> Password</label>\n        <input type=\"password\" id=\"node-input-password\" placeholder=\"Password\">\n    </div>\n    <div class=\"form-row\">\n        <label for=\"node-input-clientid\" title=\"A unique client ID for the MQTT server connection\"><i class=\"fa fa-tag\"></i> Client ID</label>\n        <input type=\"text\" id=\"node-input-clientid\" placeholder=\"Client ID\">\n    </div>\n    <div class=\"form-row\">\n        <label for=\"node-input-groupid\" title=\"An ID representing a logical grouping of MQTT EoN Nodes and Devices into the infrastructure\"><i class=\"fa fa-tag\"></i> Group ID</label>\n        <input type=\"text\" id=\"node-input-groupid\" placeholder=\"Group ID\">\n    </div>\n    <div class=\"form-row\">\n        <label for=\"node-input-edgenode\" title=\"An ID that uniquely identifies the MQTT EoN Node within the infrastructure\"><i class=\"fa fa-tag\"></i> Edge Node</label>\n        <input type=\"text\" id=\"node-input-edgenode\" placeholder=\"Edge Node\">\n    </div>\n    <div class=\"form-row\">\n        <label for=\"node-input-version\" title=\"The Sparkplug namespace version\"><i class=\"fa fa-tag\"></i> Version</label>\n        <select id=\"node-input-version\" style=\"width:125px !important\">\n       
     <option value=\"spBv1.0\">Sparkplug B</option>\n            <option value=\"spAv1.0\">Sparkplug A</option>\n        </select>\n    </div>\n    <div class=\"form-row\">\n        <label for=\"node-input-enablecache\" title=\"Whether to enable EoN node caching\"><i class=\"fa fa-tag\"></i> Enable Cache</label>\n        <select id=\"node-input-enablecache\" style=\"width:125px !important\">\n            <option value=\"false\">False</option>\n            <option value=\"true\">True</option>\n        </select>\n    </div>\n    <div class=\"form-row\">\n        <label for=\"node-input-publishdeath\" title=\"Whether to publish the edge node's death certificate when the client cleanly disconnects\"><i class=\"fa fa-tag\"></i> Publish Death</label>\n        <select id=\"node-input-publishdeath\" style=\"width:125px !important\">\n            <option value=\"true\">True</option>\n            <option value=\"false\">False</option>\n        </select>\n    </div>\n</script>\n\n<script type=\"text/x-red\" data-help-name=\"sparkplug\">\n    <p>A Sparkplug edge node that connects to an MQTT broker and publishes birth and data messages for the edge node and\n       any input devices.</p>\n</script>\n
  },
  {
    "path": "javascript/core/node-red-contrib-sparkplug/sparkplug/sparkplug.js",
    "content": "/********************************************************************************\n * Copyright (c) 2016, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\nmodule.exports = function(RED) {\n    var SparkplugClient = require('sparkplug-client');\n    var deviceCache = {} // A cache of data for devices\n\n    function SparkplugNode(config) {\n        RED.nodes.createNode(this, config);\n        var node = this,\n            username = this.credentials.user,\n            password = this.credentials.password,\n            version = config.version,\n            cacheEnabled = config.enablecache == \"true\",\n            sparkPlugConfig = {\n                'serverUrl' : config.broker + \":\" + config.port,\n                'username' : username ?? '',\n                'password' : password ?? 
'',\n                'groupId' : config.groupid,\n                'edgeNode' : config.edgenode,\n                'clientId' : config.clientid,\n                'publishDeath' : config.publishdeath == \"true\",\n                'version' : version\n            },\n            sparkplugClient,\n\n        doDeviceBirths = function() {\n            if (cacheEnabled) {\n                // Loop over all devices in the device data cache\n                Object.keys(deviceCache).forEach(function(key) {\n                    var payload = { \n                        \"timestamp\" : new Date().getTime()\n                    };\n                    if (version === \"spBv1.0\") {\n                        // Sparkplug B uses \"metrics\" as the key\n                        payload.metrics = deviceCache[key];\n                    } else {\n                        // Sparkplug A uses \"metric\" as the key\n                        payload.metric = deviceCache[key];\n                    }\n                    // Publish BIRTH certificate for device\n                    sparkplugClient.publishDeviceBirth(key, payload);\n                });\n            } else {\n                node.log(config.edgenode + \" sending 'rebirth' message to downstream nodes\");\n                node.send({\n                    \"topic\" : \"rebirth\",\n                    \"payload\" : {}\n                });\n            }\n        },\n\n        doNodeBirth = function() {\n            var payload = { \n                    \"timestamp\" : new Date().getTime()\n                },\n                metrics = [\n                    {\n                        \"name\" : \"Node Control/Rebirth\",\n                        \"type\" : \"boolean\",\n                        \"value\" : false\n                    }\n                ];\n            if (version === \"spBv1.0\") {\n                // Sparkplug B uses \"metrics\" as the key\n                payload.metrics = metrics;\n            } else {\n                
// Sparkplug A uses \"metric\" as the key\n                payload.metric = metrics;\n            }\n            // Publish Node BIRTH certificate\n            sparkplugClient.publishNodeBirth(payload);\n        };\n\n        try {\n            // Create the SparkplugClient\n            sparkplugClient = SparkplugClient.newClient(sparkPlugConfig);\n        } catch (e) {\n            node.error(\"Error creating new client\", e);\n        }\n\n        /*\n         * 'birth' handler\n         */\n        sparkplugClient.on('birth', function () {\n            node.log(config.edgenode + \" received 'birth' event\");\n            // Publish Node BIRTH certificate\n            doNodeBirth();\n            // Publish Device BIRTH certificate\n            doDeviceBirths();\n        });\n\n        /*\n         * 'dcmd' (device command) handler\n         */\n        sparkplugClient.on('dcmd', function (deviceId, payload) {\n            node.log(config.edgenode + \" received 'command' event for deviceId: \" + deviceId + \", sending to nodes\");\n            node.send({\n                \"topic\" : deviceId,\n                \"payload\" : payload\n            });\n\n        });\n\n        /*\n         * 'ncmd' (node command) handler\n         */\n        sparkplugClient.on('ncmd', function (payload) {\n            node.log(config.edgenode + \" received 'ncmd' event\");\n            var metrics = version === \"spBv1.0\" \n                    ? 
payload.metrics \n                    : payload.metric;\n\n            if (metrics !== undefined && metrics !== null) {\n                for (var i = 0; i < metrics.length; i++) {\n                    var metric = metrics[i];\n                    if (metric.name == \"Node Control/Rebirth\" && metric.value) {\n                        console.log(\"Received 'Rebirth' command\");\n                        // Publish Node BIRTH certificate\n                        doNodeBirth();\n                        // Publish Device BIRTH certificate\n                        doDeviceBirths();\n                    }\n                }\n            }  \n        });\n\n        /*\n         * 'error' handler\n         */\n        sparkplugClient.on('error', function (error) {\n            node.log(config.edgenode + \" received 'error' event: \" + error);\n            node.status( {\n                fill:\"red\", \n                shape:\"ring\", \n                text:\"disconnected\"\n            });\n        });\n\n        /*\n         * 'connect' handler\n         */\n        sparkplugClient.on('connect', function () {\n            node.log(config.edgenode + \" received 'connect' event\");\n            node.status( {\n                fill:\"green\", \n                shape:\"dot\", \n                text:\"connected\"\n            });\n        });\n\n        /*\n         * 'reconnect' handler\n         */\n        sparkplugClient.on('reconnect', function () {\n            node.log(config.edgenode + \" received 'reconnect' event\");\n            node.status( {\n                fill:\"yellow\", \n                shape:\"ring\", \n                text:\"connecting\"\n            });\n        });\n\n        /*\n         * Receive 'input' message.  
\n         * The topic should be of the format: <deviceId>/<messageType>\n         * where <messageType> can be one of: DDATA, DBIRTH, or DDEATH.\n         */\n        this.on('input', function(msg) {\n            var tokens = msg.topic.split(\"/\"),\n                payload = msg.payload,\n                publishBirth = false,\n                deviceId, messageType, cachedMetrics;\n\n            node.log(config.edgenode + \" received input msg: \" + JSON.stringify(msg));\n\n            if (tokens.length != 2) {\n                node.error(config.edgenode + \" received message with invalid topic \" + msg.topic + \", must be of the form <deviceId>/<msgType>\");\n                return;\n            }\n\n            // Parse topic to get deviceId and messageType\n            deviceId = tokens[0];\n            messageType = tokens[1];\n\n            // Get cached device\n            cachedMetrics = deviceCache[deviceId];\n\n            if (messageType === \"DBIRTH\") {\n                if (cacheEnabled) {\n                    console.log(\"Setting device cache for \" + deviceId);\n                    deviceCache[deviceId] = version === \"spBv1.0\" \n                            ? payload.metrics \n                            : payload.metric;\n                }\n                // Publish device birth\n                sparkplugClient.publishDeviceBirth(deviceId, payload);\n            } else if (messageType === \"DDATA\") {\n                if (cacheEnabled) {\n                    if (cachedMetrics === undefined) {\n                        node.error(config.edgenode + \" received a DDATA for unknown device \" + deviceId);\n                        return;\n                    }\n\n                    var metrics = version === \"spBv1.0\" \n                            ? 
payload.metrics \n                            : payload.metric;\n\n                    // Update metrics in device cache\n                    // Loop over incoming metrics\n                    metrics.forEach(function(metric) {\n                        // Loop through cached metrics to check if the incoming metric is cached\n                        // then update the value in the cache.\n                        if(!cachedMetrics.some(function(cachedMetric) {\n                                if (cachedMetric.name === metric.name) {\n                                    // Update metric value\n                                    cachedMetric.value = metric.value;\n                                    return true;\n                                }\n                                return false;\n                            })) {\n                            node.warn(config.edgenode + \" received a DDATA message with an unknown metric\");\n                            // Add new metric\n                            cachedMetrics.push(metric);\n                        }\n                    });\n                }\n                // Publish device data\n                sparkplugClient.publishDeviceData(deviceId, payload);\n            } else if (messageType === \"DDEATH\") {\n                // Clear device cache\n                delete deviceCache[deviceId];\n                // Publish device data\n                sparkplugClient.publishDeviceDeath(deviceId, payload);\n            }\n        });\n\n        /*\n         * Received 'close' message.\n         */\n        this.on('close', function() {\n            // Stop the sparkplug client\n            sparkplugClient.stop();\n        });\n    };\n\n    // Register the sparkplug node\n    RED.nodes.registerType(\"sparkplug\", SparkplugNode, {\n        credentials: {\n            user: {type:\"text\"},\n            password: {type:\"password\"}\n        }\n    });\n}\n"
  },
  {
    "path": "javascript/core/node-red-contrib-sparkplug-payload/README.md",
    "content": "node-red-contrib-sparkplug-payload\n=========\n\nA node that provides tools for encoding and decoding payload objects and\nstrings using the [Sparkplug Google Protocol Buffer Schema](https://www.eclipse.org/tahu/spec/Sparkplug%20Topic%20Namespace%20and%20State%20ManagementV2.2-with%20appendix%20B%20format%20-%20Eclipse.pdf).\n\nThis node is designed to facilitate the creation of your own Sparkplug nodes in\nNode-RED, for adding data consumers for dashboarding or for implementing your\nown Sparkplug client using the built-in MQTT node.\n\n## Installation\n\n  npm install node-red-contrib-sparkplug-payload\n\n## Usage\n\nSimply hook up the input to a source of Sparkplug-formatted JSON strings or\nobjects and the encoded protobuf will be provided at the output.\n\nSimilarly, if the input is hooked up to a source of protobufs encoded in the\nSparkplug format, a decoded payload object will be provided at the output."
  },
  {
    "path": "javascript/core/node-red-contrib-sparkplug-payload/package.json",
    "content": "{\n  \"name\": \"node-red-contrib-sparkplug-payload\",\n  \"version\": \"1.0.0\",\n  \"description\": \"A node that provides tools for encoding and decoding payload objects and strings using the Sparkplug Google Protocol Buffer Schema\",\n  \"main\": \"sparkplug-payload.js\",\n  \"author\": \"Rikki Coles <r.coles@amrc.co.uk> (https://www.amrc.co.uk)\",\n  \"license\": \"EPL-2.0\",\n  \"repository\": {\n    \"type\": \"git\",\n    \"url\": \"git+https://github.com/eclipse/tahu.git\"\n  },\n  \"bugs\": {\n    \"url\": \"https://github.com/eclipse/tahu/issues\"\n  },\n  \"homepage\": \"https://github.com/eclipse/tahu\",\n  \"dependencies\": {\n    \"sparkplug-payload\": \"^1.0.1\"\n  },\n  \"keywords\": [\n    \"tahu\",\n    \"mqtt\",\n    \"sparkplug\",\n    \"node-red\"\n  ],\n  \"node-red\": {\n    \"nodes\": {\n      \"sparkplug-payload\": \"sparkplug-payload.js\"\n    }\n  },\n  \"files\":[\n    \"package.json\",\n    \"README.md\",\n    \"sparkplug-payload.html\",\n    \"sparkplug-payload.js\"\n  ]\n}"
  },
  {
    "path": "javascript/core/node-red-contrib-sparkplug-payload/sparkplug-payload.html",
    "content": "<script type=\"text/javascript\">\r\n    RED.nodes.registerType('sparkplug-payload',{\r\n        category: 'function',\r\n        color: '#C0DEED',\r\n        defaults: {\r\n            name: {value:\"\"}\r\n        },\r\n        inputs:1,\r\n        outputs:1,\r\n        icon: \"hash.svg\",\r\n        label: function() {\r\n            return this.name||\"sparkplug-payload\";\r\n        }\r\n    });\r\n</script>\r\n\r\n<script type=\"text/html\" data-template-name=\"sparkplug-payload\">\r\n    <div class=\"form-row\">\r\n        <label for=\"node-input-name\"><i class=\"fa fa-tag\"></i> Name</label>\r\n        <input type=\"text\" id=\"node-input-name\" placeholder=\"Name\">\r\n    </div>\r\n</script>\r\n\r\n<script type=\"text/html\" data-help-name=\"sparkplug-payload\">\r\n    <p>A node that converts JSON payloads in Sparkplug structure to protobuf encoded binaries or vice versa.</p>\r\n</script>"
  },
  {
    "path": "javascript/core/node-red-contrib-sparkplug-payload/sparkplug-payload.js",
    "content": "const sparkplug = require(\"sparkplug-payload\");\r\nconst Payload = sparkplug.get(\"spBv1.0\");\r\n\r\nmodule.exports = (RED) => {\r\n    function TranslatePayloadNode(config) {\r\n        RED.nodes.createNode(this, config);\r\n        let node = this;\r\n        node.on('input', (msg) => {\r\n            let newPayload;\r\n            if (typeof msg.payload === \"string\") {\r\n                try { // Check if JSON string and parse\r\n                    msg.payload = JSON.parse(msg.payload);\r\n                } catch {\r\n                    // Payload wasn't a JSON string\r\n                }\r\n            }\r\n\r\n            if (Buffer.isBuffer(msg.payload)) { // Payload is a protobuf\r\n                newPayload = Payload.decodePayload(msg.payload);\r\n\r\n            } else { // Payload might be an object\r\n                try {\r\n                    newPayload = Payload.encodePayload(msg.payload);\r\n\r\n                } catch (e) { // Payload wasn't a valid object\r\n                    this.error(e);\r\n                    return; // Don't forward an undefined payload\r\n                }\r\n            }\r\n            msg.payload = newPayload;\r\n            node.send(msg);\r\n        });\r\n    }\r\n    RED.nodes.registerType(\"sparkplug-payload\", TranslatePayloadNode);\r\n}"
  },
  {
    "path": "javascript/core/sparkplug-client/.gitignore",
    "content": "index.d.ts\nindex.js\nindex.js.map\nindex.d.ts.map\n"
  },
  {
    "path": "javascript/core/sparkplug-client/LICENSE",
    "content": "Eclipse Public License - v 2.0\n\n    THE ACCOMPANYING PROGRAM IS PROVIDED UNDER THE TERMS OF THIS ECLIPSE\n    PUBLIC LICENSE (\"AGREEMENT\"). ANY USE, REPRODUCTION OR DISTRIBUTION\n    OF THE PROGRAM CONSTITUTES RECIPIENT'S ACCEPTANCE OF THIS AGREEMENT.\n\n1. DEFINITIONS\n\n\"Contribution\" means:\n\n  a) in the case of the initial Contributor, the initial content\n     Distributed under this Agreement, and\n\n  b) in the case of each subsequent Contributor:\n     i) changes to the Program, and\n     ii) additions to the Program;\n  where such changes and/or additions to the Program originate from\n  and are Distributed by that particular Contributor. A Contribution\n  \"originates\" from a Contributor if it was added to the Program by\n  such Contributor itself or anyone acting on such Contributor's behalf.\n  Contributions do not include changes or additions to the Program that\n  are not Modified Works.\n\n\"Contributor\" means any person or entity that Distributes the Program.\n\n\"Licensed Patents\" mean patent claims licensable by a Contributor which\nare necessarily infringed by the use or sale of its Contribution alone\nor when combined with the Program.\n\n\"Program\" means the Contributions Distributed in accordance with this\nAgreement.\n\n\"Recipient\" means anyone who receives the Program under this Agreement\nor any Secondary License (as applicable), including Contributors.\n\n\"Derivative Works\" shall mean any work, whether in Source Code or other\nform, that is based on (or derived from) the Program and for which the\neditorial revisions, annotations, elaborations, or other modifications\nrepresent, as a whole, an original work of authorship.\n\n\"Modified Works\" shall mean any work in Source Code or other form that\nresults from an addition to, deletion from, or modification of the\ncontents of the Program, including, for purposes of clarity any new file\nin Source Code form that contains any contents of the Program. 
Modified\nWorks shall not include works that contain only declarations,\ninterfaces, types, classes, structures, or files of the Program solely\nin each case in order to link to, bind by name, or subclass the Program\nor Modified Works thereof.\n\n\"Distribute\" means the acts of a) distributing or b) making available\nin any manner that enables the transfer of a copy.\n\n\"Source Code\" means the form of a Program preferred for making\nmodifications, including but not limited to software source code,\ndocumentation source, and configuration files.\n\n\"Secondary License\" means either the GNU General Public License,\nVersion 2.0, or any later versions of that license, including any\nexceptions or additional permissions as identified by the initial\nContributor.\n\n2. GRANT OF RIGHTS\n\n  a) Subject to the terms of this Agreement, each Contributor hereby\n  grants Recipient a non-exclusive, worldwide, royalty-free copyright\n  license to reproduce, prepare Derivative Works of, publicly display,\n  publicly perform, Distribute and sublicense the Contribution of such\n  Contributor, if any, and such Derivative Works.\n\n  b) Subject to the terms of this Agreement, each Contributor hereby\n  grants Recipient a non-exclusive, worldwide, royalty-free patent\n  license under Licensed Patents to make, use, sell, offer to sell,\n  import and otherwise transfer the Contribution of such Contributor,\n  if any, in Source Code or other form. This patent license shall\n  apply to the combination of the Contribution and the Program if, at\n  the time the Contribution is added by the Contributor, such addition\n  of the Contribution causes such combination to be covered by the\n  Licensed Patents. The patent license shall not apply to any other\n  combinations which include the Contribution. 
No hardware per se is\n  licensed hereunder.\n\n  c) Recipient understands that although each Contributor grants the\n  licenses to its Contributions set forth herein, no assurances are\n  provided by any Contributor that the Program does not infringe the\n  patent or other intellectual property rights of any other entity.\n  Each Contributor disclaims any liability to Recipient for claims\n  brought by any other entity based on infringement of intellectual\n  property rights or otherwise. As a condition to exercising the\n  rights and licenses granted hereunder, each Recipient hereby\n  assumes sole responsibility to secure any other intellectual\n  property rights needed, if any. For example, if a third party\n  patent license is required to allow Recipient to Distribute the\n  Program, it is Recipient's responsibility to acquire that license\n  before distributing the Program.\n\n  d) Each Contributor represents that to its knowledge it has\n  sufficient copyright rights in its Contribution, if any, to grant\n  the copyright license set forth in this Agreement.\n\n  e) Notwithstanding the terms of any Secondary License, no\n  Contributor makes additional grants to any Recipient (other than\n  those set forth in this Agreement) as a result of such Recipient's\n  receipt of the Program under the terms of a Secondary License\n  (if permitted under the terms of Section 3).\n\n3. 
REQUIREMENTS\n\n3.1 If a Contributor Distributes the Program in any form, then:\n\n  a) the Program must also be made available as Source Code, in\n  accordance with section 3.2, and the Contributor must accompany\n  the Program with a statement that the Source Code for the Program\n  is available under this Agreement, and informs Recipients how to\n  obtain it in a reasonable manner on or through a medium customarily\n  used for software exchange; and\n\n  b) the Contributor may Distribute the Program under a license\n  different than this Agreement, provided that such license:\n     i) effectively disclaims on behalf of all other Contributors all\n     warranties and conditions, express and implied, including\n     warranties or conditions of title and non-infringement, and\n     implied warranties or conditions of merchantability and fitness\n     for a particular purpose;\n\n     ii) effectively excludes on behalf of all other Contributors all\n     liability for damages, including direct, indirect, special,\n     incidental and consequential damages, such as lost profits;\n\n     iii) does not attempt to limit or alter the recipients' rights\n     in the Source Code under section 3.2; and\n\n     iv) requires any subsequent distribution of the Program by any\n     party to be under a license that satisfies the requirements\n     of this section 3.\n\n3.2 When the Program is Distributed as Source Code:\n\n  a) it must be made available under this Agreement, or if the\n  Program (i) is combined with other material in a separate file or\n  files made available under a Secondary License, and (ii) the initial\n  Contributor attached to the Source Code the notice described in\n  Exhibit A of this Agreement, then the Program may be made available\n  under the terms of such Secondary Licenses, and\n\n  b) a copy of this Agreement must be included with each copy of\n  the Program.\n\n3.3 Contributors may not remove or alter any copyright, patent,\ntrademark, 
attribution notices, disclaimers of warranty, or limitations\nof liability (\"notices\") contained within the Program from any copy of\nthe Program which they Distribute, provided that Contributors may add\ntheir own appropriate notices.\n\n4. COMMERCIAL DISTRIBUTION\n\nCommercial distributors of software may accept certain responsibilities\nwith respect to end users, business partners and the like. While this\nlicense is intended to facilitate the commercial use of the Program,\nthe Contributor who includes the Program in a commercial product\noffering should do so in a manner which does not create potential\nliability for other Contributors. Therefore, if a Contributor includes\nthe Program in a commercial product offering, such Contributor\n(\"Commercial Contributor\") hereby agrees to defend and indemnify every\nother Contributor (\"Indemnified Contributor\") against any losses,\ndamages and costs (collectively \"Losses\") arising from claims, lawsuits\nand other legal actions brought by a third party against the Indemnified\nContributor to the extent caused by the acts or omissions of such\nCommercial Contributor in connection with its distribution of the Program\nin a commercial product offering. The obligations in this section do not\napply to any claims or Losses relating to any actual or alleged\nintellectual property infringement. In order to qualify, an Indemnified\nContributor must: a) promptly notify the Commercial Contributor in\nwriting of such claim, and b) allow the Commercial Contributor to control,\nand cooperate with the Commercial Contributor in, the defense and any\nrelated settlement negotiations. The Indemnified Contributor may\nparticipate in any such claim at its own expense.\n\nFor example, a Contributor might include the Program in a commercial\nproduct offering, Product X. That Contributor is then a Commercial\nContributor. 
If that Commercial Contributor then makes performance\nclaims, or offers warranties related to Product X, those performance\nclaims and warranties are such Commercial Contributor's responsibility\nalone. Under this section, the Commercial Contributor would have to\ndefend claims against the other Contributors related to those performance\nclaims and warranties, and if a court requires any other Contributor to\npay any damages as a result, the Commercial Contributor must pay\nthose damages.\n\n5. NO WARRANTY\n\nEXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, AND TO THE EXTENT\nPERMITTED BY APPLICABLE LAW, THE PROGRAM IS PROVIDED ON AN \"AS IS\"\nBASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR\nIMPLIED INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF\nTITLE, NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A PARTICULAR\nPURPOSE. Each Recipient is solely responsible for determining the\nappropriateness of using and distributing the Program and assumes all\nrisks associated with its exercise of rights under this Agreement,\nincluding but not limited to the risks and costs of program errors,\ncompliance with applicable laws, damage to or loss of data, programs\nor equipment, and unavailability or interruption of operations.\n\n6. DISCLAIMER OF LIABILITY\n\nEXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, AND TO THE EXTENT\nPERMITTED BY APPLICABLE LAW, NEITHER RECIPIENT NOR ANY CONTRIBUTORS\nSHALL HAVE ANY LIABILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,\nEXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING WITHOUT LIMITATION LOST\nPROFITS), HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN\nCONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)\nARISING IN ANY WAY OUT OF THE USE OR DISTRIBUTION OF THE PROGRAM OR THE\nEXERCISE OF ANY RIGHTS GRANTED HEREUNDER, EVEN IF ADVISED OF THE\nPOSSIBILITY OF SUCH DAMAGES.\n\n7. 
GENERAL\n\nIf any provision of this Agreement is invalid or unenforceable under\napplicable law, it shall not affect the validity or enforceability of\nthe remainder of the terms of this Agreement, and without further\naction by the parties hereto, such provision shall be reformed to the\nminimum extent necessary to make such provision valid and enforceable.\n\nIf Recipient institutes patent litigation against any entity\n(including a cross-claim or counterclaim in a lawsuit) alleging that the\nProgram itself (excluding combinations of the Program with other software\nor hardware) infringes such Recipient's patent(s), then such Recipient's\nrights granted under Section 2(b) shall terminate as of the date such\nlitigation is filed.\n\nAll Recipient's rights under this Agreement shall terminate if it\nfails to comply with any of the material terms or conditions of this\nAgreement and does not cure such failure in a reasonable period of\ntime after becoming aware of such noncompliance. If all Recipient's\nrights under this Agreement terminate, Recipient agrees to cease use\nand distribution of the Program as soon as reasonably practicable.\nHowever, Recipient's obligations under this Agreement and any licenses\ngranted by Recipient relating to the Program shall continue and survive.\n\nEveryone is permitted to copy and distribute copies of this Agreement,\nbut in order to avoid inconsistency the Agreement is copyrighted and\nmay only be modified in the following manner. The Agreement Steward\nreserves the right to publish new versions (including revisions) of\nthis Agreement from time to time. No one other than the Agreement\nSteward has the right to modify this Agreement. The Eclipse Foundation\nis the initial Agreement Steward. The Eclipse Foundation may assign the\nresponsibility to serve as the Agreement Steward to a suitable separate\nentity. Each new version of the Agreement will be given a distinguishing\nversion number. 
The Program (including Contributions) may always be\nDistributed subject to the version of the Agreement under which it was\nreceived. In addition, after a new version of the Agreement is published,\nContributor may elect to Distribute the Program (including its\nContributions) under the new version.\n\nExcept as expressly stated in Sections 2(a) and 2(b) above, Recipient\nreceives no rights or licenses to the intellectual property of any\nContributor under this Agreement, whether expressly, by implication,\nestoppel or otherwise. All rights in the Program not expressly granted\nunder this Agreement are reserved. Nothing in this Agreement is intended\nto be enforceable by any entity that is not a Contributor or Recipient.\nNo third-party beneficiary rights are created under this Agreement.\n\nExhibit A - Form of Secondary Licenses Notice\n\n\"This Source Code may also be made available under the following \nSecondary Licenses when the conditions for such availability set forth \nin the Eclipse Public License, v. 2.0 are satisfied: {name license(s),\nversion(s), and exceptions or additional permissions here}.\"\n\n  Simply including a copy of this Agreement, including this Exhibit A\n  is not sufficient to license the Source Code under Secondary Licenses.\n\n  If it is not possible or desirable to put the notice in a particular\n  file, then You may include the notice in a location (such as a LICENSE\n  file in a relevant directory) where a recipient would be likely to\n  look for such a notice.\n\n  You may add additional accurate notices of copyright ownership.\n"
  },
  {
    "path": "javascript/core/sparkplug-client/README.md",
    "content": "Sparkplug Client\n=========\n\nA client library providing an MQTT client for MQTT device communication using\nthe Sparkplug Specification from Cirrus Link Solutions.  \n\nhttps://s3.amazonaws.com/ignition-modules/Current/Sparkplug+Specification.pdf\n\nThe client will connect to an MQTT Server and act as an MQTT Edge of Network\n(EoN) Node.  It will publish birth certificates (NBIRTH), node data messages\n(NDATA), and process node command messages (NCMD) that have been sent from\nanother MQTT client.\n\nThe client also provides an interface for MQTT Device application code to\npublish device birth certificates (DBIRTH), device data messages (DDATA),\ndevice death certificates (DDEATH), and receive device command messages (DCMD)\nthat have been sent from another MQTT client.\n\n## Installation\n\n  npm install sparkplug-client\n\n## Usage\n\n### Creating and configuring a new Sparkplug client\n\nA configuration object is required when creating a new client.  A configuration\nmust contain the following properties:\n\n* serverUrl: The URL of the MQTT server.\n* username: The username for the MQTT server connection.\n* password: The password for the MQTT server connection.\n* groupId: An ID representing a logical grouping of MQTT EoN Nodes and Devices\n  into the infrastructure.\n* edgeNode: An ID that uniquely identifies the MQTT EoN Node within the\n  infrastructure.\n* clientId: A unique ID for the MQTT client connection.\n* publishDeath: A flag indicating if a Node DEATH Certificate (NDEATH) should\n  be published when the client is stopped (defaults to false).\n* version: The Sparkplug version (currently: A or B).  This will indicate how\n  the payloads of the published Sparkplug messages are formatted.\n* keepalive: The MQTT client keep alive interval in seconds (defaults to 30). 
\n\nHere is a code example of creating and configuring a new client:\n\n```javascript\nvar sparkplug = require('sparkplug-client'),\n    config = {\n        'serverUrl' : 'tcp://localhost:1883',\n        'username' : 'username',\n        'password' : 'password',\n        'groupId' : 'Sparkplug Devices',\n        'edgeNode' : 'Test Edge Node',\n        'clientId' : 'JavaScriptSimpleEdgeNode',\n        'version' : 'spBv1.0'\n    },\n    client = sparkplug.newClient(config);\n```\n\n### Stopping the client\n\nOnce a client has been created and configured it will automatically connect to\nthe MQTT Server.  the client provides a function for stopping the client and\ncleanly disconnecting from the MQTT Server.  Once a client has been stopped, a\nnew client must be created and configured in order to re-establish a connection\nwith the server.\n\nHere is a code example of stopping a client:\n\n```javascript\n// Stop the sparkplug client\nclient.stop();\n```\n\n### Publishing messages\n\nThis client provides functions for publishing three types of messages: a device\nbirth certificate (DBIRTH), device data message (DDATA), device death\ncertificate (DDEATH)\n\n#### Message Payloads ####\n\nThe payload format for Sparkplug messages differs based on the Sparkplug version.\nA full description of each versions payload format is beyond the scope of this\nreadme and can be found in the Sparkplug specification linked above.  The examples\nin this readme will be using Sparkplug B.  \n\nHere is a quick summary of the main changes in version B (over A):\n\n* Added more supported data types for metric values\n* Added support for generic property sets\n* Removed required \"position\" field\n* Change the name of the metrics list field from \"metric\" to \"metrics\".\n\n#### Publish Options\n\nEach of the publish methods below can optionally take an object as an additional \nargument.  
This object contains any configured options for the publish.\n\n##### Compression Option\n\nA payload can be compressed before it is published by enabling payload compression\nin the options object and passing it to a plush command.  For example:\n\n```javascript\nvar options = {\n    \"compress\" : true\n};\n\n// Publish device data\nclient.publishDeviceData(deviceId, payload, options);\n```\n\nAdditionally the compression algorithm can be specified as well.  Currently \nsupported algorithms are: DEFLATE and GZIP.  DEFLATE will be used if not algorithm\nis specified. For example:\n\n```javascript\nvar options = {\n    \"compress\" : true,\n    \"algorithm\" : \"GZIP\"\n};\n\n// Publish device data\nclient.publishDeviceData(deviceId, payload, options);\n```\n\n#### Edge Node Birth Certificate (NBIRTH)\n\nA Sparkplug node birth certificate (NBIRTH) message will contain all data points,\nprocess variables, and/or metrics for the edge node. The payload for this message\nwill differ slightly between the different Sparkplug versions.\n\n* timestamp:  A UTC timestamp represented by 64 bit integer.\n* metrics:  An array of metric objects. Each metric in the array must contain\n  the following:\n  * name:  The name of the metric.\n  * value:  The value of the metric.\n  * type:  The type of the metric.  
The following types are supported: int, \n    int8, int16, int32, int64, uint8, uint16, uint32, uint64, float, double, \n    boolean, string, datetime, text, uuid, dataset, bytes, file, or template.\n\nHere is a code example of publishing a NBIRTH message:\n\n```javascript\nvar payload = {\n        \"timestamp\" : 1465577611580,\n        \"metrics\" : [\n            {\n                \"name\" : \"my_int\",\n                \"value\" : 456,\n                \"type\" : \"Int32\"\n            },\n            {\n                \"name\" : \"my_float\",\n                \"value\" : 1.23,\n                \"type\" : \"Float\"\n            }\n        ]\n    };\n\n// Publish device birth\nclient.publishNodeBirth(payload);\n```\n\n\n#### Device Birth Certificate (DBIRTH)\n\nA Sparkplug device birth certificate (DBIRTH) message will contain all data points,\nprocess variables, and/or metrics for the device. The payload for this message\nwill differ slightly between the different Sparkplug versions.\n\n* timestamp:  A UTC timestamp represented by 64 bit integer.\n* metrics:  An array of metric objects. Each metric in the array must contain\n  the following:\n  * name:  The name of the metric.\n  * value:  The value of the metric.\n  * type:  The type of the metric.  
The following types are supported: int, \n    int8, int16, int32, int64, uint8, uint16, uint32, uint64, float, double, \n    boolean, string, datetime, text, uuid, dataset, bytes, file, or template.\n\nHere is a code example of publishing a DBIRTH message:\n\n```javascript\nvar deviceId = \"testDevice\",\n    payload = {\n        \"timestamp\" : 1465577611580,\n        \"metrics\" : [\n            {\n                \"name\" : \"my_int\",\n                \"value\" : 456,\n                \"type\" : \"Int32\"\n            },\n            {\n                \"name\" : \"my_float\",\n                \"value\" : 1.23,\n                \"type\" : \"Float\"\n            }\n        ]\n    };\n\n// Publish device birth\nclient.publishDeviceBirth(deviceId, payload);\n```\n\n\n#### Node Data Message (NDATA)\n\nAn edge node data message (NDATA) will look similar to NBIRTH but is not required\nto publish all metrics. However, it must publish at least one metric.\n\nHere is a code example of publishing a DBIRTH message:\n\n```javascript\nvar payload = {\n        \"timestamp\" : 1465456711580,\n        \"metrics\" : [\n            {\n                \"name\" : \"my_int\",\n                \"value\" : 412,\n                \"type\" : \"Int32\"\n            }\n        ]\n    };\n\n// Publish device data\nclient.publishNodeData(payload);\n```\n\n\n#### Device Data Message (DDATA)\n\nA device data message (DDATA) will look similar to DBIRTH but is not required\nto publish all metrics. 
However, it must publish at least one metric.\n\nHere is a code example of publishing a DBIRTH message:\n\n```javascript\nvar deviceId = \"testDevice\",\n    payload = {\n        \"timestamp\" : 1465456711580,\n        \"metrics\" : [\n            {\n                \"name\" : \"my_int\",\n                \"value\" : 412,\n                \"type\" : \"Int32\"\n            }\n        ]\n    };\n\n// Publish device data\nclient.publishDeviceData(deviceId, payload);\n```\n\n#### Node Death Certificate (NDEATH)\n\nAn edge node death certificate (NDEATH) is published to indicated that the edge\nnode has gone offline or has lost a connection.  It registered as an MQTT LWT\nby the SparkplugClient instance and published on the applications behalf.\n\n\n#### Device Death Certificate (DDEATH)\n\nA device death certificate (DDEATH) can be published to indicated that the\ndevice has gone offline or has lost a connection.  It should contain only a\ntimestamp.\n\nHere is a code example of publishing a DDEATH message:\n\n```javascript\nvar deviceId = \"testDevice\",\n    payload = {\n        \"timestamp\" : 1465456711580\n    };\n\n// Publish device death\nclient.publishDeviceDeath(deviceId, payload);\n```\n\n### Receiving events\n\nThe client uses an EventEmitter to emit events to device applications.  The\nclient emits a \"rebirth\" event, \"command\" event, and five MQTT connection\nevents: \"connect\", \"reconnect\", \"offline\", \"error\", and \"close\".\n\n#### Birth Event\n\nA \"birth\" event is used to signal the device application that a DBIRTH \nmessage is requested.  
This event will be be emitted immediately after the \nclient initially connects or re-connects with the MQTT Server.\n\nHere is a code example of handling a \"birth\" event:\n\n```javascript\nsparkplugClient.on('birth', function () {\n    // Publish Node BIRTH certificate\n    sparkplugClient.publishNodeBirth(getNodeBirthPayload());\n    // Publish Device BIRTH certificate\n    sparkplugClient.publishDeviceBirth(deviceId, getDeviceBirthPayload());\n});\n```\n\n#### Command Events\n\nA Device Command event is used to communicate a Device Command message (DCMD)\nfrom another MQTT client to a device. A 'dcmd' event will include the device ID \nand a payload containing a list of metrics (as described above).  Any metrics \nincluded in the payload represent attempts to write a new value to the data \npoints or process variables that they represent.  After the device application\nprocesses the request the device application should publish a DDATA message \ncontaining any metrics that have changed or been updated.\n\nHere is a code example of handling a \"dcmd\" event:\n\n```javascript\nclient.on('dcmd', function (deviceId, payload) {\n    console.log(\"received 'dcmd' event\");\n    console.log(\"device: \" + device);\n    console.log(\"payload: \" + payload);\n\n    //\n    // Process metrics and create new payload containing changed metrics\n    //\n\n    client.publishDeviceData(deviceId, newPayload);\n});\n```\n\nA Node Command event is used to communicate an Edge Node Command message (DCMD) \nor Edge Node Command message (NCMD) from another MQTT client to a device.  An \n'ncmd' event will include a payload containing a list of metrics (as described \nabove).  
Any metrics included in the payload may represent attempts to write a \nnew value to the data points or process variables that they represent or they\nmay represent control messages sent to the edge node such as a \"rebirth\" \nrequest.\n\nHere is a code example of handling a \"ncmd\" event:\n\n```javascript\nclient.on('ncmd', function (payload) {\n    console.log(\"received 'ncmd' event\");\n    console.log(\"payload: \" + payload);\n\n    //\n    // Process metrics and create new payload containing changed metrics\n    //\n\n    client.publishNodeData(newPayload);\n});\n```\n\n#### Connect Event\n\nA \"connect\" event is emitted when the client has connected to the server.\n\nHere is a code example of handling a \"connect\" event:\n\n```javascript\nclient.on('connect', function () {\n    console.log(\"received 'connect' event\");\n});\n```\n\n#### Reconnect Event\n\nA \"reconnect\" event is emitted when the client is attempting to reconnect to\nthe server.\n\nHere is a code example of handling a \"reconnect\" event:\n\n```javascript\nclient.on('reconnect', function () {\n    console.log(\"received 'reconnect' event\");\n});\n```\n\n#### Offline Event\n\nAn \"offline\" event is emitted when the client loses connection with the server.\n\nHere is a code example of handling an \"offline\" event:\n\n```javascript\nclient.on('offline', function () {\n    console.log(\"received 'offline' event\");\n});\n```\n\n#### Error Event\n\nAn \"error\" event is emitted when the client has experienced an error while\ntrying to connect to the server.\n\nHere is a code example of handling a \"error\" event:\n\n```javascript\nclient.on('error', function (error) {\n    console.log(\"received 'error' event: \" + error);\n});\n```\n\n#### Close Event\n\nA \"close\" event is emitted when the client's connection to the server has been\nclosed.\n\nHere is a code example of handling a \"close\" event:\n\n```javascript\nclient.on('close', function () {\n    console.log(\"received 'close' 
event\");\n});\n```\n\n## Release History\n\n* 1.0.0 Initial release\n* 1.0.2 Bug Fixes\n* 1.1.0 Added more emitted events (connect, reconnect, error, close)\n* 1.2.0 Added 'publishDeath' config option, updated MQTT.js version\n* 2.0.0 Added support for Sparkplug B and made the version configurable.\n* 3.0.0 Added events for Node Birth/Command events. Renamed 'command' event\n        to distiguish between 'dcmd' (device commands) and 'ncmd' (node \n        commands). Renamed 'rebirth' event to 'birth'. Updated dependency\n        versions and removed bytebuffer as a dependency.\n* 3.1.0 Added support for payload compression/decompression with DEFLATE\n        and Gzip algorithms, added logging with Winston to replace console\n        logging, and other minor bug fixes. Moved sparkplug payload libraries\n        to their own project and updated dependecies.\n* 3.2.0 Added new 'offline' emitted event, added configurable keep alive,\n        updated log messages and set default level to 'info', and disabled \n        ping rescheduling within the client.\n* 3.2.1 Updated License and repo links, cleaned up logging.\n* 3.2.2 Bug Fixes\n* 3.2.3 Bug Fixes, added typescript\n* 3.2.4 Updated sparkplug-payload dependency version.\n\n## License\n\nCopyright (c) 2016-2023 Cirrus Link Solutions and others\n\nAll rights reserved. This program and the accompanying materials\nare made available under the terms of the Eclipse Public License v1.0\nwhich accompanies this distribution, and is available at\nhttp://www.eclipse.org/legal/epl-v10.html\n\nContributors: Cirrus Link Solutions and others\n"
  },
  {
    "path": "javascript/core/sparkplug-client/index.ts",
    "content": "/**\n * Copyright (c) 2016-2017 Cirrus Link Solutions\n *\n *  All rights reserved. This program and the accompanying materials\n *  are made available under the terms of the Eclipse Public License v1.0\n *  which accompanies this distribution, and is available at\n *  http://www.eclipse.org/legal/epl-v10.html\n *\n * Contributors:\n *   Cirrus Link Solutions\n */\nimport * as mqtt from 'mqtt';\nimport type { IClientOptions, MqttClient } from 'mqtt';\nimport events from 'events';\nimport * as sparkplug from 'sparkplug-payload';\nimport type { UPayload } from 'sparkplug-payload/lib/sparkplugbpayload';\nimport type { Reader } from 'protobufjs';\nimport pako from 'pako';\nimport createDebug from 'debug';\n\nconst sparkplugbpayload = sparkplug.get(\"spBv1.0\")!;\n\nconst compressed = \"SPBV1.0_COMPRESSED\";\n\n// setup logging\nconst debugLog = createDebug('sparkplug-client:debug');\nconst infoLog = createDebug('sparkplug-client:info');\nconst logger = {\n    debug: (formatter: string, ...args: unknown[]) => debugLog(formatter, ...args),\n    info: (formatter: string, ...args: unknown[]) => infoLog(formatter, ...args),\n}\n\nfunction getRequiredProperty<C extends Record<string, unknown>, P extends keyof C & string>(config: C, propName: P): C[P] {\n    if (config[propName] !== undefined) {\n        return config[propName];\n    }\n    throw new Error(\"Missing required configuration property '\" + propName + \"'\");\n}\n\nfunction getProperty<C, P extends keyof C, DEFAULT extends C[P]>(config: C, propName: P, defaultValue: DEFAULT): Exclude<C[P], undefined> | DEFAULT {\n    if (config[propName] !== undefined) {\n        return config[propName] as Exclude<C[P], undefined>;\n    } else {\n        return defaultValue;\n    }\n}\n\nexport type ISparkplugClientOptions = {\n    serverUrl: string;\n    username: string;\n    password: string;\n    groupId: string;\n    edgeNode: string;\n    clientId: string;\n    publishDeath?: boolean;\n    version?: 
string;\n    keepalive?: number;\n    mqttOptions?: Omit<IClientOptions, 'clientId' | 'clean' | 'keepalive' | 'reschedulePings' | 'connectTimeout' | 'username' | 'password' | 'will'>;\n}\n\nexport type PayloadOptions = {\n    algorithm?: 'GZIP' | 'DEFLATE';\n    /** @default false */\n    compress?: boolean;\n}\n\ninterface SparkplugClient extends events.EventEmitter {\n    /** MQTT client event */\n    on(event: 'connect' | 'close' | 'reconnect' | 'offline', listener: () => void): this;\n    /** MQTT client event */\n    on(event: 'error', listener: (error: Error) => void): this;\n    /** emitted when birth messages are ready to be sent*/\n    on(event: 'birth', listener: () => void): this;\n    /** emitted when a node command is received */\n    on(event: 'ncmd', listener: (payload: UPayload) => void): this;\n    /** emitted when a device command is received */\n    on(event: 'dcmd', listener: (device: string, payload: UPayload) => void): this;\n    /** emitted when a payload is received with a version unsupported by this client */\n    on(event: 'message', listener: (topic: string, payload: UPayload) => void): this;\n\n    emit(event: 'connect' | 'close' | 'reconnect' | 'offline' | 'birth'): boolean;\n    emit(event: 'error', error: Error): boolean;\n    emit(event: 'ncmd', payload: UPayload): boolean;\n    emit(event: 'dcmd', device: string, payload: UPayload): boolean;\n    emit(event: 'message', topic: string, payload: UPayload): boolean;\n}\n\nexport { UPayload };\n\n/*\n * Sparkplug Client\n */\nclass SparkplugClient extends events.EventEmitter {\n\n    // Constants\n    private readonly type_int32: number = 7;\n    private readonly type_boolean: number = 11;\n    private readonly type_string: number = 12;\n    private readonly versionB: string = \"spBv1.0\";\n\n    // Config Variables\n    private serverUrl: string;\n    private groupId: string;\n    private edgeNode: string;\n    private publishDeath: boolean;\n    private version: string;\n    private 
mqttOptions: IClientOptions;\n\n    // MQTT Client Variables\n    private bdSeq = 0;\n    private seq = 0;\n    private client: null | MqttClient = null;\n    private connecting = false;\n    private connected = false;\n\n    constructor(config: ISparkplugClientOptions) {\n        super();\n        this.groupId = getRequiredProperty(config, \"groupId\");\n        this.edgeNode = getRequiredProperty(config, \"edgeNode\");\n        this.publishDeath = getProperty(config, \"publishDeath\", false);\n        this.version = getProperty(config, \"version\", this.versionB);\n\n        // Client connection options\n        this.serverUrl = getRequiredProperty(config, \"serverUrl\");\n        const username = getRequiredProperty(config, \"username\");\n        const password = getRequiredProperty(config, \"password\");\n        const clientId = getRequiredProperty(config, \"clientId\");\n        const keepalive = getProperty(config, \"keepalive\", 5);\n        this.mqttOptions = {\n            ...config.mqttOptions || {}, // allow additional options\n            clientId,\n            clean: true,\n            keepalive,\n            reschedulePings: false,\n            connectTimeout: 30000,\n            username,\n            password,\n            will: {\n                topic: this.version + \"/\" + this.groupId + \"/NDEATH/\" + this.edgeNode,\n                payload: Buffer.from(this.encodePayload(this.getDeathPayload())),\n                qos: 0,\n                retain: false,\n            },\n        };\n\n        this.init();\n    }\n\n    // Increments a sequence number\n    private incrementSeqNum(): number {\n        if (this.seq == 256) {\n            this.seq = 0;\n        }\n        return this.seq++;\n    }\n\n    private encodePayload(payload: UPayload): Uint8Array {\n        return sparkplugbpayload.encodePayload(payload);\n    };\n\n    private decodePayload(payload: Uint8Array | Reader): UPayload {\n        return 
sparkplugbpayload.decodePayload(payload);\n    }\n\n    private addSeqNumber(payload: UPayload): void {\n        payload.seq = this.incrementSeqNum();\n    }\n\n    // Get DEATH payload\n    private getDeathPayload(): any {\n        return {\n            \"timestamp\": new Date().getTime(),\n            \"metrics\": [{\n                \"name\": \"bdSeq\",\n                \"value\": this.bdSeq,\n                \"type\": \"uint64\"\n            }]\n        }\n    }\n\n    // Publishes DEATH certificates for the edge node\n    private publishNDeath(client: MqttClient): void {\n        let payload, topic;\n\n        // Publish DEATH certificate for edge node\n        logger.info(\"Publishing Edge Node Death\");\n        payload = this.getDeathPayload();\n        topic = this.version + \"/\" + this.groupId + \"/NDEATH/\" + this.edgeNode;\n        client.publish(topic, Buffer.from(this.encodePayload(payload)));\n        this.messageAlert(\"published\", topic, payload);\n    }\n\n    // Logs a message alert to the console\n    private messageAlert(alert: string, topic: string, payload: any): void {\n        logger.debug(\"Message \" + alert);\n        logger.debug(\" topic: \" + topic);\n        logger.debug(\" payload: \" + JSON.stringify(payload));\n    }\n\n    private compressPayload(payload: Uint8Array, options?: PayloadOptions): UPayload {\n        let algorithm: NonNullable<PayloadOptions['algorithm']> | null = null,\n            compressedPayload,\n            resultPayload: UPayload = {\n                \"uuid\": compressed,\n                \"metrics\": []\n            };\n\n        logger.debug(\"Compressing payload \" + JSON.stringify(options));\n\n        // See if any options have been set\n        if (options !== undefined && options !== null) {\n            // Check algorithm\n            if (options['algorithm']) {\n                algorithm = options['algorithm'];\n            }\n        }\n\n        if (algorithm === null || algorithm.toUpperCase() 
=== \"DEFLATE\") {\n            logger.debug(\"Compressing with DEFLATE!\");\n            resultPayload.body = pako.deflate(payload);\n        } else if (algorithm.toUpperCase() === \"GZIP\") {\n            logger.debug(\"Compressing with GZIP\");\n            resultPayload.body = pako.gzip(payload);\n        } else {\n            throw new Error(\"Unknown or unsupported algorithm \" + algorithm);\n        }\n\n        // Create and add the algorithm metric if is has been specified in the options\n        if (algorithm !== null) {\n            resultPayload.metrics = [{\n                \"name\": \"algorithm\",\n                \"value\": algorithm.toUpperCase(),\n                \"type\": \"String\"\n            }];\n        }\n\n        return resultPayload;\n    }\n\n    private decompressPayload(payload: UPayload): Uint8Array {\n        let metrics = payload.metrics || [],\n            algorithm: null | NonNullable<PayloadOptions['algorithm']> = null;\n        const body = payload.body || new Uint8Array();\n\n        logger.debug(\"Decompressing payload\");\n\n        const algorithmMetric = metrics.find(m => m.name === 'algorithm');\n        if (algorithmMetric && typeof algorithmMetric.value === 'string') {\n            algorithm = algorithmMetric.value as NonNullable<PayloadOptions['algorithm']>;\n        }\n\n        if (algorithm === null || algorithm.toUpperCase() === \"DEFLATE\") {\n            logger.debug(\"Decompressing with DEFLATE!\");\n            return pako.inflate(body);\n        } else if (algorithm.toUpperCase() === \"GZIP\") {\n            logger.debug(\"Decompressing with GZIP\");\n            return pako.ungzip(body);\n        } else {\n            throw new Error(\"Unknown or unsupported algorithm \" + algorithm);\n        }\n    }\n\n    private maybeCompressPayload(payload: UPayload, options?: PayloadOptions): UPayload {\n        if (options?.compress) {\n            // Compress the payload\n            return 
this.compressPayload(this.encodePayload(payload), options);\n        } else {\n            // Don't compress the payload\n            return payload;\n        }\n    }\n\n    private maybeDecompressPayload(payload: UPayload): UPayload {\n        if (payload.uuid !== undefined && payload.uuid === compressed) {\n            // Decompress the payload\n            return this.decodePayload(this.decompressPayload(payload));\n        } else {\n            // The payload is not compressed\n            return payload;\n        }\n    }\n\n    subscribeTopic(topic: string, options: mqtt.IClientSubscribeOptions = { \"qos\": 0 }, callback?: mqtt.ClientSubscribeCallback) {\n        logger.info(\"Subscribing to topic:\", topic);\n        this.client!.subscribe(topic, options, callback);\n    }\n\n    unsubscribeTopic(topic: string, options?: any, callback?: mqtt.PacketCallback) {\n        logger.info(\"Unsubscribing topic:\", topic);\n        this.client!.unsubscribe(topic, options, callback);\n    }\n\n    // Publishes Node BIRTH certificates for the edge node\n    publishNodeBirth(payload: UPayload, options?: PayloadOptions) {\n        let topic = this.version + \"/\" + this.groupId + \"/NBIRTH/\" + this.edgeNode;\n        // Reset sequence number\n        this.seq = 0;\n        // Add seq number\n        this.addSeqNumber(payload);\n        // Add bdSeq number\n        let metrics = payload.metrics\n        if (metrics !== undefined && metrics !== null) {\n            metrics.push({\n                \"name\": \"bdSeq\",\n                \"type\": \"UInt64\",\n                \"value\": this.bdSeq\n            });\n        }\n\n        // Publish BIRTH certificate for edge node\n        logger.info(\"Publishing Edge Node Birth\");\n        let p = this.maybeCompressPayload(payload, options);\n        this.client!.publish(topic, Buffer.from(this.encodePayload(p)));\n        this.messageAlert(\"published\", topic, p);\n    }\n\n    // Publishes Node Data messages for the edge 
node\n    publishNodeData(payload: UPayload, options?: PayloadOptions) {\n        let topic = this.version + \"/\" + this.groupId + \"/NDATA/\" + this.edgeNode;\n        // Add seq number\n        this.addSeqNumber(payload);\n        // Publish\n        logger.info(\"Publishing NDATA\");\n        this.client!.publish(topic, Buffer.from(this.encodePayload(this.maybeCompressPayload(payload, options))));\n        this.messageAlert(\"published\", topic, payload);\n    }\n\n    // Publishes Node BIRTH certificates for the edge node\n    publishDeviceData(deviceId: string, payload: UPayload, options?: PayloadOptions) {\n        let topic = this.version + \"/\" + this.groupId + \"/DDATA/\" + this.edgeNode + \"/\" + deviceId;\n        // Add seq number\n        this.addSeqNumber(payload);\n        // Publish\n        logger.info(\"Publishing DDATA for device \" + deviceId);\n        this.client!.publish(topic, Buffer.from(this.encodePayload(this.maybeCompressPayload(payload, options))));\n        this.messageAlert(\"published\", topic, payload);\n    };\n\n    // Publishes Node BIRTH certificates for the edge node\n    publishDeviceBirth(deviceId: string, payload: UPayload, options?: PayloadOptions) {\n        let topic = this.version + \"/\" + this.groupId + \"/DBIRTH/\" + this.edgeNode + \"/\" + deviceId;\n        // Add seq number\n        this.addSeqNumber(payload);\n        // Publish\n        logger.info(\"Publishing DBIRTH for device \" + deviceId);\n        let p = this.maybeCompressPayload(payload, options);\n        this.client!.publish(topic, Buffer.from(this.encodePayload(p)));\n        this.messageAlert(\"published\", topic, p);\n    }\n\n    // Publishes Node BIRTH certificates for the edge node\n    publishDeviceDeath(deviceId: string, payload: UPayload) {\n        let topic = this.version + \"/\" + this.groupId + \"/DDEATH/\" + this.edgeNode + \"/\" + deviceId,\n            options = {};\n        // Add seq number\n        this.addSeqNumber(payload);\n      
  // Publish\n        logger.info(\"Publishing DDEATH for device \" + deviceId);\n        this.client!.publish(topic, Buffer.from(this.encodePayload(this.maybeCompressPayload(payload, options))));\n        this.messageAlert(\"published\", topic, payload);\n    }\n\n    stop() {\n        logger.debug(\"publishDeath: \" + this.publishDeath);\n        if (this.publishDeath) {\n            // Publish the DEATH certificate\n            this.publishNDeath(this.client!);\n        }\n        this.client!.end();\n    }\n\n    // Configures and connects the client\n    private init() {\n\n        // Connect to the MQTT server\n        this.connecting = true;\n        logger.debug(\"Attempting to connect: \" + this.serverUrl);\n        logger.debug(\"              options: \" + JSON.stringify(this.mqttOptions));\n        this.client = mqtt.connect(this.serverUrl, this.mqttOptions);\n        logger.debug(\"Finished attempting to connect\");\n\n        /*\n         * 'connect' handler\n         */\n        this.client.on('connect', () => {\n            logger.info(\"Client has connected\");\n            this.connecting = false;\n            this.connected = true;\n            this.emit(\"connect\");\n\n            // Subscribe to control/command messages for both the edge node and the attached devices\n            logger.info(\"Subscribing to control/command messages for both the edge node and the attached devices\");\n            this.client!.subscribe(this.version + \"/\" + this.groupId + \"/NCMD/\" + this.edgeNode + \"/#\", { \"qos\": 0 });\n            this.client!.subscribe(this.version + \"/\" + this.groupId + \"/DCMD/\" + this.edgeNode + \"/#\", { \"qos\": 0 });\n\n            // Emit the \"birth\" event to notify the application to send a births\n            this.emit(\"birth\");\n        });\n\n        /*\n         * 'error' handler\n         */\n        this.client.on('error', (error) => {\n            if (this.connecting) {\n                this.emit(\"error\", 
error);\n                this.client!.end();\n            }\n        });\n\n        /*\n         * 'close' handler\n         */\n        this.client.on('close', () => {\n            if (this.connected) {\n                this.connected = false;\n                this.emit(\"close\");\n            }\n        });\n\n        /*\n         * 'reconnect' handler\n         */\n        this.client.on(\"reconnect\", () => {\n            this.emit(\"reconnect\");\n        });\n\n        /*\n         * 'reconnect' handler\n         */\n        this.client.on(\"offline\", () => {\n            this.emit(\"offline\");\n        });\n\n        /*\n         * 'packetsend' handler\n         */\n        this.client.on(\"packetsend\", (packet) => {\n            logger.debug(\"packetsend: \" + packet.cmd);\n        });\n\n        /*\n         * 'packetreceive' handler\n         */\n        this.client.on(\"packetreceive\", (packet) => {\n            logger.debug(\"packetreceive: \" + packet.cmd);\n        });\n\n        /*\n         * 'message' handler\n         */\n        this.client.on('message', (topic, message) => {\n            let payload = this.maybeDecompressPayload(this.decodePayload(message)),\n                timestamp = payload.timestamp,\n                splitTopic,\n                metrics;\n\n            this.messageAlert(\"arrived\", topic, payload);\n\n            // Split the topic up into tokens\n            splitTopic = topic.split(\"/\");\n            if (splitTopic[0] === this.version\n                && splitTopic[1] === this.groupId\n                && splitTopic[2] === \"NCMD\"\n                && splitTopic[3] === this.edgeNode) {\n                // Emit the \"command\" event\n                this.emit(\"ncmd\", payload);\n            } else if (splitTopic[0] === this.version\n                && splitTopic[1] === this.groupId\n                && splitTopic[2] === \"DCMD\"\n                && splitTopic[3] === this.edgeNode) {\n                // Emit the 
\"command\" event for the given deviceId\n                this.emit(\"dcmd\", splitTopic[4], payload);\n            } else {\n                this.emit(\"message\", topic, payload);\n            }\n        });\n    }\n}\n\nexport function newClient(config: ISparkplugClientOptions): SparkplugClient {\n    return new SparkplugClient(config);\n}\n"
  },
  {
    "path": "javascript/core/sparkplug-client/package.json",
    "content": "{\n  \"name\": \"sparkplug-client\",\n  \"version\": \"3.2.4\",\n  \"description\": \"A client module for MQTT communication using the Sparkplug specification from Cirrus Link Solutions\",\n  \"main\": \"index.js\",\n  \"scripts\": {\n    \"build\": \"tsc\",\n    \"build:watch\": \"tsc --watch\",\n    \"prepack\": \"tsc\"\n  },\n  \"repository\": {\n    \"type\": \"git\",\n    \"url\": \"git+https://github.com/eclipse/tahu.git\"\n  },\n  \"keywords\": [\n    \"tahu\",\n    \"mqtt\",\n    \"sparkplug\"\n  ],\n  \"author\": \"Chad Kienle <chad.kienle@cirrus-link.com> (http://www.cirrus-link.com)\",\n  \"license\": \"EPL-2.0\",\n  \"bugs\": {\n    \"url\": \"https://github.com/eclipse/tahu/issues\"\n  },\n  \"homepage\": \"https://github.com/eclipse/tahu\",\n  \"files\": [\n    \"index.js\",\n    \"index.js.map\",\n    \"index.d.ts\",\n    \"README.md\",\n    \"LICENSE\"\n  ],\n  \"dependencies\": {\n    \"debug\": \"^4.3.4\",\n    \"mqtt\": \"^4.2.8\",\n    \"pako\": \"^2.0.4\",\n    \"sparkplug-payload\": \"^1.0.3\"\n  },\n  \"devDependencies\": {\n    \"@types/debug\": \"^4.1.7\",\n    \"@types/node\": \"^16.11.7\",\n    \"@types/pako\": \"^2.0.0\",\n    \"protobufjs\": \"^6.11.3\",\n    \"typescript\": \"^4.6.3\"\n  }\n}\n"
  },
  {
    "path": "javascript/core/sparkplug-client/tsconfig.json",
    "content": "{\n  \"compilerOptions\": {\n    \"target\": \"es5\",\n    \"module\": \"commonjs\",\n    \"esModuleInterop\": true,                             /* Emit additional JavaScript to ease support for importing CommonJS modules. This enables `allowSyntheticDefaultImports` for type compatibility. */\n    \"forceConsistentCasingInFileNames\": true,            /* Ensure that casing is correct in imports. */\n    \"strict\": true,                                      /* Enable all strict type-checking options. */\n    \"skipLibCheck\": true,                                 /* Skip type checking all .d.ts files. */\n    \"sourceMap\": true,\n    \"declaration\": true,\n    \"declarationMap\": true\n  },\n  \"include\": [\n    \"index.ts\"\n  ]\n}\n"
  },
  {
    "path": "javascript/core/sparkplug-payload/.gitignore",
    "content": "*.js\nlib/*.d.ts\nindex.d.ts\n"
  },
  {
    "path": "javascript/core/sparkplug-payload/README.md",
    "content": "Sparkplug Payload\n=========\n\nA library that provides tools for encoding and decoding payload objects using\nthe Sparkplug Google Protocol Buffer Schema described in the Sparkplug\nSpecification from Cirrus Link Solutions.\n\nhttps://s3.amazonaws.com/cirrus-link-com/Sparkplug+Topic+Namespace+and+State+ManagementV2.1+Apendix++Payload+B+format.pdf\n\n## Installation\n\n  npm install sparkplug-payload\n\n## Usage\n\nThis library supports the Sparkplug Google Protocol Buffer Schema for the\nfollowing Sparkplug namespace:\n\n* spBv1.0\n\nNote: the deprecated spAv1.0 namespace is no longer supported; `get(\"spAv1.0\")` returns `null`.\n\n### Encoding a payload\n\nHere is a code example of encoding a payload:\n\n```javascript\nvar sparkplug = require('sparkplug-payload').get(\"spBv1.0\"),\n    payload = {\n        \"timestamp\" : new Date().getTime(),\n        \"metrics\" : [\n            {\n                \"name\" : \"intMetric\",\n                \"value\" : 1,\n                \"type\" : \"Int32\"\n            }\n        ]\n    },\n    encoded = sparkplug.encodePayload(payload);\n```\n\n### Decoding a payload\n\nHere is a code example of decoding an encoded payload:\n\n```javascript\nvar decoded = sparkplug.decodePayload(encoded);\n```\n\n## Release History\n\n* 1.0.0 Initial release\n* 1.0.1 Bug fixes\n* 1.0.2 Bug fixes, added TypeScript\n* 1.0.3 Bug fixes\n\n## License\n\nCopyright (c) 2017-2023 Cirrus Link Solutions and others\n\nAll rights reserved. This program and the accompanying materials\nare made available under the terms of the Eclipse Public License v2.0\nwhich accompanies this distribution, and is available at\nhttp://www.eclipse.org/legal/epl-2.0\n\nContributors: Cirrus Link Solutions and others\n"
  },
  {
    "path": "javascript/core/sparkplug-payload/index.ts",
    "content": "/********************************************************************************\n * Copyright (c) 2017, 2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\nimport * as sparkplugbpayload from './lib/sparkplugbpayload';\n\nexport function get(namespace: string | undefined | null) {\n    if (namespace !== undefined && namespace !== null) {\n        if (namespace === \"spBv1.0\") {\n            return sparkplugbpayload;\n        }\n    }\n    return null;\n};\n"
  },
  {
    "path": "javascript/core/sparkplug-payload/lib/sparkplugbpayload.ts",
    "content": "/********************************************************************************\n * Copyright (c) 2016-2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\n\nimport * as ProtoRoot from './sparkplugPayloadProto';\nimport Long from 'long';\nimport type * as IProtoRoot from './sparkplugPayloadProto';\nimport type { Reader } from 'protobufjs';\n\nconst Payload = ProtoRoot.org.eclipse.tahu.protobuf.Payload;\nconst Template = Payload.Template;\nconst Parameter = Template.Parameter;\nconst DataSet = Payload.DataSet;\nconst DataSetValue = DataSet.DataSetValue;\nconst Row = DataSet.Row;\nconst PropertyValue = Payload.PropertyValue;\nconst PropertySet = Payload.PropertySet;\nconst PropertySetList = Payload.PropertySetList;\nconst MetaData = Payload.MetaData;\nconst Metric = Payload.Metric;\n\n// import generated interfaces\ntype IPayload = IProtoRoot.org.eclipse.tahu.protobuf.IPayload;\ntype ITemplate = IProtoRoot.org.eclipse.tahu.protobuf.Payload.ITemplate;\ntype IParameter = IProtoRoot.org.eclipse.tahu.protobuf.Payload.Template.IParameter;\ntype IDataSet = IProtoRoot.org.eclipse.tahu.protobuf.Payload.IDataSet;\ntype IDataSetValue = IProtoRoot.org.eclipse.tahu.protobuf.Payload.DataSet.IDataSetValue;\ntype IRow = IProtoRoot.org.eclipse.tahu.protobuf.Payload.DataSet.IRow;\ntype IPropertyValue = IProtoRoot.org.eclipse.tahu.protobuf.Payload.IPropertyValue;\ntype IPropertySet = IProtoRoot.org.eclipse.tahu.protobuf.Payload.IPropertySet;\ntype IPropertySetList = IProtoRoot.org.eclipse.tahu.protobuf.Payload.IPropertySetList;\ntype IMetaData = 
IProtoRoot.org.eclipse.tahu.protobuf.Payload.IMetaData;\ntype IMetric = IProtoRoot.org.eclipse.tahu.protobuf.Payload.IMetric;\n\n// \"user types\"\nexport type TypeStr = \"Int8\"\n    | \"Int16\"\n    | \"Int32\"\n    | \"Int64\"\n    | \"UInt8\"\n    | \"UInt16\"\n    | \"UInt32\"\n    | \"UInt64\"\n    | \"Float\"\n    | \"Double\"\n    | \"Boolean\"\n    | \"String\"\n    | \"DateTime\"\n    | \"Text\"\n    | \"UUID\"\n    | \"DataSet\"\n    | \"Bytes\"\n    | \"File\"\n    | \"Template\"\n    | \"PropertySet\"\n    | \"PropertySetList\";\n\nexport interface UMetric extends IMetric {\n    value: null | number | Long.Long | boolean | string | Uint8Array | UDataSet | UTemplate;\n    type: TypeStr;\n    properties?: Record<string, UPropertyValue>\n}\nexport interface UPropertyValue extends Omit<IPropertyValue, 'type'> { // TODO is the type supposed to be like the metric type in the readme?\n    value: null | number | Long.Long | boolean | string | UPropertySet | UPropertySetList;\n    type: TypeStr;\n}\nexport interface UParameter extends Omit<IParameter, 'type'> { // TODO is the type supposed to be like the metric type in the readme?\n    value: number | Long.Long | boolean | string | UPropertySet | UPropertySetList;\n    type: TypeStr;\n}\nexport interface UTemplate extends Omit<ITemplate, 'metrics' | 'parameters'> { // TODO is the type supposed to be like the metric type in the readme?\n    metrics?: UMetric[];\n    parameters?: UParameter[];\n}\nexport interface UDataSet extends Omit<IDataSet, 'types' | 'rows'> {\n    types: TypeStr[];\n    rows: UDataSetValue[][];\n}\nexport type UDataSetValue = number | Long.Long | boolean | string;\nexport type UPropertySet = Record<string, UPropertyValue>;\nexport type UPropertySetList = UPropertySet[];\nexport type UserValue = UMetric['value'] | UPropertyValue['value'] | UDataSet | UDataSetValue | UPropertySet | UPropertySetList;\nexport interface UPayload extends IPayload {\n    metrics?: UMetric[] | null;\n}\n\n/**\n * 
Sets the value of an object given its type expressed as an integer\n * \n * only used during encode functions\n */\nfunction setValue (type: number, value: UserValue, object: IMetric | IPropertyValue) {\n    // TODO not sure about type casts\n    switch (type) {\n        case 1: // Int8\n        case 2: // Int16\n        case 3: // Int32\n        case 5: // UInt8\n        case 6: // UInt16\n            object.intValue = value as number;\n            break;\n        case 4: // Int64\n        case 7: // UInt32\n        case 8: // UInt64\n        case 13: // DateTime\n            object.longValue = value as number | Long;\n            break;\n        case 9: // Float\n            object.floatValue = value as number;\n            break;\n        case 10: // Double\n            object.doubleValue = value as number;\n            break;\n        case 11: // Boolean\n            object.booleanValue = value as boolean;\n            break;\n        case 12: // String\n        case 14: // Text\n        case 15: // UUID\n            object.stringValue = value as string;\n            break;\n        case 16: // DataSet\n            (object as IMetric).datasetValue = encodeDataSet(value as UDataSet);\n            break;\n        case 17: // Bytes\n        case 18: // File\n            (object as IMetric).bytesValue = value as Uint8Array;\n            break;\n        case 19: // Template\n            (object as IMetric).templateValue = encodeTemplate(value as UTemplate);\n            break;\n        case 20: // PropertySet\n            (object as IPropertyValue).propertysetValue = encodePropertySet(value as UPropertySet);\n            break;\n        case 21: // PropertySetList\n            (object as IPropertyValue).propertysetsValue = encodePropertySetList(value as UPropertySetList);\n            break;\n    }\n}\n\n/** only used during decode functions */\nfunction getValue<T extends UserValue> (type: number | null | undefined, object: IMetric | IPropertyValue): T | undefined | null {\n    // 
TODO change type casts\n    switch (type) {\n        case 1: // Int8\n        case 2: // Int16\n        case 3: // Int32\n            return new Int32Array([object.intValue!])[0] as T;\n        case 5: // UInt8\n        case 6: // UInt16\n            return object.intValue as T;\n        case 4: // Int64\n            if (object.longValue instanceof Long) {\n                return object.longValue.toSigned() as T;\n            } else {\n                return object.longValue as T;\n            }\n        case 7: // UInt32\n            if (object.longValue instanceof Long) {\n                return object.longValue.toInt() as T;\n            } else {\n                return object.longValue as T;\n            }\n        case 8: // UInt64\n        case 13: // DateTime\n            return object.longValue! as T;\n        case 9: // Float\n            return object.floatValue! as T;\n        case 10: // Double\n            return object.doubleValue! as T;\n        case 11: // Boolean\n            return object.booleanValue! as T;\n        case 12: // String\n        case 14: // Text\n        case 15: // UUID\n            return object.stringValue! as T;\n        case 16: // DataSet\n            return decodeDataSet((object as IMetric).datasetValue!) as T;\n        case 17: // Bytes\n        case 18: // File\n            return (object as IMetric).bytesValue as T;\n        case 19: // Template\n            return decodeTemplate((object as IMetric).templateValue!) as T;\n        case 20: // PropertySet\n            return decodePropertySet((object as IPropertyValue).propertysetValue!) as T;\n        case 21:\n            return decodePropertySetList((object as IPropertyValue).propertysetsValue!) 
 as T;\n        default:\n            return null;\n    }\n}\n\nfunction isSet<T> (value: T): value is Exclude<T, null | undefined> {\n    return value !== null && value !== undefined;\n}\n\nfunction getDataSetValue (type: number | null | undefined, object: IDataSetValue): UDataSetValue {\n    switch (type) {\n        case 1: // Int8\n        case 2: // Int16\n        case 3: // Int32\n            if (isSet(object.intValue)) return new Int32Array([object.intValue])[0];\n        case 5: // UInt8\n        case 6: // UInt16\n            if (isSet(object.intValue)) return object.intValue;\n        case 7: // UInt32\n            if (object.longValue instanceof Long) return object.longValue.toInt();\n            else if (isSet(object.longValue)) return object.longValue;\n        case 4: // Int64\n        case 8: // UInt64\n        case 13: // DateTime\n            if (isSet(object.longValue)) return object.longValue;\n        case 9: // Float\n            if (isSet(object.floatValue)) return object.floatValue;\n        case 10: // Double\n            if (isSet(object.doubleValue)) return object.doubleValue;\n        case 11: // Boolean\n            if (isSet(object.booleanValue)) return object.booleanValue;\n        case 12: // String\n        case 14: // Text\n        case 15: // UUID\n            if (isSet(object.stringValue)) return object.stringValue;\n        default:\n            throw new Error(`Invalid DataSetValue: ${JSON.stringify(object)}`);\n    }\n}\n\nfunction getTemplateParamValue (type: number | null | undefined, object: IParameter): UParameter['value'] {\n    switch (type) {\n        case 1: // Int8\n        case 2: // Int16\n        case 3: // Int32\n            if (isSet(object.intValue)) return new Int32Array([object.intValue])[0];\n        case 5: // UInt8\n        case 6: // UInt16\n            if (isSet(object.intValue)) return object.intValue;\n        case 7: // UInt32\n            if (object.longValue instanceof Long) return object.longValue.toInt();\n            else if (isSet(object.longValue)) return object.longValue;\n        case 4: // Int64\n        case 8: // UInt64\n        case 13: // DateTime\n            if (isSet(object.longValue)) return object.longValue;\n        case 9: // Float\n            if (isSet(object.floatValue)) return object.floatValue;\n        case 10: // Double\n            if (isSet(object.doubleValue)) return object.doubleValue;\n        case 11: // Boolean\n            if (isSet(object.booleanValue)) return object.booleanValue;\n        case 12: // String\n        case 14: // Text\n        case 15: // UUID\n            if (isSet(object.stringValue)) return object.stringValue;\n        default:\n            throw new Error(`Invalid Parameter value: ${JSON.stringify(object)}`);\n 
   }\n}\n\n/** transforms a user friendly type and converts it to its corresponding type code */\nfunction encodeType (typeString: string): number {\n    switch (typeString.toUpperCase()) {\n        case \"INT8\":\n            return 1;\n        case \"INT16\":\n            return 2;\n        case \"INT32\":\n        case \"INT\":\n            return 3;\n        case \"INT64\":\n        case \"LONG\":\n            return 4;\n        case \"UINT8\":\n            return 5;\n        case \"UINT16\":\n            return 6;\n        case \"UINT32\":\n            return 7;\n        case \"UINT64\":\n            return 8;\n        case \"FLOAT\":\n            return 9;\n        case \"DOUBLE\":\n            return 10;\n        case \"BOOLEAN\":\n            return 11;\n        case \"STRING\":\n            return 12;\n        case \"DATETIME\":\n            return 13;\n        case \"TEXT\":\n            return 14;\n        case \"UUID\":\n            return 15;\n        case \"DATASET\":\n            return 16;\n        case \"BYTES\":\n            return 17;\n        case \"FILE\":\n            return 18;\n        case \"TEMPLATE\":\n            return 19;\n        case \"PROPERTYSET\":\n            return 20;\n        case \"PROPERTYSETLIST\":\n            return 21;\n        default:\n            return 0;\n    }\n}\n\n/** transforms a type code into a user friendly type */\n// @ts-expect-error TODO no consistent return\nfunction decodeType (typeInt: number | null | undefined): TypeStr {\n    switch (typeInt) {\n        case 1:\n            return \"Int8\";\n        case 2:\n            return \"Int16\";\n        case 3:\n            return \"Int32\";\n        case 4: \n            return \"Int64\";\n        case 5:\n            return \"UInt8\";\n        case 6:\n            return \"UInt16\";\n        case 7:\n            return \"UInt32\";\n        case 8:\n            return \"UInt64\";\n        case 9:\n            return \"Float\";\n        case 10:\n            
return \"Double\";\n        case 11:\n            return \"Boolean\";\n        case 12:\n            return \"String\";\n        case 13:\n            return \"DateTime\";\n        case 14:\n            return \"Text\";\n        case 15:\n            return \"UUID\";\n        case 16:\n            return \"DataSet\";\n        case 17:\n            return \"Bytes\";\n        case 18:\n            return \"File\";\n        case 19:\n            return \"Template\";\n        case 20:\n            return \"PropertySet\";\n        case 21:\n            return \"PropertySetList\";\n    }\n}\n\nfunction encodeTypes (typeArray: string[]): number[]  {\n    var types: number[] = [];\n    for (var i = 0; i < typeArray.length; i++) {\n        types.push(encodeType(typeArray[i]));\n    }\n    return types;\n}\n\nfunction decodeTypes (typeArray: number[]): TypeStr[] {\n    var types: TypeStr[] = [];\n    for (var i = 0; i < typeArray.length; i++) {\n        types.push(decodeType(typeArray[i]));\n    }\n    return types;\n}\n\nfunction encodeDataSet (object: UDataSet): ProtoRoot.org.eclipse.tahu.protobuf.Payload.DataSet {\n    const num = object.numOfColumns,\n        names = object.columns,\n        types = encodeTypes(object.types),\n        rows = object.rows,\n        newDataSet = DataSet.create({\n            \"numOfColumns\" : num, \n            \"columns\" : object.columns, \n            \"types\" : types \n        }),\n        newRows = [];\n    // Loop over all the rows\n    for (let i = 0; i < rows.length; i++) {\n        const newRow = Row.create(),\n            row = rows[i],\n            elements: IDataSetValue[] = [];\n        // Loop over all the elements in each row\n        // @ts-expect-error TODO check if num is set\n        for (let t = 0; t < num; t++) {\n            const newValue = DataSetValue.create();\n            setValue(types[t], row[t], newValue);\n            elements.push(newValue);\n        }\n        newRow.elements = elements;\n        
newRows.push(newRow);\n    }\n    newDataSet.rows = newRows;\n    return newDataSet;\n}\n\nfunction decodeDataSet (protoDataSet: IDataSet): UDataSet {\n    const protoTypes = protoDataSet.types!; // TODO check exists\n    const dataSet: UDataSet = {\n        types: decodeTypes(protoTypes),\n        rows: [],\n    };\n    const types = decodeTypes(protoTypes),\n        protoRows = protoDataSet.rows || [], // TODO check exists\n        num = protoDataSet.numOfColumns;\n    \n    // Loop over all the rows\n    for (var i = 0; i < protoRows.length; i++) {\n        var protoRow = protoRows[i],\n            protoElements = protoRow.elements || [], // TODO check exists\n            rowElements: UDataSetValue[] = [];\n        // Loop over all the elements in each row\n        // @ts-expect-error TODO check exists\n        for (var t = 0; t < num; t++) {\n            rowElements.push(getDataSetValue(protoTypes[t], protoElements[t])!);\n        }\n        dataSet.rows.push(rowElements);\n    }\n\n    dataSet.numOfColumns = num;\n    dataSet.types = types;\n    dataSet.columns = protoDataSet.columns;\n\n    return dataSet;\n}\n\nfunction encodeMetaData (object: IMetaData): ProtoRoot.org.eclipse.tahu.protobuf.Payload.MetaData {\n    var metadata = MetaData.create(),\n        isMultiPart = object.isMultiPart,\n        contentType = object.contentType,\n        size = object.size,\n        seq = object.seq,\n        fileName = object.fileName,\n        fileType = object.fileType,\n        md5 = object.md5,\n        description = object.description;\n\n    if (isMultiPart !== undefined && isMultiPart !== null) {\n        metadata.isMultiPart = isMultiPart;\n    }\n\n    if (contentType !== undefined && contentType !== null) {\n        metadata.contentType = contentType;\n    }\n\n    if (size !== undefined && size !== null) {\n        metadata.size = size;\n    }\n\n    if (seq !== undefined && seq !== null) {\n        metadata.seq = seq;\n    }\n\n    if (fileName !== undefined 
&& fileName !== null) {\n        metadata.fileName = fileName;\n    }\n\n    if (fileType !== undefined && fileType !== null) {\n        metadata.fileType = fileType;\n    }\n\n    if (md5 !== undefined && md5 !== null) {\n        metadata.md5 = md5;\n    }\n\n    if (description !== undefined && description !== null) {\n        metadata.description = description;\n    }\n\n    return metadata;\n}\n\nfunction decodeMetaData (protoMetaData: IMetaData): IMetaData {\n    var metadata: IMetaData = {},\n        isMultiPart = protoMetaData.isMultiPart,\n        contentType = protoMetaData.contentType,\n        size = protoMetaData.size,\n        seq = protoMetaData.seq,\n        fileName = protoMetaData.fileName,\n        fileType = protoMetaData.fileType,\n        md5 = protoMetaData.md5,\n        description = protoMetaData.description;\n\n    if (isMultiPart !== undefined && isMultiPart !== null) {\n        metadata.isMultiPart = isMultiPart;\n    }\n\n    if (contentType !== undefined && contentType !== null) {\n        metadata.contentType = contentType;\n    }\n\n    if (size !== undefined && size !== null) {\n        metadata.size = size;\n    }\n\n    if (seq !== undefined && seq !== null) {\n        metadata.seq = seq;\n    }\n\n    if (fileName !== undefined && fileName !== null) {\n        metadata.fileName = fileName;\n    }\n\n    if (fileType !== undefined && fileType !== null) {\n        metadata.fileType = fileType;\n    }\n\n    if (md5 !== undefined && md5 !== null) {\n        metadata.md5 = md5;\n    }\n\n    if (description !== undefined && description !== null) {\n        metadata.description = description;\n    }\n\n    return metadata;\n}\n\nfunction encodePropertyValue (object: UPropertyValue): ProtoRoot.org.eclipse.tahu.protobuf.Payload.PropertyValue {\n    var type = encodeType(object.type),\n        newPropertyValue = PropertyValue.create({\n            \"type\" : type\n        });\n\n    if (object.value !== undefined && object.value === null) 
{\n        newPropertyValue.isNull = true;\n    }\n\n    setValue(type, object.value, newPropertyValue);\n\n    return newPropertyValue;\n}\n\nfunction decodePropertyValue (protoValue: IPropertyValue): UPropertyValue {\n    const propertyValue: UPropertyValue = {\n        // @ts-expect-error TODO check exists\n        value: getValue(protoValue.type, protoValue),\n        type: decodeType(protoValue.type),\n    };\n\n    if (protoValue.isNull !== undefined && protoValue.isNull === true) {\n        propertyValue.value = null;\n    } else {\n        propertyValue.value = getValue(protoValue.type, protoValue)!;\n    }\n\n    propertyValue.type = decodeType(protoValue.type);\n\n    return propertyValue;\n}\n\nfunction encodePropertySet (object: Record<string, UPropertyValue>): ProtoRoot.org.eclipse.tahu.protobuf.Payload.PropertySet {\n    const keys = [],\n        values = [];\n\n    for (var key in object) {\n        if (object.hasOwnProperty(key)) {\n            keys.push(key);\n            values.push(encodePropertyValue(object[key]))  \n        }\n    }\n\n    return PropertySet.create({\n        \"keys\" : keys, \n        \"values\" : values\n    });\n}\n\nfunction decodePropertySet (protoSet: IPropertySet): Record<string, UPropertyValue> {\n    const propertySet: Record<string, UPropertyValue> = {},\n        protoKeys = protoSet.keys || [], // TODO check exists\n        protoValues = protoSet.values || []; // TODO check exists\n\n    for (var i = 0; i < protoKeys.length; i++) {\n        propertySet[protoKeys[i]] = decodePropertyValue(protoValues[i]);\n    }\n\n    return propertySet;\n}\n\nfunction encodePropertySetList (object: Record<string, UPropertyValue>[]): ProtoRoot.org.eclipse.tahu.protobuf.Payload.PropertySetList {\n    const propertySets: IPropertySet[] = [];\n    for (let i = 0; i < object.length; i++) {\n        propertySets.push(encodePropertySet(object[i]));\n    }\n    return PropertySetList.create({\n        \"propertyset\" : propertySets\n    
});\n}\n\nfunction decodePropertySetList (protoSetList: IPropertySetList): Record<string, UPropertyValue>[]  {\n    const propertySets: Record<string, UPropertyValue>[] = [],\n        protoSets = protoSetList.propertyset || []; // TODO check exists\n    for (let i = 0; i < protoSets.length; i++) {\n        propertySets.push(decodePropertySet(protoSets[i]));\n    }\n    return propertySets;\n}\n\nfunction encodeParameter (object: UParameter): ProtoRoot.org.eclipse.tahu.protobuf.Payload.Template.Parameter {\n    const type = encodeType(object.type),\n        newParameter = Parameter.create({\n            \"name\" : object.name, \n            \"type\" : type\n        });\n    setValue(type, object.value, newParameter);\n    return newParameter;\n}\n\nfunction decodeParameter (protoParameter: IParameter): UParameter {\n    const protoType = protoParameter.type,\n        parameter: UParameter = {\n            value: getTemplateParamValue(protoType, protoParameter),\n            type: decodeType(protoType),\n        };\n\n    parameter.name = protoParameter.name;\n    parameter.type = decodeType(protoType);\n    // @ts-expect-error TODO check exists\n    parameter.value = getValue(protoType, protoParameter);\n\n    return parameter;\n}\n\nfunction encodeTemplate (object: UTemplate): ProtoRoot.org.eclipse.tahu.protobuf.Payload.Template {\n    let template = Template.create(),\n        metrics = object.metrics,\n        parameters = object.parameters,\n        isDef = object.isDefinition,\n        ref = object.templateRef,\n        version = object.version;\n\n    if (version !== undefined && version !== null) {\n        template.version = version;\n    }\n\n    if (ref !== undefined && ref !== null) {\n        template.templateRef = ref;\n    }\n\n    if (isDef !== undefined && isDef !== null) {\n        template.isDefinition = isDef;\n    }\n\n    // Build up the metric\n    if (object.metrics !== undefined && object.metrics !== null) {\n        const newMetrics = []\n   
         metrics = object.metrics;\n        // loop over array of metrics\n        for (let i = 0; i < metrics.length; i++) {\n            newMetrics.push(encodeMetric(metrics[i]));\n        }\n        template.metrics = newMetrics;\n    }\n\n    // Build up the parameters\n    if (object.parameters !== undefined && object.parameters !== null) {\n        const newParameter = [];\n        // loop over array of parameters\n        for (let i = 0; i < object.parameters.length; i++) {\n            newParameter.push(encodeParameter(object.parameters[i]));\n        }\n        template.parameters = newParameter;\n    }\n\n    return template;\n}\n\nfunction decodeTemplate (protoTemplate: ITemplate): UTemplate {\n    const template: UTemplate = {},\n        protoMetrics = protoTemplate.metrics,\n        protoParameters = protoTemplate.parameters,\n        isDef = protoTemplate.isDefinition,\n        ref = protoTemplate.templateRef,\n        version = protoTemplate.version;\n\n    if (version !== undefined && version !== null) {\n        template.version = version;    \n    }\n\n    if (ref !== undefined && ref !== null) {\n        template.templateRef = ref;    \n    }\n\n    if (isDef !== undefined && isDef !== null) {\n        template.isDefinition = isDef;    \n    }\n\n    // Build up the metric\n    if (protoMetrics !== undefined && protoMetrics !== null) {\n        const metrics = []\n        // loop over array of proto metrics, decoding each one\n        for (let i = 0; i < protoMetrics.length; i++) {\n            metrics.push(decodeMetric(protoMetrics[i]));\n        }\n        template.metrics = metrics;\n    }\n\n    // Build up the parameters\n    if (protoParameters !== undefined && protoParameters !== null) {\n        const parameter: UParameter[] = [];\n        // loop over array of parameters\n        for (let i = 0; i < protoParameters.length; i++) {\n            parameter.push(decodeParameter(protoParameters[i]));\n        }\n        template.parameters = 
parameter;\n    }\n\n    return template;\n}\n\nfunction encodeMetric (metric: UMetric): ProtoRoot.org.eclipse.tahu.protobuf.Payload.Metric {\n    const newMetric = Metric.create({\n            name : metric.name\n        }),\n        value = metric.value,\n        datatype = encodeType(metric.type),\n        alias = metric.alias,\n        isHistorical = metric.isHistorical,\n        isTransient = metric.isTransient,\n        metadata = metric.metadata,\n        timestamp = metric.timestamp,\n        properties = metric.properties;\n    \n    // Get metric type and value\n    newMetric.datatype = datatype;\n    setValue(datatype, value, newMetric);\n\n    if (timestamp !== undefined && timestamp !== null) {\n        newMetric.timestamp = timestamp;\n    }\n\n    if (alias !== undefined && alias !== null) {\n        newMetric.alias = alias;\n    }\n\n    if (isHistorical !== undefined && isHistorical !== null) {\n        newMetric.isHistorical = isHistorical;\n    }\n\n    if (isTransient !== undefined && isTransient !== null) {\n        newMetric.isTransient = isTransient;\n    }\n\n    if (value !== undefined && value === null) {\n        newMetric.isNull = true;\n    }\n\n    if (metadata !== undefined && metadata !== null) {\n        newMetric.metadata = encodeMetaData(metadata);\n    }\n\n    if (properties !== undefined && properties !== null) {\n        newMetric.properties = encodePropertySet(properties);\n    }\n\n    return newMetric;\n}\n\nfunction decodeMetric (protoMetric: Partial<IMetric>): UMetric {\n    const metric: UMetric = {\n        // @ts-expect-error TODO check exists\n        value: getValue(protoMetric.datatype, protoMetric),\n        type: decodeType(protoMetric.datatype)\n    };\n\n    if (protoMetric.hasOwnProperty(\"name\")) {\n        metric.name = protoMetric.name;\n    }\n\n    if (protoMetric.hasOwnProperty(\"isNull\") && protoMetric.isNull === true) {\n        metric.value = null;\n    } else {\n        // @ts-expect-error TODO 
check exists\n        metric.value = getValue(protoMetric.datatype, protoMetric);\n    }\n\n    if (protoMetric.hasOwnProperty(\"timestamp\")) {\n        metric.timestamp = protoMetric.timestamp;\n    }\n\n    if (protoMetric.hasOwnProperty(\"alias\")) {\n        metric.alias = protoMetric.alias;\n    }\n\n    if (protoMetric.hasOwnProperty(\"isHistorical\")) {\n        metric.isHistorical = protoMetric.isHistorical;\n    }\n\n    if (protoMetric.hasOwnProperty(\"isTransient\")) {\n        metric.isTransient = protoMetric.isTransient;\n    }\n\n    if (protoMetric.hasOwnProperty(\"metadata\") && protoMetric.metadata) {\n        metric.metadata = decodeMetaData(protoMetric.metadata);\n    }\n\n    if (protoMetric.hasOwnProperty(\"properties\") && protoMetric.properties) {\n        metric.properties = decodePropertySet(protoMetric.properties);\n    }\n\n    return metric;\n}\n\nexport function encodePayload(object: UPayload): Uint8Array {\n    var payload = Payload.create({\n        \"timestamp\" : object.timestamp\n    });\n\n    // Build up the metric\n    if (object.metrics !== undefined && object.metrics !== null) {\n        var newMetrics = [],\n            metrics = object.metrics;\n        // loop over array of metric\n        for (var i = 0; i < metrics.length; i++) {\n            newMetrics.push(encodeMetric(metrics[i]));\n        }\n        payload.metrics = newMetrics;\n    }\n\n    if (object.seq !== undefined && object.seq !== null) {\n        payload.seq = object.seq;\n    }\n\n    if (object.uuid !== undefined && object.uuid !== null) {\n        payload.uuid = object.uuid;\n    }\n\n    if (object.body !== undefined && object.body !== null) {\n        payload.body = object.body;\n    }\n\n    return Payload.encode(payload).finish();\n}\n\nexport function decodePayload(proto: Uint8Array | Reader): UPayload {\n    var sparkplugPayload = Payload.decode(proto),\n        payload: UPayload = {};\n\n    if (sparkplugPayload.hasOwnProperty(\"timestamp\")) {\n  
      payload.timestamp = sparkplugPayload.timestamp;\n    }\n\n    if (sparkplugPayload.hasOwnProperty(\"metrics\")) {\n        const metrics: UMetric[] = [];\n        for (var i = 0; i < sparkplugPayload.metrics.length; i++) {\n            metrics.push(decodeMetric(sparkplugPayload.metrics[i]));\n        }\n        payload.metrics = metrics;\n    }\n\n    if (sparkplugPayload.hasOwnProperty(\"seq\")) {\n        payload.seq = sparkplugPayload.seq;\n    }\n\n    if (sparkplugPayload.hasOwnProperty(\"uuid\")) {\n        payload.uuid = sparkplugPayload.uuid;\n    }\n\n    if (sparkplugPayload.hasOwnProperty(\"body\")) {\n        payload.body = sparkplugPayload.body;\n    }\n\n    return payload;\n}\n"
  },
  {
    "path": "javascript/core/sparkplug-payload/package.json",
    "content": "{\n  \"name\": \"sparkplug-payload\",\n  \"version\": \"1.0.3\",\n  \"description\": \"A library for encoding and decoding Sparkplug payloads\",\n  \"main\": \"index.js\",\n  \"scripts\": {\n    \"build\": \"npm run build:proto && npm run build:ts\",\n    \"build:ts\": \"tsc\",\n    \"build:proto\": \"pbjs -t static-module -w commonjs -o lib/sparkplugPayloadProto.js ../../../sparkplug_b/sparkplug_b.proto && pbts -o lib/sparkplugPayloadProto.d.ts lib/sparkplugPayloadProto.js\"\n  },\n  \"repository\": {\n    \"type\": \"git\",\n    \"url\": \"git+https://github.com/eclipse/tahu.git\"\n  },\n  \"keywords\": [\n    \"tahu\",\n    \"mqtt\",\n    \"sparkplug\",\n    \"payload\"\n  ],\n  \"author\": \"Chad Kienle <chad.kienle@cirrus-link.com> (http://www.cirrus-link.com)\",\n  \"license\": \"EPL-2.0\",\n  \"bugs\": {\n    \"url\": \"https://github.com/eclipse/tahu/issues\"\n  },\n  \"homepage\": \"https://github.com/eclipse/tahu\",\n  \"files\": [\n    \"index.js\",\n    \"index.d.ts\",\n    \"lib/*.d.ts\",\n    \"lib/*.js\",\n    \"README.md\",\n    \"LICENSE\"\n  ],\n  \"dependencies\": {\n    \"@types/long\": \"^4.0.0\",\n    \"long\": \"^4.0.0\",\n    \"protobufjs\": \"^6.11.3\"\n  },\n  \"devDependencies\": {\n    \"@types/node\": \"^16.11.29\",\n    \"typescript\": \"^4.6.3\"\n  }\n}\n"
  },
  {
    "path": "javascript/core/sparkplug-payload/tsconfig.json",
    "content": "{\n  \"compilerOptions\": {\n    /* Visit https://aka.ms/tsconfig.json to read more about this file */\n\n    /* Projects */\n    // \"incremental\": true,                              /* Enable incremental compilation */\n    // \"composite\": true,                                /* Enable constraints that allow a TypeScript project to be used with project references. */\n    // \"tsBuildInfoFile\": \"./\",                          /* Specify the folder for .tsbuildinfo incremental compilation files. */\n    // \"disableSourceOfProjectReferenceRedirect\": true,  /* Disable preferring source files instead of declaration files when referencing composite projects */\n    // \"disableSolutionSearching\": true,                 /* Opt a project out of multi-project reference checking when editing. */\n    // \"disableReferencedProjectLoad\": true,             /* Reduce the number of projects loaded automatically by TypeScript. */\n\n    /* Language and Environment */\n    \"target\": \"es2016\",                                  /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */\n    // \"lib\": [],                                        /* Specify a set of bundled library declaration files that describe the target runtime environment. */\n    // \"jsx\": \"preserve\",                                /* Specify what JSX code is generated. */\n    // \"experimentalDecorators\": true,                   /* Enable experimental support for TC39 stage 2 draft decorators. */\n    // \"emitDecoratorMetadata\": true,                    /* Emit design-type metadata for decorated declarations in source files. */\n    // \"jsxFactory\": \"\",                                 /* Specify the JSX factory function used when targeting React JSX emit, e.g. 
'React.createElement' or 'h' */\n    // \"jsxFragmentFactory\": \"\",                         /* Specify the JSX Fragment reference used for fragments when targeting React JSX emit e.g. 'React.Fragment' or 'Fragment'. */\n    // \"jsxImportSource\": \"\",                            /* Specify module specifier used to import the JSX factory functions when using `jsx: react-jsx*`.` */\n    // \"reactNamespace\": \"\",                             /* Specify the object invoked for `createElement`. This only applies when targeting `react` JSX emit. */\n    // \"noLib\": true,                                    /* Disable including any library files, including the default lib.d.ts. */\n    // \"useDefineForClassFields\": true,                  /* Emit ECMAScript-standard-compliant class fields. */\n\n    /* Modules */\n    \"module\": \"commonjs\",                                /* Specify what module code is generated. */\n    // \"rootDir\": \"./\",                                  /* Specify the root folder within your source files. */\n    // \"moduleResolution\": \"node\",                       /* Specify how TypeScript looks up a file from a given module specifier. */\n    // \"baseUrl\": \"./\",                                  /* Specify the base directory to resolve non-relative module names. */\n    // \"paths\": {},                                      /* Specify a set of entries that re-map imports to additional lookup locations. */\n    // \"rootDirs\": [],                                   /* Allow multiple folders to be treated as one when resolving modules. */\n    // \"typeRoots\": [],                                  /* Specify multiple folders that act like `./node_modules/@types`. */\n    // \"types\": [],                                      /* Specify type package names to be included without being referenced in a source file. */\n    // \"allowUmdGlobalAccess\": true,                     /* Allow accessing UMD globals from modules. 
*/\n    // \"resolveJsonModule\": true,                        /* Enable importing .json files */\n    // \"noResolve\": true,                                /* Disallow `import`s, `require`s or `<reference>`s from expanding the number of files TypeScript should add to a project. */\n\n    /* JavaScript Support */\n    // \"allowJs\": true,                                  /* Allow JavaScript files to be a part of your program. Use the `checkJS` option to get errors from these files. */\n    // \"checkJs\": true,                                  /* Enable error reporting in type-checked JavaScript files. */\n    // \"maxNodeModuleJsDepth\": 1,                        /* Specify the maximum folder depth used for checking JavaScript files from `node_modules`. Only applicable with `allowJs`. */\n\n    /* Emit */\n    \"declaration\": true,                              /* Generate .d.ts files from TypeScript and JavaScript files in your project. */\n    // \"declarationMap\": true,                           /* Create sourcemaps for d.ts files. */\n    // \"emitDeclarationOnly\": true,                      /* Only output d.ts files and not JavaScript files. */\n    // \"sourceMap\": true,                                /* Create source map files for emitted JavaScript files. */\n    // \"outFile\": \"./\",                                  /* Specify a file that bundles all outputs into one JavaScript file. If `declaration` is true, also designates a file that bundles all .d.ts output. */\n    // \"outDir\": \"./\",                                   /* Specify an output folder for all emitted files. */\n    // \"removeComments\": true,                           /* Disable emitting comments. */\n    // \"noEmit\": true,                                   /* Disable emitting files from a compilation. */\n    // \"importHelpers\": true,                            /* Allow importing helper functions from tslib once per project, instead of including them per-file. 
*/\n    // \"importsNotUsedAsValues\": \"remove\",               /* Specify emit/checking behavior for imports that are only used for types */\n    // \"downlevelIteration\": true,                       /* Emit more compliant, but verbose and less performant JavaScript for iteration. */\n    // \"sourceRoot\": \"\",                                 /* Specify the root path for debuggers to find the reference source code. */\n    // \"mapRoot\": \"\",                                    /* Specify the location where debugger should locate map files instead of generated locations. */\n    // \"inlineSourceMap\": true,                          /* Include sourcemap files inside the emitted JavaScript. */\n    // \"inlineSources\": true,                            /* Include source code in the sourcemaps inside the emitted JavaScript. */\n    // \"emitBOM\": true,                                  /* Emit a UTF-8 Byte Order Mark (BOM) in the beginning of output files. */\n    // \"newLine\": \"crlf\",                                /* Set the newline character for emitting files. */\n    // \"stripInternal\": true,                            /* Disable emitting declarations that have `@internal` in their JSDoc comments. */\n    // \"noEmitHelpers\": true,                            /* Disable generating custom helper functions like `__extends` in compiled output. */\n    // \"noEmitOnError\": true,                            /* Disable emitting files if any type checking errors are reported. */\n    // \"preserveConstEnums\": true,                       /* Disable erasing `const enum` declarations in generated code. */\n    // \"declarationDir\": \"./types\",                           /* Specify the output directory for generated declaration files. */\n    // \"preserveValueImports\": true,                     /* Preserve unused imported values in the JavaScript output that would otherwise be removed. 
*/\n\n    /* Interop Constraints */\n    // \"isolatedModules\": true,                          /* Ensure that each file can be safely transpiled without relying on other imports. */\n    // \"allowSyntheticDefaultImports\": true,             /* Allow 'import x from y' when a module doesn't have a default export. */\n    \"esModuleInterop\": true,                             /* Emit additional JavaScript to ease support for importing CommonJS modules. This enables `allowSyntheticDefaultImports` for type compatibility. */\n    // \"preserveSymlinks\": true,                         /* Disable resolving symlinks to their realpath. This correlates to the same flag in node. */\n    \"forceConsistentCasingInFileNames\": true,            /* Ensure that casing is correct in imports. */\n\n    /* Type Checking */\n    \"strict\": true,                                      /* Enable all strict type-checking options. */\n    // \"noImplicitAny\": true,                            /* Enable error reporting for expressions and declarations with an implied `any` type.. */\n    // \"strictNullChecks\": true,                         /* When type checking, take into account `null` and `undefined`. */\n    // \"strictFunctionTypes\": true,                      /* When assigning functions, check to ensure parameters and the return values are subtype-compatible. */\n    // \"strictBindCallApply\": true,                      /* Check that the arguments for `bind`, `call`, and `apply` methods match the original function. */\n    // \"strictPropertyInitialization\": true,             /* Check for class properties that are declared but not set in the constructor. */\n    // \"noImplicitThis\": true,                           /* Enable error reporting when `this` is given the type `any`. */\n    // \"useUnknownInCatchVariables\": true,               /* Type catch clause variables as 'unknown' instead of 'any'. 
*/\n    // \"alwaysStrict\": true,                             /* Ensure 'use strict' is always emitted. */\n    // \"noUnusedLocals\": true,                           /* Enable error reporting when a local variables aren't read. */\n    // \"noUnusedParameters\": true,                       /* Raise an error when a function parameter isn't read */\n    // \"exactOptionalPropertyTypes\": true,               /* Interpret optional property types as written, rather than adding 'undefined'. */\n    // \"noImplicitReturns\": true,                        /* Enable error reporting for codepaths that do not explicitly return in a function. */\n    // \"noFallthroughCasesInSwitch\": true,               /* Enable error reporting for fallthrough cases in switch statements. */\n    // \"noUncheckedIndexedAccess\": true,                 /* Include 'undefined' in index signature results */\n    // \"noImplicitOverride\": true,                       /* Ensure overriding members in derived classes are marked with an override modifier. */\n    // \"noPropertyAccessFromIndexSignature\": true,       /* Enforces using indexed accessors for keys declared using an indexed type */\n    // \"allowUnusedLabels\": true,                        /* Disable error reporting for unused labels. */\n    // \"allowUnreachableCode\": true,                     /* Disable error reporting for unreachable code. */\n\n    /* Completeness */\n    // \"skipDefaultLibCheck\": true,                      /* Skip type checking .d.ts files that are included with TypeScript. */\n    \"skipLibCheck\": true                                 /* Skip type checking all .d.ts files. */\n  }\n}\n"
  },
  {
    "path": "javascript/examples/simple/example.js",
"content": "/********************************************************************************\n * Copyright (c) 2016-2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\nvar SparkplugClient = require('sparkplug-client');\n\n/*\n * Main sample function which includes the run() function for running the sample\n */\nvar sample = (function () {\n    var config = {\n            'serverUrl' : 'tcp://localhost:1883',\n            'username' : 'admin',\n            'password' : 'changeme',\n            'groupId' : 'Sparkplug Devices',\n            'edgeNode' : 'JavaScript Edge Node',\n            'clientId' : 'JavaScriptSimpleEdgeNode',\n            'version' : 'spBv1.0'\n        },\n        hwVersion = 'Emulated Hardware',\n        swVersion = 'v1.0.0',\n        deviceId = 'Emulated Device',\n        sparkplugClient,\n        publishPeriod = 5000,\n        \n    // Generates a random integer\n    randomInt = function() {\n        return 1 + Math.floor(Math.random() * 10);\n    },\n\n    // Get BIRTH payload for the edge node\n    getNodeBirthPayload = function() {\n        return {\n            \"timestamp\" : new Date().getTime(),\n            \"metrics\" : [\n                {\n                    \"name\" : \"Node Control/Rebirth\",\n                    \"type\" : \"boolean\",\n                    \"value\" : false\n                },\n                {\n                    \"name\" : \"Template1\",\n                    \"type\" : \"template\",\n                    \"value\" : {\n                        \"isDefinition\" : true,\n                        \"metrics\" : [\n                            
{ \"name\" : \"myBool\", \"value\" : false, \"type\" : \"boolean\" },\n                            { \"name\" : \"myInt\", \"value\" : 0, \"type\" : \"int\" }\n                        ],\n                        \"parameters\" : [\n                            {\n                                \"name\" : \"param1\",\n                                \"type\" : \"string\",\n                                \"value\" : \"value1\"\n                            }\n                        ]\n                    }\n                }\n            ]\n        };\n    },\n\n    // Get BIRTH payload for the device\n    getDeviceBirthPayload = function() {\n        return {\n            \"timestamp\" : new Date().getTime(),\n            \"metrics\" : [\n                { \"name\" : \"my_boolean\", \"value\" : Math.random() > 0.5, \"type\" : \"boolean\" },\n                { \"name\" : \"my_double\", \"value\" : Math.random() * 0.123456789, \"type\" : \"double\" },\n                { \"name\" : \"my_float\", \"value\" : Math.random() * 0.123, \"type\" : \"float\" },\n                { \"name\" : \"my_int\", \"value\" : randomInt(), \"type\" : \"int\" },\n                { \"name\" : \"my_long\", \"value\" : randomInt() * 214748364700, \"type\" : \"long\" },\n                { \"name\" : \"Inputs/0\", \"value\" :  true, \"type\" : \"boolean\" },\n                { \"name\" : \"Inputs/1\", \"value\" :  0, \"type\" : \"int\" },\n                { \"name\" : \"Inputs/2\", \"value\" :  1.23, \"type\" : \"float\" },\n                { \"name\" : \"Outputs/0\", \"value\" :  true, \"type\" : \"boolean\" },\n                { \"name\" : \"Outputs/1\", \"value\" :  0, \"type\" : \"int\" },\n                { \"name\" : \"Outputs/2\", \"value\" :  1.23, \"type\" : \"float\" },\n                { \"name\" : \"Properties/hw_version\", \"value\" :  hwVersion, \"type\" : \"string\" },\n                { \"name\" : \"Properties/sw_version\", \"value\" :  swVersion, \"type\" : \"string\" },\n      
          { \n                    \"name\" : \"my_dataset\",\n                    \"type\" : \"dataset\",\n                    \"value\" : {\n                        \"numOfColumns\" : 2,\n                        \"types\" : [ \"string\", \"string\" ],\n                        \"columns\" : [ \"str1\", \"str2\" ],\n                        \"rows\" : [ \n                            [ \"x\", \"a\"],\n                            [ \"y\", \"b\" ]\n                        ]\n                    }\n                },\n                {\n                    \"name\" : \"TemplateInstance1\",\n                    \"type\" : \"template\",\n                    \"value\" : {\n                        \"templateRef\" : \"Template1\",\n                        \"isDefinition\" : false,\n                        \"metrics\" : [\n                            { \"name\" : \"myBool\", \"value\" : true, \"type\" : \"boolean\" },\n                            { \"name\" : \"myInt\", \"value\" : 100, \"type\" : \"int\" }\n                        ],\n                        \"parameters\" : [\n                            {\n                                \"name\" : \"param1\",\n                                \"type\" : \"string\",\n                                \"value\" : \"value2\"\n                            }\n                        ]\n                    }\n                }\n            ]\n        };\n    },\n    \n    // Get data payload for the device\n    getDataPayload = function() {\n        return {\n            \"timestamp\" : new Date().getTime(),\n            \"metrics\" : [\n                { \"name\" : \"my_boolean\", \"value\" : Math.random() > 0.5, \"type\" : \"boolean\" },\n                { \"name\" : \"my_double\", \"value\" : Math.random() * 0.123456789, \"type\" : \"double\" },\n                { \"name\" : \"my_float\", \"value\" : Math.random() * 0.123, \"type\" : \"float\" },\n                { \"name\" : \"my_int\", \"value\" : randomInt(), \"type\" : 
\"int\" },\n                { \"name\" : \"my_long\", \"value\" : randomInt() * 214748364700, \"type\" : \"long\" }\n            ]\n        };\n    },\n    \n    // Runs the sample\n    run = function() {\n        // Create the SparkplugClient\n        sparkplugClient = SparkplugClient.newClient(config);\n        \n        // Create Incoming Message Handler\n        sparkplugClient.on('message', function(topic, payload) {\n            console.log(topic, payload);\n        })\n\n        // Create 'birth' handler\n        sparkplugClient.on('birth', function () {\n            // Publish Node BIRTH certificate\n            sparkplugClient.publishNodeBirth(getNodeBirthPayload());\n            // Publish Device BIRTH certificate\n            sparkplugClient.publishDeviceBirth(deviceId, getDeviceBirthPayload());\n        });\n\n        // Create node command handler\n        sparkplugClient.on('ncmd', function (payload) {\n            var timestamp = payload.timestamp,\n                metrics = payload.metrics;\n\n            if (metrics !== undefined && metrics !== null) {\n                for (var i = 0; i < metrics.length; i++) {\n                    var metric = metrics[i];\n                    if (metric.name == \"Node Control/Rebirth\" && metric.value) {\n                        console.log(\"Received 'Rebirth' command\");\n                        // Publish Node BIRTH certificate\n                        sparkplugClient.publishNodeBirth(getNodeBirthPayload());\n                        // Publish Device BIRTH certificate\n                        sparkplugClient.publishDeviceBirth(deviceId, getDeviceBirthPayload());\n                    }\n                }\n            }     \n        });\n        \n        // Create device command handler\n        sparkplugClient.on('dcmd', function (deviceId, payload) {\n            var timestamp = payload.timestamp,\n                metrics = payload.metrics,\n                inboundMetricMap = {},\n                
outboundMetric = [],\n                outboundPayload;\n            \n            console.log(\"Command received for device \" + deviceId);\n            \n            // Loop over the metrics and store them in a map\n            if (metrics !== undefined && metrics !== null) {\n                for (var i = 0; i < metrics.length; i++) {\n                    var metric = metrics[i];\n                    inboundMetricMap[metric.name] = metric.value;\n                }\n            }\n            if (inboundMetricMap[\"Outputs/0\"] !== undefined && inboundMetricMap[\"Outputs/0\"] !== null) {\n                console.log(\"Outputs/0: \" + inboundMetricMap[\"Outputs/0\"]);\n                outboundMetric.push({ \"name\" : \"Inputs/0\", \"value\" : inboundMetricMap[\"Outputs/0\"], \"type\" : \"boolean\" });\n                outboundMetric.push({ \"name\" : \"Outputs/0\", \"value\" : inboundMetricMap[\"Outputs/0\"], \"type\" : \"boolean\" });\n                console.log(\"Updated value for Inputs/0 \" + inboundMetricMap[\"Outputs/0\"]);\n            } else if (inboundMetricMap[\"Outputs/1\"] !== undefined && inboundMetricMap[\"Outputs/1\"] !== null) {\n                console.log(\"Outputs/1: \" + inboundMetricMap[\"Outputs/1\"]);\n                outboundMetric.push({ \"name\" : \"Inputs/1\", \"value\" : inboundMetricMap[\"Outputs/1\"], \"type\" : \"int\" });\n                outboundMetric.push({ \"name\" : \"Outputs/1\", \"value\" : inboundMetricMap[\"Outputs/1\"], \"type\" : \"int\" });\n                console.log(\"Updated value for Inputs/1 \" + inboundMetricMap[\"Outputs/1\"]);\n            } else if (inboundMetricMap[\"Outputs/2\"] !== undefined && inboundMetricMap[\"Outputs/2\"] !== null) {\n                console.log(\"Outputs/2: \" + inboundMetricMap[\"Outputs/2\"]);\n                outboundMetric.push({ \"name\" : \"Inputs/2\", \"value\" : inboundMetricMap[\"Outputs/2\"], \"type\" : \"float\" });\n                outboundMetric.push({ \"name\" : 
\"Outputs/2\", \"value\" : inboundMetricMap[\"Outputs/2\"], \"type\" : \"float\" });\n                console.log(\"Updated value for Inputs/2 \" + inboundMetricMap[\"Outputs/2\"]);\n            }\n\n            outboundPayload = {\n                    \"timestamp\" : new Date().getTime(),\n                    \"metrics\" : outboundMetric\n            };\n\n            // Publish device data\n            sparkplugClient.publishDeviceData(deviceId, outboundPayload);             \n        });\n        \n        for (var i = 1; i < 101; i++) {\n            // Set up a device data publish for i*publishPeriod milliseconds from now\n            setTimeout(function() {\n                // Publish device data\n                sparkplugClient.publishDeviceData(deviceId, getDataPayload());\n                \n                // End the client connection after the last publish\n                if (i === 100) {\n                    sparkplugClient.stop();\n                }\n            }, i*publishPeriod);\n        }\n    };\n    \n    return {run:run};\n}());\n\n// Run the sample\nsample.run();\n"
  },
  {
    "path": "javascript/examples/simple/package.json",
    "content": "{\n  \"name\": \"example\",\n  \"version\": \"1.0.1-SNAPSHOT\",\n  \"description\": \"An example Sparkplug B module written in JavaScript\",\n  \"license\": \"EPL-1.0\",\n  \"repository\": {\n    \"type\": \"git\",\n    \"url\": \"https://github.com/Cirrus-Link/Sparkplug.git\"\n  },\n  \"dependencies\": {\n    \"sparkplug-client\": \"^3.2.2\"\n  }\n}\n"
  },
  {
    "path": "nodered/examples/emulated-device.js",
"content": "/********************************************************************************\n * Copyright (c) 2016-2018 Cirrus Link Solutions and others\n *\n * This program and the accompanying materials are made available under the\n * terms of the Eclipse Public License 2.0 which is available at\n * http://www.eclipse.org/legal/epl-2.0.\n *\n * SPDX-License-Identifier: EPL-2.0\n *\n * Contributors:\n *   Cirrus Link Solutions - initial implementation\n ********************************************************************************/\nvar deviceId = \"Emulated Device\",\n    hwVersion = \"Emulated Hardware\",\n    swVersion = \"v1.0.0\",\n\n/*\n * Generates a random integer\n */\nrandomInt = function() {\n    return 1 + Math.floor(Math.random() * 10);\n};\n\ngetTopic = function(type) {\n    return deviceId + \"/\" + type;\n}\n\n/*\n * Returns the full birth payload for the emulated device\n */\ngetBirthPayload = function() {\n    return {\n        \"timestamp\" : new Date().getTime(),\n        \"metrics\" : [\n            { \n                \"name\" : \"my_boolean\", \n                \"value\" : Math.random() > 0.5, \n                \"type\" : \"boolean\",\n                \"properties\" : {\n                    \"EngUnit\" : {\n                        \"value\" : \"ChadsUnits\",\n                        \"type\" : \"string\"\n                    }\n                } \n            },\n            { \"name\" : \"my_double\", \"value\" : Math.random() * 0.123456789, \"type\" : \"double\" },\n            { \"name\" : \"my_float\", \"value\" : Math.random() * 0.123, \"type\" : \"float\" },\n            { \"name\" : \"my_int\", \"value\" : randomInt(), \"type\" : \"int\" },\n            { \"name\" : \"my_long\", \"value\" : randomInt() * 214748364700, \"type\" : \"long\" },\n            { \"name\" : \"Inputs/0\", \"value\" :  true, \"type\" : \"boolean\" },\n            { \"name\" : \"Inputs/1\", \"value\" :  0, \"type\" : \"int\" },\n            { \"name\" : 
\"Inputs/2\", \"value\" :  1.23, \"type\" : \"float\" },\n            { \"name\" : \"Outputs/0\", \"value\" :  true, \"type\" : \"boolean\" },\n            { \"name\" : \"Outputs/1\", \"value\" :  0, \"type\" : \"int\" },\n            { \"name\" : \"Outputs/2\", \"value\" :  1.23, \"type\" : \"float\" },\n            { \"name\" : \"Properties/hw_version\", \"value\" :  hwVersion, \"type\" : \"string\" },\n            { \"name\" : \"Properties/sw_version\", \"value\" :  swVersion, \"type\" : \"string\" }\n        ]\n    };\n};\n\n/*\n * Returns the death payload for the emulated device\n */\ngetDeathPayload = function() {\n    return {\n        \"timestamp\" : new Date().getTime()\n    };\n};\n\n/*\n * Returns the data payload for the device\n */\ngetDataPayload = function(msg) {\n    return {\n        \"timestamp\" : msg.payload.timestamp !== undefined ? msg.payload.timestamp : new Date().getTime(),\n        \"metrics\" : [\n            { \"name\" : \"my_boolean\", \"value\" : Math.random() > 0.5, \"type\" : \"boolean\" },\n            { \"name\" : \"my_double\", \"value\" : Math.random() * 0.123456789, \"type\" : \"double\" },\n            { \"name\" : \"my_float\", \"value\" : Math.random() * 0.123, \"type\" : \"float\" },\n            { \"name\" : \"my_int\", \"value\" : randomInt(), \"type\" : \"int\" },\n            { \"name\" : \"my_long\", \"value\" : randomInt() * 214748364700, \"type\" : \"long\" }\n            ]\n    };\n};\n\n/*\n * Process the incoming message by topic.  The following actions will be taken based on the incoming message topic:\n * \n * topic = deviceId\n *   The emulated device is receiving a DCMD. 
Process the incoming (writable) metrics and publish all changed metrics.\n *   \n * topic = rebirth\n *   A rebirth command is requested; publish the device's full metrics.\n *   \n * topic = death\n *   Publish a device death message indicating that the device has gone offline.\n *   \n * topic = timestamp\n *   Publish the default device data payload using the new timestamp.\n */\nif (msg.topic === deviceId) {\n    var metrics = msg.payload.metrics,\n        inboundMetricMap = {},\n        outboundMetric = [],\n        outboundPayload;\n\n    console.log(deviceId + \" received 'DCMD' command\");\n\n    // Loop over the metrics and store them in a map\n    if (metrics !== undefined && metrics !== null) {\n        for (var i = 0; i < metrics.length; i++) {\n            var m = metrics[i];\n            inboundMetricMap[m.name] = m.value;\n        }\n    }\n\n    if (inboundMetricMap[\"Outputs/0\"] !== undefined && inboundMetricMap[\"Outputs/0\"] !== null) {\n        console.log(\"Outputs/0: \" + inboundMetricMap[\"Outputs/0\"]);\n        outboundMetric.push({ \"name\" : \"Inputs/0\", \"value\" : inboundMetricMap[\"Outputs/0\"], \"type\" : \"boolean\" });\n        outboundMetric.push({ \"name\" : \"Outputs/0\", \"value\" : inboundMetricMap[\"Outputs/0\"], \"type\" : \"boolean\" });\n        console.log(\"Updated value for Inputs/0 \" + inboundMetricMap[\"Outputs/0\"]);\n    } else if (inboundMetricMap[\"Outputs/1\"] !== undefined && inboundMetricMap[\"Outputs/1\"] !== null) {\n        console.log(\"Outputs/1: \" + inboundMetricMap[\"Outputs/1\"]);\n        outboundMetric.push({ \"name\" : \"Inputs/1\", \"value\" : inboundMetricMap[\"Outputs/1\"], \"type\" : \"int\" });\n        outboundMetric.push({ \"name\" : \"Outputs/1\", \"value\" : inboundMetricMap[\"Outputs/1\"], \"type\" : \"int\" });\n        console.log(\"Updated value for Inputs/1 \" + inboundMetricMap[\"Outputs/1\"]);\n    } else if (inboundMetricMap[\"Outputs/2\"] !== undefined && 
inboundMetricMap[\"Outputs/2\"] !== null) {\n        console.log(\"Outputs/2: \" + inboundMetricMap[\"Outputs/2\"]);\n        outboundMetric.push({ \"name\" : \"Inputs/2\", \"value\" : inboundMetricMap[\"Outputs/2\"], \"type\" : \"float\" });\n        outboundMetric.push({ \"name\" : \"Outputs/2\", \"value\" : inboundMetricMap[\"Outputs/2\"], \"type\" : \"float\" });\n        console.log(\"Updated value for Inputs/2 \" + inboundMetricMap[\"Outputs/2\"]);\n    }\n\n    outboundPayload = {\n            \"timestamp\" : new Date().getTime(),\n            \"metrics\" : outboundMetric\n    };\n\n    return {\n        \"topic\" : getTopic(\"DDATA\"),\n        \"payload\" : outboundPayload\n    };\n    \n} else if (msg.topic === \"rebirth\") {\n    console.log(deviceId + \" received 'rebirth' command\");\n    return {\n        \"topic\" : getTopic(\"DBIRTH\"),\n        \"payload\" : getBirthPayload()\n    };\n    \n} else if (msg.topic === \"timestamp\"){\n    console.log(deviceId + \" received 'timestamp' message\");\n    return {\n        \"topic\" : getTopic(\"DDATA\"),\n        \"payload\" : getDataPayload(msg)\n    };\n} else if (msg.topic === \"death\"){\n    console.log(deviceId + \" received 'death' message\");\n    return {\n        \"topic\" : getTopic(\"DDEATH\"),\n        \"payload\" : getDeathPayload()\n    };\n}\n\nreturn null;\n"
  },
  {
    "path": "nodered/examples/package.json",
    "content": "{\n  \"name\" : \"example\",\n  \"version\" : \"1.0.0-SNAPSHOT\",\n  \"description\" : \"An example Sparkplug B Node-RED node\",\n  \"license\" : \"EPL-1.0\",\n  \"repository\": {\n    \"type\": \"git\",\n    \"url\": \"https://github.com/Cirrus-Link/Sparkplug.git\"\n  }\n}\n"
  },
  {
    "path": "notice.html",
    "content": "<?xml version=\"1.0\" encoding=\"ISO-8859-1\" ?>\n<!DOCTYPE html PUBLIC \"-//W3C//DTD XHTML 1.0 Strict//EN\" \"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd\">\n<html xmlns=\"http://www.w3.org/1999/xhtml\">\n<head>\n<meta http-equiv=\"Content-Type\" content=\"text/html; charset=ISO-8859-1\" />\n<title>Eclipse Foundation Software User Agreement</title>\n</head>\n\n<body lang=\"EN-US\">\n\t<h2>Eclipse Foundation Software User Agreement</h2>\n\t<p>November 22, 2017</p>\n\n\t<h3>Usage Of Content</h3>\n\n\t<p>THE ECLIPSE FOUNDATION MAKES AVAILABLE SOFTWARE, DOCUMENTATION,\n\t\tINFORMATION AND/OR OTHER MATERIALS FOR OPEN SOURCE PROJECTS\n\t\t(COLLECTIVELY &quot;CONTENT&quot;). USE OF THE CONTENT IS GOVERNED BY\n\t\tTHE TERMS AND CONDITIONS OF THIS AGREEMENT AND/OR THE TERMS AND\n\t\tCONDITIONS OF LICENSE AGREEMENTS OR NOTICES INDICATED OR REFERENCED\n\t\tBELOW. BY USING THE CONTENT, YOU AGREE THAT YOUR USE OF THE CONTENT IS\n\t\tGOVERNED BY THIS AGREEMENT AND/OR THE TERMS AND CONDITIONS OF ANY\n\t\tAPPLICABLE LICENSE AGREEMENTS OR NOTICES INDICATED OR REFERENCED\n\t\tBELOW. IF YOU DO NOT AGREE TO THE TERMS AND CONDITIONS OF THIS\n\t\tAGREEMENT AND THE TERMS AND CONDITIONS OF ANY APPLICABLE LICENSE\n\t\tAGREEMENTS OR NOTICES INDICATED OR REFERENCED BELOW, THEN YOU MAY NOT\n\t\tUSE THE CONTENT.</p>\n\n\t<h3>Applicable Licenses</h3>\n\n\t<p>\n\t\tUnless otherwise indicated, all Content made available by the Eclipse\n\t\tFoundation is provided to you under the terms and conditions of the\n\t\tEclipse Public License Version 2.0 (&quot;EPL&quot;). 
A copy of the\n\t\tEPL is provided with this Content and is also available at <a\n\t\t\thref=\"http://www.eclipse.org/legal/epl-2.0\">http://www.eclipse.org/legal/epl-2.0</a>.\n\t\tFor purposes of the EPL, &quot;Program&quot; will mean the Content.\n\t</p>\n\n\t<p>Content includes, but is not limited to, source code, object\n\t\tcode, documentation and other files maintained in the Eclipse\n\t\tFoundation source code repository (&quot;Repository&quot;) in software\n\t\tmodules (&quot;Modules&quot;) and made available as downloadable\n\t\tarchives (&quot;Downloads&quot;).</p>\n\n\t<ul>\n\t\t<li>Content may be structured and packaged into modules to\n\t\t\tfacilitate delivering, extending, and upgrading the Content. Typical\n\t\t\tmodules may include plug-ins (&quot;Plug-ins&quot;), plug-in\n\t\t\tfragments (&quot;Fragments&quot;), and features\n\t\t\t(&quot;Features&quot;).</li>\n\t\t<li>Each Plug-in or Fragment may be packaged as a sub-directory\n\t\t\tor JAR (Java&trade; ARchive) in a directory named\n\t\t\t&quot;plugins&quot;.</li>\n\t\t<li>A Feature is a bundle of one or more Plug-ins and/or\n\t\t\tFragments and associated material. Each Feature may be packaged as a\n\t\t\tsub-directory in a directory named &quot;features&quot;. Within a\n\t\t\tFeature, files named &quot;feature.xml&quot; may contain a list of\n\t\t\tthe names and version numbers of the Plug-ins and/or Fragments\n\t\t\tassociated with that Feature.</li>\n\t\t<li>Features may also include other Features (&quot;Included\n\t\t\tFeatures&quot;). Within a Feature, files named\n\t\t\t&quot;feature.xml&quot; may contain a list of the names and version\n\t\t\tnumbers of Included Features.</li>\n\t</ul>\n\n\t<p>The terms and conditions governing Plug-ins and Fragments should\n\t\tbe contained in files named &quot;about.html&quot;\n\t\t(&quot;Abouts&quot;). 
The terms and conditions governing Features and\n\t\tIncluded Features should be contained in files named\n\t\t&quot;license.html&quot; (&quot;Feature Licenses&quot;). Abouts and\n\t\tFeature Licenses may be located in any directory of a Download or\n\t\tModule including, but not limited to the following locations:</p>\n\n\t<ul>\n\t\t<li>The top-level (root) directory</li>\n\t\t<li>Plug-in and Fragment directories</li>\n\t\t<li>Inside Plug-ins and Fragments packaged as JARs</li>\n\t\t<li>Sub-directories of the directory named &quot;src&quot; of\n\t\t\tcertain Plug-ins</li>\n\t\t<li>Feature directories</li>\n\t</ul>\n\n\t<p>Note: if a Feature made available by the Eclipse Foundation is\n\t\tinstalled using the Provisioning Technology (as defined below), you\n\t\tmust agree to a license (&quot;Feature Update License&quot;) during\n\t\tthe installation process. If the Feature contains Included Features,\n\t\tthe Feature Update License should either provide you with the terms\n\t\tand conditions governing the Included Features or inform you where you\n\t\tcan locate them. Feature Update Licenses may be found in the\n\t\t&quot;license&quot; property of files named\n\t\t&quot;feature.properties&quot; found within a Feature. Such Abouts,\n\t\tFeature Licenses, and Feature Update Licenses contain the terms and\n\t\tconditions (or references to such terms and conditions) that govern\n\t\tyour use of the associated Content in that directory.</p>\n\n\t<p>THE ABOUTS, FEATURE LICENSES, AND FEATURE UPDATE LICENSES MAY\n\t\tREFER TO THE EPL OR OTHER LICENSE AGREEMENTS, NOTICES OR TERMS AND\n\t\tCONDITIONS. 
SOME OF THESE OTHER LICENSE AGREEMENTS MAY INCLUDE (BUT\n\t\tARE NOT LIMITED TO):</p>\n\n\t<ul>\n\t\t<li>Eclipse Public License Version 1.0 (available at <a\n\t\t\thref=\"http://www.eclipse.org/legal/epl-v10.html\">http://www.eclipse.org/legal/epl-v10.html</a>)\n\t\t</li>\n\t\t<li>Eclipse Distribution License Version 1.0 (available at <a\n\t\t\thref=\"http://www.eclipse.org/licenses/edl-v10.html\">http://www.eclipse.org/licenses/edl-v1.0.html</a>)\n\t\t</li>\n\t\t<li>Common Public License Version 1.0 (available at <a\n\t\t\thref=\"http://www.eclipse.org/legal/cpl-v10.html\">http://www.eclipse.org/legal/cpl-v10.html</a>)\n\t\t</li>\n\t\t<li>Apache Software License 1.1 (available at <a\n\t\t\thref=\"http://www.apache.org/licenses/LICENSE\">http://www.apache.org/licenses/LICENSE</a>)\n\t\t</li>\n\t\t<li>Apache Software License 2.0 (available at <a\n\t\t\thref=\"http://www.apache.org/licenses/LICENSE-2.0\">http://www.apache.org/licenses/LICENSE-2.0</a>)\n\t\t</li>\n\t\t<li>Mozilla Public License Version 1.1 (available at <a\n\t\t\thref=\"http://www.mozilla.org/MPL/MPL-1.1.html\">http://www.mozilla.org/MPL/MPL-1.1.html</a>)\n\t\t</li>\n\t</ul>\n\n\t<p>IT IS YOUR OBLIGATION TO READ AND ACCEPT ALL SUCH TERMS AND\n\t\tCONDITIONS PRIOR TO USE OF THE CONTENT. If no About, Feature License,\n\t\tor Feature Update License is provided, please contact the Eclipse\n\t\tFoundation to determine what terms and conditions govern that\n\t\tparticular Content.</p>\n\n\n\t<h3>Use of Provisioning Technology</h3>\n\n\t<p>\n\t\tThe Eclipse Foundation makes available provisioning software, examples\n\t\tof which include, but are not limited to, p2 and the Eclipse Update\n\t\tManager (&quot;Provisioning Technology&quot;) for the purpose of\n\t\tallowing users to install software, documentation, information and/or\n\t\tother materials (collectively &quot;Installable Software&quot;). 
This\n\t\tcapability is provided with the intent of allowing such users to\n\t\tinstall, extend and update Eclipse-based products. Information about\n\t\tpackaging Installable Software is available at <a\n\t\t\thref=\"http://eclipse.org/equinox/p2/repository_packaging.html\">http://eclipse.org/equinox/p2/repository_packaging.html</a>\n\t\t(&quot;Specification&quot;).\n\t</p>\n\n\t<p>You may use Provisioning Technology to allow other parties to\n\t\tinstall Installable Software. You shall be responsible for enabling\n\t\tthe applicable license agreements relating to the Installable Software\n\t\tto be presented to, and accepted by, the users of the Provisioning\n\t\tTechnology in accordance with the Specification. By using Provisioning\n\t\tTechnology in such a manner and making it available in accordance with\n\t\tthe Specification, you further acknowledge your agreement to, and the\n\t\tacquisition of all necessary rights to permit the following:</p>\n\n\t<ol>\n\t\t<li>A series of actions may occur (&quot;Provisioning\n\t\t\tProcess&quot;) in which a user may execute the Provisioning\n\t\t\tTechnology on a machine (&quot;Target Machine&quot;) with the intent\n\t\t\tof installing, extending or updating the functionality of an\n\t\t\tEclipse-based product.</li>\n\t\t<li>During the Provisioning Process, the Provisioning Technology\n\t\t\tmay cause third party Installable Software or a portion thereof to be\n\t\t\taccessed and copied to the Target Machine.</li>\n\t\t<li>Pursuant to the Specification, you will provide to the user\n\t\t\tthe terms and conditions that govern the use of the Installable\n\t\t\tSoftware (&quot;Installable Software Agreement&quot;) and such\n\t\t\tInstallable Software Agreement shall be accessed from the Target\n\t\t\tMachine in accordance with the Specification. 
Such Installable\n\t\t\tSoftware Agreement must inform the user of the terms and conditions\n\t\t\tthat govern the Installable Software and must solicit acceptance by\n\t\t\tthe end user in the manner prescribed in such Installable Software\n\t\t\tAgreement. Upon such indication of agreement by the user, the\n\t\t\tprovisioning Technology will complete installation of the Installable\n\t\t\tSoftware.</li>\n\t</ol>\n\n\t<h3>Cryptography</h3>\n\n\t<p>Content may contain encryption software. The country in which\n\t\tyou are currently may have restrictions on the import, possession, and\n\t\tuse, and/or re-export to another country, of encryption software.\n\t\tBEFORE using any encryption software, please check the country's laws,\n\t\tregulations and policies concerning the import, possession, or use,\n\t\tand re-export of encryption software, to see if this is permitted.</p>\n\n\t<p>\n\t\t<small>Java and all Java-based trademarks are trademarks of\n\t\t\tOracle Corporation in the United States, other countries, or both.</small>\n\t</p>\n</body>\n</html>\n"
  },
  {
    "path": "python/core/__init__.py",
    "content": ""
  },
  {
    "path": "python/core/array_packer.py",
"content": "\"\"\"*******************************************************************************\n * Copyright (c) 2021 Ian Craggs\n *\n * All rights reserved. This program and the accompanying materials\n * are made available under the terms of the Eclipse Public License v2.0\n * and Eclipse Distribution License v1.0 which accompany this distribution. \n *\n * The Eclipse Public License is available at \n *    https://www.eclipse.org/legal/epl-2.0/\n * and the Eclipse Distribution License is available at \n *   http://www.eclipse.org/org/documents/edl-v10.php.\n *\n * Contributors:\n *    @rahulrauki - initial Array packing and unpacking as per SparkPlug B guidelines\n *******************************************************************************\"\"\"\n\n\nimport struct\n\n#/********************************************************************************\n# * Purpose of the module is to provide helper functions for encoding and decoding\n# * of Array Types ( 22 - 34 ) according to the SparkPlug B Specification \n# *\n# * The module uses the built-in struct module for packing and unpacking of bytes\n# ********************************************************************************/\n\n# Packing template function using the in-built struct module\ndef convert_to_packed_bytes(array, format_specifier):\n    packed_bytes = struct.pack('<{}{}'.format(len(array), format_specifier), *array)\n    return packed_bytes\n\n# Functions for packing each type of array as mentioned in the SparkPlug B Specification\ndef convert_to_packed_int8_array(array):\n    return convert_to_packed_bytes(array, 'b')\n\ndef convert_to_packed_int16_array(array):\n    return convert_to_packed_bytes(array, 'h')\n\ndef convert_to_packed_int32_array(array):\n    return convert_to_packed_bytes(array, 'i')\n\ndef convert_to_packed_int64_array(array):\n    return convert_to_packed_bytes(array, 'q')\n\ndef convert_to_packed_uint8_array(array):\n    return convert_to_packed_bytes(array, 'B')\n\ndef convert_to_packed_uint16_array(array):\n    return convert_to_packed_bytes(array, 'H')\n\ndef convert_to_packed_uint32_array(array):\n    return convert_to_packed_bytes(array, 'I')\n\ndef convert_to_packed_uint64_array(array):\n    return convert_to_packed_bytes(array, 'Q')\n\ndef convert_to_packed_float_array(array):\n    return convert_to_packed_bytes(array, 'f')\n\ndef convert_to_packed_double_array(array):\n    return convert_to_packed_bytes(array, 'd')\n\ndef convert_to_packed_boolean_array(boolean_array):\n    # calculate the number of packed bytes required\n    packed_bytes_count = (len(boolean_array) + 7) // 8\n    # convert the boolean array into a packed byte string\n    packed_bytes = bytearray(packed_bytes_count)\n    for i, value in enumerate(boolean_array):\n        packed_bytes[i // 8] |= value << (i % 8)\n    # return the packed bytes preceded by a 4-byte integer representing the number of boolean values\n    return struct.pack(\"<I\", len(boolean_array)) + packed_bytes\n\ndef convert_to_packed_string_array(array):\n    # convert strings to bytes and encode to hex\n    hex_string_array = [string.encode().hex() for string in array]\n    # convert hex string to bytes and terminate with null character\n    packed_bytes = [bytes(hex_string, 'utf-8') + b'\\x00' for hex_string in hex_string_array]\n    # joining the bytes to form a null terminated byte string\n    return b''.join(packed_bytes)\n\ndef convert_to_packed_datetime_array(array):\n    # convert received epoch time to 8-byte (int64) array\n    packed_bytes = convert_to_packed_int64_array(array)\n    return packed_bytes\n\n\n\n# Un-packing template function\ndef convert_from_packed_bytes(packed_bytes, format_specifier, length):\n    return struct.unpack('<{}{}'.format(length, format_specifier), packed_bytes)\n\n# Functions for un-packing packed byte arrays for every type\ndef convert_from_packed_int8_array(packed_bytes):\n    return convert_from_packed_bytes(packed_bytes, 'b', 
len(packed_bytes))\n\ndef convert_from_packed_int16_array(packed_bytes):\n    return convert_from_packed_bytes(packed_bytes, 'h', len(packed_bytes) // 2)\n\ndef convert_from_packed_int32_array(packed_bytes):\n    return convert_from_packed_bytes(packed_bytes, 'i', len(packed_bytes) // 4)\n\ndef convert_from_packed_int64_array(packed_bytes):\n    return convert_from_packed_bytes(packed_bytes, 'q', len(packed_bytes) // 8)\n\ndef convert_from_packed_uint8_array(packed_bytes):\n    return convert_from_packed_bytes(packed_bytes, 'B', len(packed_bytes))\n\ndef convert_from_packed_uint16_array(packed_bytes):\n    return convert_from_packed_bytes(packed_bytes, 'H', len(packed_bytes) // 2)\n\ndef convert_from_packed_uint32_array(packed_bytes):\n    return convert_from_packed_bytes(packed_bytes, 'I', len(packed_bytes) // 4)\n\ndef convert_from_packed_uint64_array(packed_bytes):\n    return convert_from_packed_bytes(packed_bytes, 'Q', len(packed_bytes) // 8)\n\ndef convert_from_packed_float_array(packed_bytes):\n    return convert_from_packed_bytes(packed_bytes, 'f', len(packed_bytes) // 4)\n\ndef convert_from_packed_double_array(packed_bytes):\n    return convert_from_packed_bytes(packed_bytes, 'd', len(packed_bytes) // 8)\n\ndef convert_from_packed_boolean_array(packed_bytes):\n    # unpack the 4-byte integer representing the number of boolean values\n    boolean_count, = struct.unpack(\"<I\", packed_bytes[:4])\n    # unpack the packed bytes into a list of booleans\n    boolean_array = []\n    for i in range(boolean_count):\n        # True is represented by 1 and False by 0 in the array\n        boolean_array.append((packed_bytes[4 + i // 8] >> (i % 8)) & 1)\n    return boolean_array\n\ndef convert_from_packed_string_array(packed_bytes):\n    string_array = []\n    # packed bytes are decoded and stripped of null characters\n    decoded_hex_string = packed_bytes.decode('utf-8').split('\\x00')\n    for hex_string in decoded_hex_string:\n        # resulting hex string is 
converted to bytes and then decoded to strings\n        string_array.append(bytes.fromhex(hex_string).decode())\n    return string_array\n\ndef convert_from_packed_datetime_array(packed_bytes):\n    # unpack the packed bytes; the result is an array of epoch values\n    epoch_array = convert_from_packed_int64_array(packed_bytes)\n    # epoch milliseconds are returned as is\n    return epoch_array"
  },
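The boolean-array layout in `array_packer.py` is the least obvious of these helpers: a 4-byte little-endian count followed by the booleans packed eight per byte, LSB-first within each byte. A standalone round-trip sketch of that same scheme (function names here are illustrative, not part of the repo; consult the Sparkplug B specification for the normative layout):

```python
import struct

# Pack a list of booleans: 4-byte little-endian count, then 8 bools per byte,
# with boolean i stored in bit (i % 8) of byte (i // 8) -- LSB-first, as in
# python/core/array_packer.py.
def pack_boolean_array(values):
    packed = bytearray((len(values) + 7) // 8)
    for i, value in enumerate(values):
        packed[i // 8] |= bool(value) << (i % 8)
    return struct.pack("<I", len(values)) + bytes(packed)

# Reverse the packing: read the count, then extract each bit (as 0/1 ints,
# matching the module's behaviour).
def unpack_boolean_array(payload):
    count, = struct.unpack("<I", payload[:4])
    return [(payload[4 + i // 8] >> (i % 8)) & 1 for i in range(count)]

bools = [True, False, True, True, False, False, True, False, True]
payload = pack_boolean_array(bools)
print(len(payload))               # 4-byte count + 2 data bytes = 6
print(unpack_boolean_array(payload))  # [1, 0, 1, 1, 0, 0, 1, 0, 1]
```

Note that nine booleans occupy two data bytes; the count prefix is what lets the unpacker distinguish nine values from sixteen.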
  {
    "path": "python/core/host_session_establishment.py",
    "content": "\"\"\"*******************************************************************************\n * Copyright (c) 2021 Ian Craggs\n *\n * All rights reserved. This program and the accompanying materials\n * are made available under the terms of the Eclipse Public License v2.0\n * and Eclipse Distribution License v1.0 which accompany this distribution. \n *\n * The Eclipse Public License is available at \n *    https://www.eclipse.org/legal/epl-2.0/\n * and the Eclipse Distribution License is available at \n *   http://www.eclipse.org/org/documents/edl-v10.php.\n *\n * Contributors:\n *    Ian Craggs - initial API and implementation and/or initial documentation\n *******************************************************************************\"\"\"\n\n\nimport paho.mqtt.client as mqtt\nimport time\n\n\"\"\"\n\n\n\"\"\"\nbroker = \"localhost\"\nport = 1883\nhost_application_id = \"HOSTAPPID\"\n\ndef control_on_message(client, userdata, msg):\n    if msg.topic == \"SPARKPLUG_TCK/RESULT\":\n        print(\"*** Result ***\",  msg.payload)\n\ndef control_on_connect(client, userdata, flags, rc):\n    print(\"Control client connected with result code \"+str(rc))\n    # Subscribing in on_connect() means that if we lose the connection and\n    # reconnect then subscriptions will be renewed.\n    client.subscribe(\"SPARKPLUG_TCK/#\")\n\ndef control_on_subscribe(client, userdata, mid, granted_qos):\n    print(\"Control client subscribed\")\n    rc = client.publish(\"SPARKPLUG_TCK/TEST_CONTROL\", \"NEW host SessionEstablishment \" + host_application_id, qos=1)\n\npublished = False\ndef control_on_publish(client, userdata, mid):\n    print(\"Control client published\")\n    global published\n    published = True\n\ncontrol_client = mqtt.Client(\"sparkplug_control\")\ncontrol_client.on_connect = control_on_connect\ncontrol_client.on_subscribe = control_on_subscribe\ncontrol_client.on_publish = control_on_publish\ncontrol_client.on_message = 
control_on_message\ncontrol_client.connect(broker, port)\ncontrol_client.loop_start()\n\n# wait for publish to complete\nwhile published == False:\n    time.sleep(0.1)\n\ndef test_on_connect(client, userdata, flags, rc):\n    print(\"Test client connected with result code \"+str(rc))\n    client.subscribe(\"spAv1.0/#\")\n\ndef test_on_subscribe(client, userdata, mid, granted_qos):\n    print(\"Test client subscribed\")\n    client.publish(\"STATE/\"+host_application_id, \"ONLINE\", qos=1)\n\npublished = False\ndef test_on_publish(client, userdata, mid):\n    print(\"Test client published\")\n    global published\n    published = True\n\nclient = mqtt.Client(\"clientid\", clean_session=True)\nclient.on_connect = test_on_connect\nclient.on_subscribe = test_on_subscribe\nclient.on_publish = test_on_publish\nclient.will_set(topic=\"STATE/\"+host_application_id, payload=\"OFFLINE\", qos=1, retain=True)\nclient.connect(broker, port)\nclient.loop_start()\n\nwhile published == False:\n    time.sleep(0.1)\n\nclient.loop_stop()\n\npublished = False\ncontrol_client.publish(\"SPARKPLUG_TCK/TEST_CONTROL\", \"END TEST\")\nwhile published == False:\n    time.sleep(0.1)\n\ncontrol_client.loop_stop()\n\n\n\n\n"
  },
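The test client in `host_session_establishment.py` encodes a fixed ordering: register the OFFLINE death certificate as the MQTT Will before connecting, subscribe, and only then publish the ONLINE birth certificate. A broker-free sketch of that ordering (the recorder functions below are hypothetical stand-ins for the paho-mqtt calls, used only to make the sequence checkable):

```python
# Record the session-establishment steps the script performs with paho-mqtt.
# "HOSTAPPID" mirrors the host_application_id used in the script above.
host_application_id = "HOSTAPPID"
events = []

def will_set(topic, payload):
    events.append(("will", topic, payload))

def connect(broker, port):
    events.append(("connect", broker, port))

def subscribe(topic):
    events.append(("subscribe", topic))

def publish(topic, payload):
    events.append(("publish", topic, payload))

# Same order of operations as the test client: the Will must be in place
# before CONNECT so the broker can announce OFFLINE on an ungraceful exit,
# and ONLINE is published only once the subscriptions are active.
will_set("STATE/" + host_application_id, "OFFLINE")
connect("localhost", 1883)
subscribe("spAv1.0/#")
publish("STATE/" + host_application_id, "ONLINE")

print([e[0] for e in events])  # ['will', 'connect', 'subscribe', 'publish']
```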
  {
    "path": "python/core/readme.md",
    "content": "# To generate the base protobuf sparkplug_b Python library\nprotoc -I=../../sparkplug_b/ --python_out=. ../../sparkplug_b/sparkplug_b.proto \n"
  },
  {
    "path": "python/core/sparkplug_b.py",
"content": "#/********************************************************************************\n# * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n# *\n# * This program and the accompanying materials are made available under the\n# * terms of the Eclipse Public License 2.0 which is available at\n# * http://www.eclipse.org/legal/epl-2.0.\n# *\n# * SPDX-License-Identifier: EPL-2.0\n# *\n# * Contributors:\n# *   Cirrus Link Solutions - initial implementation\n# ********************************************************************************/\nimport sparkplug_b_pb2\nimport time\nfrom sparkplug_b_pb2 import Payload\nfrom array_packer import *\n\nseqNum = 0\nbdSeq = 0\n\nclass DataSetDataType:\n    Unknown = 0\n    Int8 = 1\n    Int16 = 2\n    Int32 = 3\n    Int64 = 4\n    UInt8 = 5\n    UInt16 = 6\n    UInt32 = 7\n    UInt64 = 8\n    Float = 9\n    Double = 10\n    Boolean = 11\n    String = 12\n    DateTime = 13\n    Text = 14\n\nclass MetricDataType:\n    Unknown = 0\n    Int8 = 1\n    Int16 = 2\n    Int32 = 3\n    Int64 = 4\n    UInt8 = 5\n    UInt16 = 6\n    UInt32 = 7\n    UInt64 = 8\n    Float = 9\n    Double = 10\n    Boolean = 11\n    String = 12\n    DateTime = 13\n    Text = 14\n    UUID = 15\n    DataSet = 16\n    Bytes = 17\n    File = 18\n    Template = 19\n    # Array types ( 22 - 34 ) per the SparkPlug B Specification; these are\n    # referenced by addMetric() and addNullMetric() below\n    Int8Array = 22\n    Int16Array = 23\n    Int32Array = 24\n    Int64Array = 25\n    UInt8Array = 26\n    UInt16Array = 27\n    UInt32Array = 28\n    UInt64Array = 29\n    FloatArray = 30\n    DoubleArray = 31\n    BooleanArray = 32\n    StringArray = 33\n    DateTimeArray = 34\n\nclass ParameterDataType:\n    Unknown = 0\n    Int8 = 1\n    Int16 = 2\n    Int32 = 3\n    Int64 = 4\n    UInt8 = 5\n    UInt16 = 6\n    UInt32 = 7\n    UInt64 = 8\n    Float = 9\n    Double = 10\n    Boolean = 11\n    String = 12\n    DateTime = 13\n    Text = 14\n\n######################################################################\n# Always request this before requesting the Node Birth 
Payload\n######################################################################\ndef getNodeDeathPayload():\n    payload = sparkplug_b_pb2.Payload()\n    addMetric(payload, \"bdSeq\", None, MetricDataType.Int64, getBdSeqNum())\n    return payload\n######################################################################\n\n######################################################################\n# Always request this after requesting the Node Death Payload\n######################################################################\ndef getNodeBirthPayload():\n    global seqNum\n    seqNum = 0\n    payload = sparkplug_b_pb2.Payload()\n    payload.timestamp = int(round(time.time() * 1000))\n    payload.seq = getSeqNum()\n    addMetric(payload, \"bdSeq\", None, MetricDataType.Int64, bdSeq - 1)\n    return payload\n######################################################################\n\n######################################################################\n# Get the DBIRTH payload\n######################################################################\ndef getDeviceBirthPayload():\n    payload = sparkplug_b_pb2.Payload()\n    payload.timestamp = int(round(time.time() * 1000))\n    payload.seq = getSeqNum()\n    return payload\n######################################################################\n\n######################################################################\n# Get a DDATA payload\n######################################################################\ndef getDdataPayload():\n    return getDeviceBirthPayload()\n######################################################################\n\n######################################################################\n# Helper method for adding dataset metrics to a payload\n######################################################################\ndef initDatasetMetric(payload, name, alias, columns, types):\n    metric = payload.metrics.add()\n    if name is not None:\n        metric.name = name\n    if alias is not None:\n        
metric.alias = alias\n    metric.timestamp = int(round(time.time() * 1000))\n    metric.datatype = MetricDataType.DataSet\n\n    # Set up the dataset\n    metric.dataset_value.num_of_columns = len(types)\n    metric.dataset_value.columns.extend(columns)\n    metric.dataset_value.types.extend(types)\n    return metric.dataset_value\n######################################################################\n\n######################################################################\n# Helper method for adding dataset metrics to a payload\n######################################################################\ndef initTemplateMetric(payload, name, alias, templateRef):\n    metric = payload.metrics.add()\n    if name is not None:\n        metric.name = name\n    if alias is not None:\n        metric.alias = alias\n    metric.timestamp = int(round(time.time() * 1000))\n    metric.datatype = MetricDataType.Template\n\n    # Set up the template\n    if templateRef is not None:\n        metric.template_value.template_ref = templateRef\n        metric.template_value.is_definition = False\n    else:\n        metric.template_value.is_definition = True\n\n    return metric.template_value\n######################################################################\n\n######################################################################\n# Helper method for adding metrics to a container which can be a\n# payload or a template with a timestamp\n######################################################################\n#def addMetric(container, name, alias, type, value):\n#    metric.timestamp = int(round(time.time() * 1000))\n#    return addMetric(container, name, alias, type, value, timestamp)\n\n######################################################################\n# Helper method for adding metrics to a container which can be a\n# payload or a template\n######################################################################\ndef addMetric(container, name, alias, type, value, 
timestamp=None):\n    if timestamp is None:\n        # default the timestamp per call; a default argument expression\n        # would be evaluated only once, at import time\n        timestamp = int(round(time.time() * 1000))\n    metric = container.metrics.add()\n    if name is not None:\n        metric.name = name\n    if alias is not None:\n        metric.alias = alias\n    metric.timestamp = timestamp\n\n    # print( \"Type: \" + str(type))\n\n    if type == MetricDataType.Int8:\n        metric.datatype = MetricDataType.Int8\n        if value < 0:\n            value = value + 2**8\n        metric.int_value = value\n    elif type == MetricDataType.Int16:\n        metric.datatype = MetricDataType.Int16\n        if value < 0:\n            value = value + 2**16\n        metric.int_value = value\n    elif type == MetricDataType.Int32:\n        metric.datatype = MetricDataType.Int32\n        if value < 0:\n            value = value + 2**32\n        metric.int_value = value\n    elif type == MetricDataType.Int64:\n        metric.datatype = MetricDataType.Int64\n        if value < 0:\n            value = value + 2**64\n        metric.long_value = value\n    elif type == MetricDataType.UInt8:\n        metric.datatype = MetricDataType.UInt8\n        metric.int_value = value\n    elif type == MetricDataType.UInt16:\n        metric.datatype = MetricDataType.UInt16\n        metric.int_value = value\n    elif type == MetricDataType.UInt32:\n        metric.datatype = MetricDataType.UInt32\n        metric.int_value = value\n    elif type == MetricDataType.UInt64:\n        metric.datatype = MetricDataType.UInt64\n        metric.long_value = value\n    elif type == MetricDataType.Float:\n        metric.datatype = MetricDataType.Float\n        metric.float_value = value\n    elif type == MetricDataType.Double:\n        metric.datatype = MetricDataType.Double\n        metric.double_value = value\n    elif type == MetricDataType.Boolean:\n        metric.datatype = MetricDataType.Boolean\n        metric.boolean_value = value\n    elif type == MetricDataType.String:\n        metric.datatype = MetricDataType.String\n        metric.string_value = value\n    elif 
type == MetricDataType.DateTime:\n        metric.datatype = MetricDataType.DateTime\n        metric.long_value = value\n    elif type == MetricDataType.Text:\n        metric.datatype = MetricDataType.Text\n        metric.string_value = value\n    elif type == MetricDataType.UUID:\n        metric.datatype = MetricDataType.UUID\n        metric.string_value = value\n    elif type == MetricDataType.Bytes:\n        metric.datatype = MetricDataType.Bytes\n        metric.bytes_value = value\n    elif type == MetricDataType.File:\n        metric.datatype = MetricDataType.File\n        metric.bytes_value = value\n    elif type == MetricDataType.Template:\n        metric.datatype = MetricDataType.Template\n        metric.template_value = value\n    elif type == MetricDataType.Int8Array:\n        metric.datatype = MetricDataType.Int8Array\n        metric.bytes_value = convert_to_packed_int8_array(value)\n    elif type == MetricDataType.Int16Array:\n        metric.datatype = MetricDataType.Int16Array\n        metric.bytes_value = convert_to_packed_int16_array(value)\n    elif type == MetricDataType.Int32Array:\n        metric.datatype = MetricDataType.Int32Array\n        metric.bytes_value = convert_to_packed_int32_array(value)\n    elif type == MetricDataType.Int64Array:\n        metric.datatype = MetricDataType.Int64Array\n        metric.bytes_value = convert_to_packed_int64_array(value)\n    elif type == MetricDataType.UInt8Array:\n        metric.datatype = MetricDataType.UInt8Array\n        metric.bytes_value = convert_to_packed_uint8_array(value)\n    elif type == MetricDataType.UInt16Array:\n        metric.datatype = MetricDataType.UInt16Array\n        metric.bytes_value = convert_to_packed_uint16_array(value)\n    elif type == MetricDataType.UInt32Array:\n        metric.datatype = MetricDataType.UInt32Array\n        metric.bytes_value = convert_to_packed_uint32_array(value)\n    elif type == MetricDataType.UInt64Array:\n        metric.datatype = 
MetricDataType.UInt64Array\n        metric.bytes_value = convert_to_packed_uint64_array(value)\n    elif type == MetricDataType.FloatArray:\n        metric.datatype = MetricDataType.FloatArray\n        metric.bytes_value = convert_to_packed_float_array(value)\n    elif type == MetricDataType.DoubleArray:\n        metric.datatype = MetricDataType.DoubleArray\n        metric.bytes_value = convert_to_packed_double_array(value)\n    elif type == MetricDataType.BooleanArray:\n        metric.datatype = MetricDataType.BooleanArray\n        metric.bytes_value = convert_to_packed_boolean_array(value)\n    elif type == MetricDataType.StringArray:\n        metric.datatype = MetricDataType.StringArray\n        metric.bytes_value = convert_to_packed_string_array(value)\n    elif type == MetricDataType.DateTimeArray:\n        metric.datatype = MetricDataType.DateTimeArray\n        metric.bytes_value = convert_to_packed_datetime_array(value)\n    else:\n        print( \"Invalid: \" + str(type))\n\n    # Return the metric\n    return metric\n######################################################################\n\n######################################################################\n# Helper method for adding metrics to a container which can be a\n# payload or a template\n######################################################################\ndef addHistoricalMetric(container, name, alias, type, value):\n    metric = addMetric(container, name, alias, type, value)\n    metric.is_historical = True\n\n    # Return the metric\n    return metric\n######################################################################\n\n######################################################################\n# Helper method for adding metrics to a container which can be a\n# payload or a template\n######################################################################\ndef addNullMetric(container, name, alias, type):\n    metric = container.metrics.add()\n    if name is not None:\n        metric.name = 
name\n    if alias is not None:\n        metric.alias = alias\n    metric.timestamp = int(round(time.time() * 1000))\n    metric.is_null = True\n\n    # print( \"Type: \" + str(type))\n\n    if type == MetricDataType.Int8:\n        metric.datatype = MetricDataType.Int8\n    elif type == MetricDataType.Int16:\n        metric.datatype = MetricDataType.Int16\n    elif type == MetricDataType.Int32:\n        metric.datatype = MetricDataType.Int32\n    elif type == MetricDataType.Int64:\n        metric.datatype = MetricDataType.Int64\n    elif type == MetricDataType.UInt8:\n        metric.datatype = MetricDataType.UInt8\n    elif type == MetricDataType.UInt16:\n        metric.datatype = MetricDataType.UInt16\n    elif type == MetricDataType.UInt32:\n        metric.datatype = MetricDataType.UInt32\n    elif type == MetricDataType.UInt64:\n        metric.datatype = MetricDataType.UInt64\n    elif type == MetricDataType.Float:\n        metric.datatype = MetricDataType.Float\n    elif type == MetricDataType.Double:\n        metric.datatype = MetricDataType.Double\n    elif type == MetricDataType.Boolean:\n        metric.datatype = MetricDataType.Boolean\n    elif type == MetricDataType.String:\n        metric.datatype = MetricDataType.String\n    elif type == MetricDataType.DateTime:\n        metric.datatype = MetricDataType.DateTime\n    elif type == MetricDataType.Text:\n        metric.datatype = MetricDataType.Text\n    elif type == MetricDataType.UUID:\n        metric.datatype = MetricDataType.UUID\n    elif type == MetricDataType.Bytes:\n        metric.datatype = MetricDataType.Bytes\n    elif type == MetricDataType.File:\n        metric.datatype = MetricDataType.File\n    elif type == MetricDataType.Template:\n        metric.datatype = MetricDataType.Template\n    elif type == MetricDataType.Int8Array:\n        metric.datatype = MetricDataType.Int8Array\n    elif type == MetricDataType.Int16Array:\n        metric.datatype = MetricDataType.Int16Array\n    elif type == 
MetricDataType.Int32Array:\n        metric.datatype = MetricDataType.Int32Array\n    elif type == MetricDataType.Int64Array:\n        metric.datatype = MetricDataType.Int64Array\n    elif type == MetricDataType.UInt8Array:\n        metric.datatype = MetricDataType.UInt8Array\n    elif type == MetricDataType.UInt16Array:\n        metric.datatype = MetricDataType.UInt16Array\n    elif type == MetricDataType.UInt32Array:\n        metric.datatype = MetricDataType.UInt32Array\n    elif type == MetricDataType.UInt64Array:\n        metric.datatype = MetricDataType.UInt64Array\n    elif type == MetricDataType.FloatArray:\n        metric.datatype = MetricDataType.FloatArray\n    elif type == MetricDataType.DoubleArray:\n        metric.datatype = MetricDataType.DoubleArray\n    elif type == MetricDataType.BooleanArray:\n        metric.datatype = MetricDataType.BooleanArray\n    elif type == MetricDataType.StringArray:\n        metric.datatype = MetricDataType.StringArray\n    elif type == MetricDataType.DateTimeArray:\n        metric.datatype = MetricDataType.DateTimeArray\n    else:\n        print( \"Invalid: \" + str(type))\n\n    # Return the metric\n    return metric\n######################################################################\n\n######################################################################\n# Helper method for getting the next sequence number\n######################################################################\ndef getSeqNum():\n    global seqNum\n    retVal = seqNum\n    # print(\"seqNum: \" + str(retVal))\n    seqNum += 1\n    if seqNum == 256:\n        seqNum = 0\n    return retVal\n######################################################################\n\n######################################################################\n# Helper method for getting the next birth/death sequence number\n######################################################################\ndef getBdSeqNum():\n    global bdSeq\n    retVal = bdSeq\n    # print(\"bdSeqNum: \" 
+ str(retVal))\n    bdSeq += 1\n    if bdSeq == 256:\n        bdSeq = 0\n    return retVal\n######################################################################\n"
  },
  {
    "path": "python/core/sparkplug_b_pb2.py",
    "content": "# Generated by the protocol buffer compiler.  DO NOT EDIT!\n# source: sparkplug_b.proto\n\nimport sys\n_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))\nfrom google.protobuf import descriptor as _descriptor\nfrom google.protobuf import message as _message\nfrom google.protobuf import reflection as _reflection\nfrom google.protobuf import symbol_database as _symbol_database\nfrom google.protobuf import descriptor_pb2\n# @@protoc_insertion_point(imports)\n\n_sym_db = _symbol_database.Default()\n\n\n\n\nDESCRIPTOR = _descriptor.FileDescriptor(\n  name='sparkplug_b.proto',\n  package='org.eclipse.tahu.protobuf',\n  syntax='proto2',\n  serialized_pb=_b('\\n\\x11sparkplug_b.proto\\x12\\x19org.eclipse.tahu.protobuf\\\"\\xee\\x15\\n\\x07Payload\\x12\\x11\\n\\ttimestamp\\x18\\x01 \\x01(\\x04\\x12:\\n\\x07metrics\\x18\\x02 \\x03(\\x0b\\x32).org.eclipse.tahu.protobuf.Payload.Metric\\x12\\x0b\\n\\x03seq\\x18\\x03 \\x01(\\x04\\x12\\x0c\\n\\x04uuid\\x18\\x04 \\x01(\\t\\x12\\x0c\\n\\x04\\x62ody\\x18\\x05 \\x01(\\x0c\\x1a\\xa6\\x04\\n\\x08Template\\x12\\x0f\\n\\x07version\\x18\\x01 \\x01(\\t\\x12:\\n\\x07metrics\\x18\\x02 \\x03(\\x0b\\x32).org.eclipse.tahu.protobuf.Payload.Metric\\x12I\\n\\nparameters\\x18\\x03 \\x03(\\x0b\\x32\\x35.org.eclipse.tahu.protobuf.Payload.Template.Parameter\\x12\\x14\\n\\x0ctemplate_ref\\x18\\x04 \\x01(\\t\\x12\\x15\\n\\ris_definition\\x18\\x05 \\x01(\\x08\\x1a\\xca\\x02\\n\\tParameter\\x12\\x0c\\n\\x04name\\x18\\x01 \\x01(\\t\\x12\\x0c\\n\\x04type\\x18\\x02 \\x01(\\r\\x12\\x13\\n\\tint_value\\x18\\x03 \\x01(\\rH\\x00\\x12\\x14\\n\\nlong_value\\x18\\x04 \\x01(\\x04H\\x00\\x12\\x15\\n\\x0b\\x66loat_value\\x18\\x05 \\x01(\\x02H\\x00\\x12\\x16\\n\\x0c\\x64ouble_value\\x18\\x06 \\x01(\\x01H\\x00\\x12\\x17\\n\\rboolean_value\\x18\\x07 \\x01(\\x08H\\x00\\x12\\x16\\n\\x0cstring_value\\x18\\x08 \\x01(\\tH\\x00\\x12h\\n\\x0f\\x65xtension_value\\x18\\t 
\\x01(\\x0b\\x32M.org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtensionH\\x00\\x1a#\\n\\x17ParameterValueExtension*\\x08\\x08\\x01\\x10\\x80\\x80\\x80\\x80\\x02\\x42\\x07\\n\\x05value*\\x08\\x08\\x06\\x10\\x80\\x80\\x80\\x80\\x02\\x1a\\x97\\x04\\n\\x07\\x44\\x61taSet\\x12\\x16\\n\\x0enum_of_columns\\x18\\x01 \\x01(\\x04\\x12\\x0f\\n\\x07\\x63olumns\\x18\\x02 \\x03(\\t\\x12\\r\\n\\x05types\\x18\\x03 \\x03(\\r\\x12<\\n\\x04rows\\x18\\x04 \\x03(\\x0b\\x32..org.eclipse.tahu.protobuf.Payload.DataSet.Row\\x1a\\xaf\\x02\\n\\x0c\\x44\\x61taSetValue\\x12\\x13\\n\\tint_value\\x18\\x01 \\x01(\\rH\\x00\\x12\\x14\\n\\nlong_value\\x18\\x02 \\x01(\\x04H\\x00\\x12\\x15\\n\\x0b\\x66loat_value\\x18\\x03 \\x01(\\x02H\\x00\\x12\\x16\\n\\x0c\\x64ouble_value\\x18\\x04 \\x01(\\x01H\\x00\\x12\\x17\\n\\rboolean_value\\x18\\x05 \\x01(\\x08H\\x00\\x12\\x16\\n\\x0cstring_value\\x18\\x06 \\x01(\\tH\\x00\\x12h\\n\\x0f\\x65xtension_value\\x18\\x07 \\x01(\\x0b\\x32M.org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtensionH\\x00\\x1a!\\n\\x15\\x44\\x61taSetValueExtension*\\x08\\x08\\x01\\x10\\x80\\x80\\x80\\x80\\x02\\x42\\x07\\n\\x05value\\x1aZ\\n\\x03Row\\x12I\\n\\x08\\x65lements\\x18\\x01 \\x03(\\x0b\\x32\\x37.org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue*\\x08\\x08\\x02\\x10\\x80\\x80\\x80\\x80\\x02*\\x08\\x08\\x05\\x10\\x80\\x80\\x80\\x80\\x02\\x1a\\xe9\\x03\\n\\rPropertyValue\\x12\\x0c\\n\\x04type\\x18\\x01 \\x01(\\r\\x12\\x0f\\n\\x07is_null\\x18\\x02 \\x01(\\x08\\x12\\x13\\n\\tint_value\\x18\\x03 \\x01(\\rH\\x00\\x12\\x14\\n\\nlong_value\\x18\\x04 \\x01(\\x04H\\x00\\x12\\x15\\n\\x0b\\x66loat_value\\x18\\x05 \\x01(\\x02H\\x00\\x12\\x16\\n\\x0c\\x64ouble_value\\x18\\x06 \\x01(\\x01H\\x00\\x12\\x17\\n\\rboolean_value\\x18\\x07 \\x01(\\x08H\\x00\\x12\\x16\\n\\x0cstring_value\\x18\\x08 \\x01(\\tH\\x00\\x12K\\n\\x11propertyset_value\\x18\\t 
\\x01(\\x0b\\x32..org.eclipse.tahu.protobuf.Payload.PropertySetH\\x00\\x12P\\n\\x12propertysets_value\\x18\\n \\x01(\\x0b\\x32\\x32.org.eclipse.tahu.protobuf.Payload.PropertySetListH\\x00\\x12\\x62\\n\\x0f\\x65xtension_value\\x18\\x0b \\x01(\\x0b\\x32G.org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtensionH\\x00\\x1a\\\"\\n\\x16PropertyValueExtension*\\x08\\x08\\x01\\x10\\x80\\x80\\x80\\x80\\x02\\x42\\x07\\n\\x05value\\x1ag\\n\\x0bPropertySet\\x12\\x0c\\n\\x04keys\\x18\\x01 \\x03(\\t\\x12@\\n\\x06values\\x18\\x02 \\x03(\\x0b\\x32\\x30.org.eclipse.tahu.protobuf.Payload.PropertyValue*\\x08\\x08\\x03\\x10\\x80\\x80\\x80\\x80\\x02\\x1a`\\n\\x0fPropertySetList\\x12\\x43\\n\\x0bpropertyset\\x18\\x01 \\x03(\\x0b\\x32..org.eclipse.tahu.protobuf.Payload.PropertySet*\\x08\\x08\\x02\\x10\\x80\\x80\\x80\\x80\\x02\\x1a\\xa4\\x01\\n\\x08MetaData\\x12\\x15\\n\\ris_multi_part\\x18\\x01 \\x01(\\x08\\x12\\x14\\n\\x0c\\x63ontent_type\\x18\\x02 \\x01(\\t\\x12\\x0c\\n\\x04size\\x18\\x03 \\x01(\\x04\\x12\\x0b\\n\\x03seq\\x18\\x04 \\x01(\\x04\\x12\\x11\\n\\tfile_name\\x18\\x05 \\x01(\\t\\x12\\x11\\n\\tfile_type\\x18\\x06 \\x01(\\t\\x12\\x0b\\n\\x03md5\\x18\\x07 \\x01(\\t\\x12\\x13\\n\\x0b\\x64\\x65scription\\x18\\x08 \\x01(\\t*\\x08\\x08\\t\\x10\\x80\\x80\\x80\\x80\\x02\\x1a\\xbf\\x05\\n\\x06Metric\\x12\\x0c\\n\\x04name\\x18\\x01 \\x01(\\t\\x12\\r\\n\\x05\\x61lias\\x18\\x02 \\x01(\\x04\\x12\\x11\\n\\ttimestamp\\x18\\x03 \\x01(\\x04\\x12\\x10\\n\\x08\\x64\\x61tatype\\x18\\x04 \\x01(\\r\\x12\\x15\\n\\ris_historical\\x18\\x05 \\x01(\\x08\\x12\\x14\\n\\x0cis_transient\\x18\\x06 \\x01(\\x08\\x12\\x0f\\n\\x07is_null\\x18\\x07 \\x01(\\x08\\x12=\\n\\x08metadata\\x18\\x08 \\x01(\\x0b\\x32+.org.eclipse.tahu.protobuf.Payload.MetaData\\x12\\x42\\n\\nproperties\\x18\\t \\x01(\\x0b\\x32..org.eclipse.tahu.protobuf.Payload.PropertySet\\x12\\x13\\n\\tint_value\\x18\\n \\x01(\\rH\\x00\\x12\\x14\\n\\nlong_value\\x18\\x0b \\x01(\\x04H\\x00\\x12\\x15\\n\\x0b\\x66loat_value\\x18\\x0c 
\\x01(\\x02H\\x00\\x12\\x16\\n\\x0c\\x64ouble_value\\x18\\r \\x01(\\x01H\\x00\\x12\\x17\\n\\rboolean_value\\x18\\x0e \\x01(\\x08H\\x00\\x12\\x16\\n\\x0cstring_value\\x18\\x0f \\x01(\\tH\\x00\\x12\\x15\\n\\x0b\\x62ytes_value\\x18\\x10 \\x01(\\x0cH\\x00\\x12\\x43\\n\\rdataset_value\\x18\\x11 \\x01(\\x0b\\x32*.org.eclipse.tahu.protobuf.Payload.DataSetH\\x00\\x12\\x45\\n\\x0etemplate_value\\x18\\x12 \\x01(\\x0b\\x32+.org.eclipse.tahu.protobuf.Payload.TemplateH\\x00\\x12Y\\n\\x0f\\x65xtension_value\\x18\\x13 \\x01(\\x0b\\x32>.org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtensionH\\x00\\x1a \\n\\x14MetricValueExtension*\\x08\\x08\\x01\\x10\\x80\\x80\\x80\\x80\\x02\\x42\\x07\\n\\x05value*\\x08\\x08\\x06\\x10\\x80\\x80\\x80\\x80\\x02\\x42,\\n\\x19org.eclipse.tahu.protobufB\\x0fSparkplugBProto')\n)\n_sym_db.RegisterFileDescriptor(DESCRIPTOR)\n\n\n\n\n_PAYLOAD_TEMPLATE_PARAMETER_PARAMETERVALUEEXTENSION = _descriptor.Descriptor(\n  name='ParameterValueExtension',\n  full_name='org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=True,\n  syntax='proto2',\n  extension_ranges=[(1, 536870912), ],\n  oneofs=[\n  ],\n  serialized_start=677,\n  serialized_end=712,\n)\n\n_PAYLOAD_TEMPLATE_PARAMETER = _descriptor.Descriptor(\n  name='Parameter',\n  full_name='org.eclipse.tahu.protobuf.Payload.Template.Parameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='name', full_name='org.eclipse.tahu.protobuf.Payload.Template.Parameter.name', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    
_descriptor.FieldDescriptor(\n      name='type', full_name='org.eclipse.tahu.protobuf.Payload.Template.Parameter.type', index=1,\n      number=2, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='int_value', full_name='org.eclipse.tahu.protobuf.Payload.Template.Parameter.int_value', index=2,\n      number=3, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='long_value', full_name='org.eclipse.tahu.protobuf.Payload.Template.Parameter.long_value', index=3,\n      number=4, type=4, cpp_type=4, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='float_value', full_name='org.eclipse.tahu.protobuf.Payload.Template.Parameter.float_value', index=4,\n      number=5, type=2, cpp_type=6, label=1,\n      has_default_value=False, default_value=float(0),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='double_value', full_name='org.eclipse.tahu.protobuf.Payload.Template.Parameter.double_value', index=5,\n      number=6, type=1, cpp_type=5, label=1,\n      has_default_value=False, default_value=float(0),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='boolean_value', 
full_name='org.eclipse.tahu.protobuf.Payload.Template.Parameter.boolean_value', index=6,\n      number=7, type=8, cpp_type=7, label=1,\n      has_default_value=False, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='string_value', full_name='org.eclipse.tahu.protobuf.Payload.Template.Parameter.string_value', index=7,\n      number=8, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='extension_value', full_name='org.eclipse.tahu.protobuf.Payload.Template.Parameter.extension_value', index=8,\n      number=9, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[_PAYLOAD_TEMPLATE_PARAMETER_PARAMETERVALUEEXTENSION, ],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  syntax='proto2',\n  extension_ranges=[],\n  oneofs=[\n    _descriptor.OneofDescriptor(\n      name='value', full_name='org.eclipse.tahu.protobuf.Payload.Template.Parameter.value',\n      index=0, containing_type=None, fields=[]),\n  ],\n  serialized_start=391,\n  serialized_end=721,\n)\n\n_PAYLOAD_TEMPLATE = _descriptor.Descriptor(\n  name='Template',\n  full_name='org.eclipse.tahu.protobuf.Payload.Template',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='version', full_name='org.eclipse.tahu.protobuf.Payload.Template.version', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, 
default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='metrics', full_name='org.eclipse.tahu.protobuf.Payload.Template.metrics', index=1,\n      number=2, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='parameters', full_name='org.eclipse.tahu.protobuf.Payload.Template.parameters', index=2,\n      number=3, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='template_ref', full_name='org.eclipse.tahu.protobuf.Payload.Template.template_ref', index=3,\n      number=4, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='is_definition', full_name='org.eclipse.tahu.protobuf.Payload.Template.is_definition', index=4,\n      number=5, type=8, cpp_type=7, label=1,\n      has_default_value=False, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[_PAYLOAD_TEMPLATE_PARAMETER, ],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=True,\n  syntax='proto2',\n  extension_ranges=[(6, 536870912), ],\n  oneofs=[\n  ],\n  serialized_start=181,\n  serialized_end=731,\n)\n\n_PAYLOAD_DATASET_DATASETVALUE_DATASETVALUEEXTENSION 
= _descriptor.Descriptor(\n  name='DataSetValueExtension',\n  full_name='org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=True,\n  syntax='proto2',\n  extension_ranges=[(1, 536870912), ],\n  oneofs=[\n  ],\n  serialized_start=1125,\n  serialized_end=1158,\n)\n\n_PAYLOAD_DATASET_DATASETVALUE = _descriptor.Descriptor(\n  name='DataSetValue',\n  full_name='org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='int_value', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.int_value', index=0,\n      number=1, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='long_value', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.long_value', index=1,\n      number=2, type=4, cpp_type=4, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='float_value', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.float_value', index=2,\n      number=3, type=2, cpp_type=6, label=1,\n      has_default_value=False, default_value=float(0),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='double_value', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.double_value', index=3,\n      
number=4, type=1, cpp_type=5, label=1,\n      has_default_value=False, default_value=float(0),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='boolean_value', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.boolean_value', index=4,\n      number=5, type=8, cpp_type=7, label=1,\n      has_default_value=False, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='string_value', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.string_value', index=5,\n      number=6, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='extension_value', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.extension_value', index=6,\n      number=7, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[_PAYLOAD_DATASET_DATASETVALUE_DATASETVALUEEXTENSION, ],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  syntax='proto2',\n  extension_ranges=[],\n  oneofs=[\n    _descriptor.OneofDescriptor(\n      name='value', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.value',\n      index=0, containing_type=None, fields=[]),\n  ],\n  serialized_start=864,\n  serialized_end=1167,\n)\n\n_PAYLOAD_DATASET_ROW = _descriptor.Descriptor(\n  name='Row',\n  
full_name='org.eclipse.tahu.protobuf.Payload.DataSet.Row',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='elements', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.Row.elements', index=0,\n      number=1, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=True,\n  syntax='proto2',\n  extension_ranges=[(2, 536870912), ],\n  oneofs=[\n  ],\n  serialized_start=1169,\n  serialized_end=1259,\n)\n\n_PAYLOAD_DATASET = _descriptor.Descriptor(\n  name='DataSet',\n  full_name='org.eclipse.tahu.protobuf.Payload.DataSet',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='num_of_columns', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.num_of_columns', index=0,\n      number=1, type=4, cpp_type=4, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='columns', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.columns', index=1,\n      number=2, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='types', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.types', index=2,\n      number=3, type=13, cpp_type=3, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, 
extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='rows', full_name='org.eclipse.tahu.protobuf.Payload.DataSet.rows', index=3,\n      number=4, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[_PAYLOAD_DATASET_DATASETVALUE, _PAYLOAD_DATASET_ROW, ],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=True,\n  syntax='proto2',\n  extension_ranges=[(5, 536870912), ],\n  oneofs=[\n  ],\n  serialized_start=734,\n  serialized_end=1269,\n)\n\n_PAYLOAD_PROPERTYVALUE_PROPERTYVALUEEXTENSION = _descriptor.Descriptor(\n  name='PropertyValueExtension',\n  full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=True,\n  syntax='proto2',\n  extension_ranges=[(1, 536870912), ],\n  oneofs=[\n  ],\n  serialized_start=1718,\n  serialized_end=1752,\n)\n\n_PAYLOAD_PROPERTYVALUE = _descriptor.Descriptor(\n  name='PropertyValue',\n  full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='type', full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.type', index=0,\n      number=1, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='is_null', full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.is_null', index=1,\n      number=2, type=8, cpp_type=7, label=1,\n      has_default_value=False, 
default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='int_value', full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.int_value', index=2,\n      number=3, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='long_value', full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.long_value', index=3,\n      number=4, type=4, cpp_type=4, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='float_value', full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.float_value', index=4,\n      number=5, type=2, cpp_type=6, label=1,\n      has_default_value=False, default_value=float(0),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='double_value', full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.double_value', index=5,\n      number=6, type=1, cpp_type=5, label=1,\n      has_default_value=False, default_value=float(0),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='boolean_value', full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.boolean_value', index=6,\n      number=7, type=8, cpp_type=7, label=1,\n      has_default_value=False, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, 
extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='string_value', full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.string_value', index=7,\n      number=8, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='propertyset_value', full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.propertyset_value', index=8,\n      number=9, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='propertysets_value', full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.propertysets_value', index=9,\n      number=10, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='extension_value', full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.extension_value', index=10,\n      number=11, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[_PAYLOAD_PROPERTYVALUE_PROPERTYVALUEEXTENSION, ],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  syntax='proto2',\n  extension_ranges=[],\n  oneofs=[\n    _descriptor.OneofDescriptor(\n      name='value', full_name='org.eclipse.tahu.protobuf.Payload.PropertyValue.value',\n      index=0, containing_type=None, 
fields=[]),\n  ],\n  serialized_start=1272,\n  serialized_end=1761,\n)\n\n_PAYLOAD_PROPERTYSET = _descriptor.Descriptor(\n  name='PropertySet',\n  full_name='org.eclipse.tahu.protobuf.Payload.PropertySet',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='keys', full_name='org.eclipse.tahu.protobuf.Payload.PropertySet.keys', index=0,\n      number=1, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='values', full_name='org.eclipse.tahu.protobuf.Payload.PropertySet.values', index=1,\n      number=2, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=True,\n  syntax='proto2',\n  extension_ranges=[(3, 536870912), ],\n  oneofs=[\n  ],\n  serialized_start=1763,\n  serialized_end=1866,\n)\n\n_PAYLOAD_PROPERTYSETLIST = _descriptor.Descriptor(\n  name='PropertySetList',\n  full_name='org.eclipse.tahu.protobuf.Payload.PropertySetList',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='propertyset', full_name='org.eclipse.tahu.protobuf.Payload.PropertySetList.propertyset', index=0,\n      number=1, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=True,\n  syntax='proto2',\n  extension_ranges=[(2, 
536870912), ],\n  oneofs=[\n  ],\n  serialized_start=1868,\n  serialized_end=1964,\n)\n\n_PAYLOAD_METADATA = _descriptor.Descriptor(\n  name='MetaData',\n  full_name='org.eclipse.tahu.protobuf.Payload.MetaData',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='is_multi_part', full_name='org.eclipse.tahu.protobuf.Payload.MetaData.is_multi_part', index=0,\n      number=1, type=8, cpp_type=7, label=1,\n      has_default_value=False, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='content_type', full_name='org.eclipse.tahu.protobuf.Payload.MetaData.content_type', index=1,\n      number=2, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='size', full_name='org.eclipse.tahu.protobuf.Payload.MetaData.size', index=2,\n      number=3, type=4, cpp_type=4, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='seq', full_name='org.eclipse.tahu.protobuf.Payload.MetaData.seq', index=3,\n      number=4, type=4, cpp_type=4, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='file_name', full_name='org.eclipse.tahu.protobuf.Payload.MetaData.file_name', index=4,\n      number=5, type=9, cpp_type=9, label=1,\n      has_default_value=False, 
default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='file_type', full_name='org.eclipse.tahu.protobuf.Payload.MetaData.file_type', index=5,\n      number=6, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='md5', full_name='org.eclipse.tahu.protobuf.Payload.MetaData.md5', index=6,\n      number=7, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='description', full_name='org.eclipse.tahu.protobuf.Payload.MetaData.description', index=7,\n      number=8, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=True,\n  syntax='proto2',\n  extension_ranges=[(9, 536870912), ],\n  oneofs=[\n  ],\n  serialized_start=1967,\n  serialized_end=2131,\n)\n\n_PAYLOAD_METRIC_METRICVALUEEXTENSION = _descriptor.Descriptor(\n  name='MetricValueExtension',\n  full_name='org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=True,\n  syntax='proto2',\n  extension_ranges=[(1, 536870912), ],\n  
oneofs=[\n  ],\n  serialized_start=2796,\n  serialized_end=2828,\n)\n\n_PAYLOAD_METRIC = _descriptor.Descriptor(\n  name='Metric',\n  full_name='org.eclipse.tahu.protobuf.Payload.Metric',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='name', full_name='org.eclipse.tahu.protobuf.Payload.Metric.name', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='alias', full_name='org.eclipse.tahu.protobuf.Payload.Metric.alias', index=1,\n      number=2, type=4, cpp_type=4, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='timestamp', full_name='org.eclipse.tahu.protobuf.Payload.Metric.timestamp', index=2,\n      number=3, type=4, cpp_type=4, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='datatype', full_name='org.eclipse.tahu.protobuf.Payload.Metric.datatype', index=3,\n      number=4, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='is_historical', full_name='org.eclipse.tahu.protobuf.Payload.Metric.is_historical', index=4,\n      number=5, type=8, cpp_type=7, label=1,\n      has_default_value=False, default_value=False,\n      message_type=None, enum_type=None, 
containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='is_transient', full_name='org.eclipse.tahu.protobuf.Payload.Metric.is_transient', index=5,\n      number=6, type=8, cpp_type=7, label=1,\n      has_default_value=False, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='is_null', full_name='org.eclipse.tahu.protobuf.Payload.Metric.is_null', index=6,\n      number=7, type=8, cpp_type=7, label=1,\n      has_default_value=False, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='metadata', full_name='org.eclipse.tahu.protobuf.Payload.Metric.metadata', index=7,\n      number=8, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='properties', full_name='org.eclipse.tahu.protobuf.Payload.Metric.properties', index=8,\n      number=9, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='int_value', full_name='org.eclipse.tahu.protobuf.Payload.Metric.int_value', index=9,\n      number=10, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='long_value', 
full_name='org.eclipse.tahu.protobuf.Payload.Metric.long_value', index=10,\n      number=11, type=4, cpp_type=4, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='float_value', full_name='org.eclipse.tahu.protobuf.Payload.Metric.float_value', index=11,\n      number=12, type=2, cpp_type=6, label=1,\n      has_default_value=False, default_value=float(0),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='double_value', full_name='org.eclipse.tahu.protobuf.Payload.Metric.double_value', index=12,\n      number=13, type=1, cpp_type=5, label=1,\n      has_default_value=False, default_value=float(0),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='boolean_value', full_name='org.eclipse.tahu.protobuf.Payload.Metric.boolean_value', index=13,\n      number=14, type=8, cpp_type=7, label=1,\n      has_default_value=False, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='string_value', full_name='org.eclipse.tahu.protobuf.Payload.Metric.string_value', index=14,\n      number=15, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bytes_value', full_name='org.eclipse.tahu.protobuf.Payload.Metric.bytes_value', index=15,\n      number=16, type=12, 
cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='dataset_value', full_name='org.eclipse.tahu.protobuf.Payload.Metric.dataset_value', index=16,\n      number=17, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='template_value', full_name='org.eclipse.tahu.protobuf.Payload.Metric.template_value', index=17,\n      number=18, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='extension_value', full_name='org.eclipse.tahu.protobuf.Payload.Metric.extension_value', index=18,\n      number=19, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[_PAYLOAD_METRIC_METRICVALUEEXTENSION, ],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  syntax='proto2',\n  extension_ranges=[],\n  oneofs=[\n    _descriptor.OneofDescriptor(\n      name='value', full_name='org.eclipse.tahu.protobuf.Payload.Metric.value',\n      index=0, containing_type=None, fields=[]),\n  ],\n  serialized_start=2134,\n  serialized_end=2837,\n)\n\n_PAYLOAD = _descriptor.Descriptor(\n  name='Payload',\n  full_name='org.eclipse.tahu.protobuf.Payload',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n     
 name='timestamp', full_name='org.eclipse.tahu.protobuf.Payload.timestamp', index=0,\n      number=1, type=4, cpp_type=4, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='metrics', full_name='org.eclipse.tahu.protobuf.Payload.metrics', index=1,\n      number=2, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='seq', full_name='org.eclipse.tahu.protobuf.Payload.seq', index=2,\n      number=3, type=4, cpp_type=4, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='uuid', full_name='org.eclipse.tahu.protobuf.Payload.uuid', index=3,\n      number=4, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\").decode('utf-8'),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='body', full_name='org.eclipse.tahu.protobuf.Payload.body', index=4,\n      number=5, type=12, cpp_type=9, label=1,\n      has_default_value=False, default_value=_b(\"\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[_PAYLOAD_TEMPLATE, _PAYLOAD_DATASET, _PAYLOAD_PROPERTYVALUE, _PAYLOAD_PROPERTYSET, _PAYLOAD_PROPERTYSETLIST, _PAYLOAD_METADATA, _PAYLOAD_METRIC, ],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=True,\n  
syntax='proto2',\n  extension_ranges=[(6, 536870912), ],\n  oneofs=[\n  ],\n  serialized_start=49,\n  serialized_end=2847,\n)\n\n_PAYLOAD_TEMPLATE_PARAMETER_PARAMETERVALUEEXTENSION.containing_type = _PAYLOAD_TEMPLATE_PARAMETER\n_PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['extension_value'].message_type = _PAYLOAD_TEMPLATE_PARAMETER_PARAMETERVALUEEXTENSION\n_PAYLOAD_TEMPLATE_PARAMETER.containing_type = _PAYLOAD_TEMPLATE\n_PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['int_value'])\n_PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['int_value'].containing_oneof = _PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value']\n_PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['long_value'])\n_PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['long_value'].containing_oneof = _PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value']\n_PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['float_value'])\n_PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['float_value'].containing_oneof = _PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value']\n_PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['double_value'])\n_PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['double_value'].containing_oneof = _PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value']\n_PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['boolean_value'])\n_PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['boolean_value'].containing_oneof = _PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value']\n_PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['string_value'])\n_PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['string_value'].containing_oneof = 
_PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value']\n_PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['extension_value'])\n_PAYLOAD_TEMPLATE_PARAMETER.fields_by_name['extension_value'].containing_oneof = _PAYLOAD_TEMPLATE_PARAMETER.oneofs_by_name['value']\n_PAYLOAD_TEMPLATE.fields_by_name['metrics'].message_type = _PAYLOAD_METRIC\n_PAYLOAD_TEMPLATE.fields_by_name['parameters'].message_type = _PAYLOAD_TEMPLATE_PARAMETER\n_PAYLOAD_TEMPLATE.containing_type = _PAYLOAD\n_PAYLOAD_DATASET_DATASETVALUE_DATASETVALUEEXTENSION.containing_type = _PAYLOAD_DATASET_DATASETVALUE\n_PAYLOAD_DATASET_DATASETVALUE.fields_by_name['extension_value'].message_type = _PAYLOAD_DATASET_DATASETVALUE_DATASETVALUEEXTENSION\n_PAYLOAD_DATASET_DATASETVALUE.containing_type = _PAYLOAD_DATASET\n_PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_DATASET_DATASETVALUE.fields_by_name['int_value'])\n_PAYLOAD_DATASET_DATASETVALUE.fields_by_name['int_value'].containing_oneof = _PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value']\n_PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_DATASET_DATASETVALUE.fields_by_name['long_value'])\n_PAYLOAD_DATASET_DATASETVALUE.fields_by_name['long_value'].containing_oneof = _PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value']\n_PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_DATASET_DATASETVALUE.fields_by_name['float_value'])\n_PAYLOAD_DATASET_DATASETVALUE.fields_by_name['float_value'].containing_oneof = _PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value']\n_PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_DATASET_DATASETVALUE.fields_by_name['double_value'])\n_PAYLOAD_DATASET_DATASETVALUE.fields_by_name['double_value'].containing_oneof = _PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value']\n_PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value'].fields.append(\n  
_PAYLOAD_DATASET_DATASETVALUE.fields_by_name['boolean_value'])\n_PAYLOAD_DATASET_DATASETVALUE.fields_by_name['boolean_value'].containing_oneof = _PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value']\n_PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_DATASET_DATASETVALUE.fields_by_name['string_value'])\n_PAYLOAD_DATASET_DATASETVALUE.fields_by_name['string_value'].containing_oneof = _PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value']\n_PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_DATASET_DATASETVALUE.fields_by_name['extension_value'])\n_PAYLOAD_DATASET_DATASETVALUE.fields_by_name['extension_value'].containing_oneof = _PAYLOAD_DATASET_DATASETVALUE.oneofs_by_name['value']\n_PAYLOAD_DATASET_ROW.fields_by_name['elements'].message_type = _PAYLOAD_DATASET_DATASETVALUE\n_PAYLOAD_DATASET_ROW.containing_type = _PAYLOAD_DATASET\n_PAYLOAD_DATASET.fields_by_name['rows'].message_type = _PAYLOAD_DATASET_ROW\n_PAYLOAD_DATASET.containing_type = _PAYLOAD\n_PAYLOAD_PROPERTYVALUE_PROPERTYVALUEEXTENSION.containing_type = _PAYLOAD_PROPERTYVALUE\n_PAYLOAD_PROPERTYVALUE.fields_by_name['propertyset_value'].message_type = _PAYLOAD_PROPERTYSET\n_PAYLOAD_PROPERTYVALUE.fields_by_name['propertysets_value'].message_type = _PAYLOAD_PROPERTYSETLIST\n_PAYLOAD_PROPERTYVALUE.fields_by_name['extension_value'].message_type = _PAYLOAD_PROPERTYVALUE_PROPERTYVALUEEXTENSION\n_PAYLOAD_PROPERTYVALUE.containing_type = _PAYLOAD\n_PAYLOAD_PROPERTYVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_PROPERTYVALUE.fields_by_name['int_value'])\n_PAYLOAD_PROPERTYVALUE.fields_by_name['int_value'].containing_oneof = _PAYLOAD_PROPERTYVALUE.oneofs_by_name['value']\n_PAYLOAD_PROPERTYVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_PROPERTYVALUE.fields_by_name['long_value'])\n_PAYLOAD_PROPERTYVALUE.fields_by_name['long_value'].containing_oneof = 
_PAYLOAD_PROPERTYVALUE.oneofs_by_name['value']\n_PAYLOAD_PROPERTYVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_PROPERTYVALUE.fields_by_name['float_value'])\n_PAYLOAD_PROPERTYVALUE.fields_by_name['float_value'].containing_oneof = _PAYLOAD_PROPERTYVALUE.oneofs_by_name['value']\n_PAYLOAD_PROPERTYVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_PROPERTYVALUE.fields_by_name['double_value'])\n_PAYLOAD_PROPERTYVALUE.fields_by_name['double_value'].containing_oneof = _PAYLOAD_PROPERTYVALUE.oneofs_by_name['value']\n_PAYLOAD_PROPERTYVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_PROPERTYVALUE.fields_by_name['boolean_value'])\n_PAYLOAD_PROPERTYVALUE.fields_by_name['boolean_value'].containing_oneof = _PAYLOAD_PROPERTYVALUE.oneofs_by_name['value']\n_PAYLOAD_PROPERTYVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_PROPERTYVALUE.fields_by_name['string_value'])\n_PAYLOAD_PROPERTYVALUE.fields_by_name['string_value'].containing_oneof = _PAYLOAD_PROPERTYVALUE.oneofs_by_name['value']\n_PAYLOAD_PROPERTYVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_PROPERTYVALUE.fields_by_name['propertyset_value'])\n_PAYLOAD_PROPERTYVALUE.fields_by_name['propertyset_value'].containing_oneof = _PAYLOAD_PROPERTYVALUE.oneofs_by_name['value']\n_PAYLOAD_PROPERTYVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_PROPERTYVALUE.fields_by_name['propertysets_value'])\n_PAYLOAD_PROPERTYVALUE.fields_by_name['propertysets_value'].containing_oneof = _PAYLOAD_PROPERTYVALUE.oneofs_by_name['value']\n_PAYLOAD_PROPERTYVALUE.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_PROPERTYVALUE.fields_by_name['extension_value'])\n_PAYLOAD_PROPERTYVALUE.fields_by_name['extension_value'].containing_oneof = _PAYLOAD_PROPERTYVALUE.oneofs_by_name['value']\n_PAYLOAD_PROPERTYSET.fields_by_name['values'].message_type = _PAYLOAD_PROPERTYVALUE\n_PAYLOAD_PROPERTYSET.containing_type = _PAYLOAD\n_PAYLOAD_PROPERTYSETLIST.fields_by_name['propertyset'].message_type = 
_PAYLOAD_PROPERTYSET\n_PAYLOAD_PROPERTYSETLIST.containing_type = _PAYLOAD\n_PAYLOAD_METADATA.containing_type = _PAYLOAD\n_PAYLOAD_METRIC_METRICVALUEEXTENSION.containing_type = _PAYLOAD_METRIC\n_PAYLOAD_METRIC.fields_by_name['metadata'].message_type = _PAYLOAD_METADATA\n_PAYLOAD_METRIC.fields_by_name['properties'].message_type = _PAYLOAD_PROPERTYSET\n_PAYLOAD_METRIC.fields_by_name['dataset_value'].message_type = _PAYLOAD_DATASET\n_PAYLOAD_METRIC.fields_by_name['template_value'].message_type = _PAYLOAD_TEMPLATE\n_PAYLOAD_METRIC.fields_by_name['extension_value'].message_type = _PAYLOAD_METRIC_METRICVALUEEXTENSION\n_PAYLOAD_METRIC.containing_type = _PAYLOAD\n_PAYLOAD_METRIC.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_METRIC.fields_by_name['int_value'])\n_PAYLOAD_METRIC.fields_by_name['int_value'].containing_oneof = _PAYLOAD_METRIC.oneofs_by_name['value']\n_PAYLOAD_METRIC.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_METRIC.fields_by_name['long_value'])\n_PAYLOAD_METRIC.fields_by_name['long_value'].containing_oneof = _PAYLOAD_METRIC.oneofs_by_name['value']\n_PAYLOAD_METRIC.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_METRIC.fields_by_name['float_value'])\n_PAYLOAD_METRIC.fields_by_name['float_value'].containing_oneof = _PAYLOAD_METRIC.oneofs_by_name['value']\n_PAYLOAD_METRIC.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_METRIC.fields_by_name['double_value'])\n_PAYLOAD_METRIC.fields_by_name['double_value'].containing_oneof = _PAYLOAD_METRIC.oneofs_by_name['value']\n_PAYLOAD_METRIC.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_METRIC.fields_by_name['boolean_value'])\n_PAYLOAD_METRIC.fields_by_name['boolean_value'].containing_oneof = _PAYLOAD_METRIC.oneofs_by_name['value']\n_PAYLOAD_METRIC.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_METRIC.fields_by_name['string_value'])\n_PAYLOAD_METRIC.fields_by_name['string_value'].containing_oneof = _PAYLOAD_METRIC.oneofs_by_name['value']\n_PAYLOAD_METRIC.oneofs_by_name['value'].fields.append(\n  
_PAYLOAD_METRIC.fields_by_name['bytes_value'])\n_PAYLOAD_METRIC.fields_by_name['bytes_value'].containing_oneof = _PAYLOAD_METRIC.oneofs_by_name['value']\n_PAYLOAD_METRIC.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_METRIC.fields_by_name['dataset_value'])\n_PAYLOAD_METRIC.fields_by_name['dataset_value'].containing_oneof = _PAYLOAD_METRIC.oneofs_by_name['value']\n_PAYLOAD_METRIC.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_METRIC.fields_by_name['template_value'])\n_PAYLOAD_METRIC.fields_by_name['template_value'].containing_oneof = _PAYLOAD_METRIC.oneofs_by_name['value']\n_PAYLOAD_METRIC.oneofs_by_name['value'].fields.append(\n  _PAYLOAD_METRIC.fields_by_name['extension_value'])\n_PAYLOAD_METRIC.fields_by_name['extension_value'].containing_oneof = _PAYLOAD_METRIC.oneofs_by_name['value']\n_PAYLOAD.fields_by_name['metrics'].message_type = _PAYLOAD_METRIC\nDESCRIPTOR.message_types_by_name['Payload'] = _PAYLOAD\n\nPayload = _reflection.GeneratedProtocolMessageType('Payload', (_message.Message,), dict(\n\n  Template = _reflection.GeneratedProtocolMessageType('Template', (_message.Message,), dict(\n\n    Parameter = _reflection.GeneratedProtocolMessageType('Parameter', (_message.Message,), dict(\n\n      ParameterValueExtension = _reflection.GeneratedProtocolMessageType('ParameterValueExtension', (_message.Message,), dict(\n        DESCRIPTOR = _PAYLOAD_TEMPLATE_PARAMETER_PARAMETERVALUEEXTENSION,\n        __module__ = 'sparkplug_b_pb2'\n        # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.Template.Parameter.ParameterValueExtension)\n        ))\n      ,\n      DESCRIPTOR = _PAYLOAD_TEMPLATE_PARAMETER,\n      __module__ = 'sparkplug_b_pb2'\n      # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.Template.Parameter)\n      ))\n    ,\n    DESCRIPTOR = _PAYLOAD_TEMPLATE,\n    __module__ = 'sparkplug_b_pb2'\n    # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.Template)\n    ))\n  ,\n\n  
DataSet = _reflection.GeneratedProtocolMessageType('DataSet', (_message.Message,), dict(\n\n    DataSetValue = _reflection.GeneratedProtocolMessageType('DataSetValue', (_message.Message,), dict(\n\n      DataSetValueExtension = _reflection.GeneratedProtocolMessageType('DataSetValueExtension', (_message.Message,), dict(\n        DESCRIPTOR = _PAYLOAD_DATASET_DATASETVALUE_DATASETVALUEEXTENSION,\n        __module__ = 'sparkplug_b_pb2'\n        # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue.DataSetValueExtension)\n        ))\n      ,\n      DESCRIPTOR = _PAYLOAD_DATASET_DATASETVALUE,\n      __module__ = 'sparkplug_b_pb2'\n      # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.DataSet.DataSetValue)\n      ))\n    ,\n\n    Row = _reflection.GeneratedProtocolMessageType('Row', (_message.Message,), dict(\n      DESCRIPTOR = _PAYLOAD_DATASET_ROW,\n      __module__ = 'sparkplug_b_pb2'\n      # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.DataSet.Row)\n      ))\n    ,\n    DESCRIPTOR = _PAYLOAD_DATASET,\n    __module__ = 'sparkplug_b_pb2'\n    # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.DataSet)\n    ))\n  ,\n\n  PropertyValue = _reflection.GeneratedProtocolMessageType('PropertyValue', (_message.Message,), dict(\n\n    PropertyValueExtension = _reflection.GeneratedProtocolMessageType('PropertyValueExtension', (_message.Message,), dict(\n      DESCRIPTOR = _PAYLOAD_PROPERTYVALUE_PROPERTYVALUEEXTENSION,\n      __module__ = 'sparkplug_b_pb2'\n      # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.PropertyValue.PropertyValueExtension)\n      ))\n    ,\n    DESCRIPTOR = _PAYLOAD_PROPERTYVALUE,\n    __module__ = 'sparkplug_b_pb2'\n    # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.PropertyValue)\n    ))\n  ,\n\n  PropertySet = _reflection.GeneratedProtocolMessageType('PropertySet', (_message.Message,), 
dict(\n    DESCRIPTOR = _PAYLOAD_PROPERTYSET,\n    __module__ = 'sparkplug_b_pb2'\n    # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.PropertySet)\n    ))\n  ,\n\n  PropertySetList = _reflection.GeneratedProtocolMessageType('PropertySetList', (_message.Message,), dict(\n    DESCRIPTOR = _PAYLOAD_PROPERTYSETLIST,\n    __module__ = 'sparkplug_b_pb2'\n    # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.PropertySetList)\n    ))\n  ,\n\n  MetaData = _reflection.GeneratedProtocolMessageType('MetaData', (_message.Message,), dict(\n    DESCRIPTOR = _PAYLOAD_METADATA,\n    __module__ = 'sparkplug_b_pb2'\n    # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.MetaData)\n    ))\n  ,\n\n  Metric = _reflection.GeneratedProtocolMessageType('Metric', (_message.Message,), dict(\n\n    MetricValueExtension = _reflection.GeneratedProtocolMessageType('MetricValueExtension', (_message.Message,), dict(\n      DESCRIPTOR = _PAYLOAD_METRIC_METRICVALUEEXTENSION,\n      __module__ = 'sparkplug_b_pb2'\n      # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.Metric.MetricValueExtension)\n      ))\n    ,\n    DESCRIPTOR = _PAYLOAD_METRIC,\n    __module__ = 'sparkplug_b_pb2'\n    # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload.Metric)\n    ))\n  ,\n  DESCRIPTOR = _PAYLOAD,\n  __module__ = 'sparkplug_b_pb2'\n  # @@protoc_insertion_point(class_scope:org.eclipse.tahu.protobuf.Payload)\n  
))\n_sym_db.RegisterMessage(Payload)\n_sym_db.RegisterMessage(Payload.Template)\n_sym_db.RegisterMessage(Payload.Template.Parameter)\n_sym_db.RegisterMessage(Payload.Template.Parameter.ParameterValueExtension)\n_sym_db.RegisterMessage(Payload.DataSet)\n_sym_db.RegisterMessage(Payload.DataSet.DataSetValue)\n_sym_db.RegisterMessage(Payload.DataSet.DataSetValue.DataSetValueExtension)\n_sym_db.RegisterMessage(Payload.DataSet.Row)\n_sym_db.RegisterMessage(Payload.PropertyValue)\n_sym_db.RegisterMessage(Payload.PropertyValue.PropertyValueExtension)\n_sym_db.RegisterMessage(Payload.PropertySet)\n_sym_db.RegisterMessage(Payload.PropertySetList)\n_sym_db.RegisterMessage(Payload.MetaData)\n_sym_db.RegisterMessage(Payload.Metric)\n_sym_db.RegisterMessage(Payload.Metric.MetricValueExtension)\n\n\nDESCRIPTOR.has_options = True\nDESCRIPTOR._options = _descriptor._ParseOptions(descriptor_pb2.FileOptions(), _b('\\n\\031org.eclipse.tahu.protobufB\\017SparkplugBProto'))\n# @@protoc_insertion_point(module_scope)\n"
  },
  {
    "path": "python/examples/THIRD-PARTY.txt",
    "content": "(1) Eclipse Paho Client\nCopyright © 2014 Eclipse Paho. All Rights Reserved.\nThe Software contains Eclipse Paho Client, which is protected under the Eclipse Public License, Version 1.0. \nYou can get the full source code for Eclipse Paho Client at: https://repo.eclipse.org/content/repositories/paho-releases. \nA copy of the Eclipse Public License, Version 1.0 is available at http://www.eclipse.org/legal/epl-v10.html\n\n(2) Google Protocol Buffers\nCopyright 2014, Google Inc. All rights reserved.\nThe Software contains Google Protocol Buffers, which is protected under a BSD style license. \nYou can get the full source code for Google Protocol Buffers at: https://github.com/google/protobuf. \nA copy of the license is available at https://github.com/google/protobuf/blob/master/LICENSE\n"
  },
  {
    "path": "python/examples/example.py",
    "content": "#!/usr/bin/python\n#/********************************************************************************\n# * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n# *\n# * This program and the accompanying materials are made available under the\n# * terms of the Eclipse Public License 2.0 which is available at\n# * http://www.eclipse.org/legal/epl-2.0.\n# *\n# * SPDX-License-Identifier: EPL-2.0\n# *\n# * Contributors:\n# *   Cirrus Link Solutions - initial implementation\n# ********************************************************************************/\nimport sys\nsys.path.insert(0, \"../core/\")\n#print(sys.path)\n\nimport paho.mqtt.client as mqtt\nimport sparkplug_b as sparkplug\nimport time\nimport random\nimport string\n\nfrom sparkplug_b import *\n\n# Application Variables\nserverUrl = \"localhost\"\nmyGroupId = \"Sparkplug B Devices\"\nmyNodeName = \"Python Edge Node 1\"\nmyDeviceName = \"Emulated Device\"\npublishPeriod = 5000\nmyUsername = \"admin\"\nmyPassword = \"changeme\"\n\nclass AliasMap:\n    Next_Server = 0\n    Rebirth = 1\n    Reboot = 2\n    Dataset = 3\n    Node_Metric0 = 4\n    Node_Metric1 = 5\n    Node_Metric2 = 6\n    Node_Metric3 = 7\n    Device_Metric0 = 8\n    Device_Metric1 = 9\n    Device_Metric2 = 10\n    Device_Metric3 = 11\n    My_Custom_Motor = 12\n\n######################################################################\n# The callback for when the client receives a CONNACK response from the server.\n######################################################################\ndef on_connect(client, userdata, flags, rc):\n    if rc == 0:\n        print(\"Connected with result code \"+str(rc))\n    else:\n        print(\"Failed to connect with result code \"+str(rc))\n        sys.exit()\n\n    global myGroupId\n    global myNodeName\n\n    # Subscribing in on_connect() means that if we lose the connection and\n    # reconnect then subscriptions will be renewed.\n    client.subscribe(\"spBv1.0/\" + myGroupId + 
\"/NCMD/\" + myNodeName + \"/#\")\n    client.subscribe(\"spBv1.0/\" + myGroupId + \"/DCMD/\" + myNodeName + \"/#\")\n######################################################################\n\n######################################################################\n# The callback for when a PUBLISH message is received from the server.\n######################################################################\ndef on_message(client, userdata, msg):\n    print(\"Message arrived: \" + msg.topic)\n    tokens = msg.topic.split(\"/\")\n\n    if tokens[0] == \"spBv1.0\" and tokens[1] == myGroupId and (tokens[2] == \"NCMD\" or tokens[2] == \"DCMD\") and tokens[3] == myNodeName:\n        inboundPayload = sparkplug_b_pb2.Payload()\n        inboundPayload.ParseFromString(msg.payload)\n        for metric in inboundPayload.metrics:\n            if metric.name == \"Node Control/Next Server\" or metric.alias == AliasMap.Next_Server:\n                # 'Node Control/Next Server' is an NCMD used to tell the device/client application to\n                # disconnect from the current MQTT server and connect to the next MQTT server in the\n                # list of available servers.  This is used for clients that have a pool of MQTT servers\n                # to connect to.\n                print( \"'Node Control/Next Server' is not implemented in this example\")\n            elif metric.name == \"Node Control/Rebirth\" or metric.alias == AliasMap.Rebirth:\n                # 'Node Control/Rebirth' is an NCMD used to tell the device/client application to resend\n                # its full NBIRTH and DBIRTH again.  MQTT Engine will send this NCMD to a device/client\n                # application if it receives an NDATA or DDATA with a metric that was not published in the\n                # original NBIRTH or DBIRTH.  
This is why the application must send all known metrics in\n                # its original NBIRTH and DBIRTH messages.\n                publishBirth()\n            elif metric.name == \"Node Control/Reboot\" or metric.alias == AliasMap.Reboot:\n                # 'Node Control/Reboot' is an NCMD used to tell a device/client application to reboot\n                # This can be used for devices that need a full application reset via a soft reboot.\n                # In this case, we fake a full reboot with a republishing of the NBIRTH and DBIRTH\n                # messages.\n                publishBirth()\n            elif metric.name == \"output/Device Metric2\" or metric.alias == AliasMap.Device_Metric2:\n                # This is a metric we declared in our DBIRTH message and we're emulating an output.\n                # So, on incoming 'writes' to the output we must publish a DDATA with the new output\n                # value.  If this were a real output we'd write to the output and then read it back\n                # before publishing a DDATA message.\n\n                # We know this is an Int16 because of how we declared it in the DBIRTH\n                newValue = metric.int_value\n                print( \"CMD message for output/Device Metric2 - New Value: {}\".format(newValue))\n\n                # Create the DDATA payload - use the alias because this isn't the DBIRTH\n                payload = sparkplug.getDdataPayload()\n                addMetric(payload, None, AliasMap.Device_Metric2, MetricDataType.Int16, newValue)\n\n                # Publish the DDATA message\n                byteArray = bytearray(payload.SerializeToString())\n                client.publish(\"spBv1.0/\" + myGroupId + \"/DDATA/\" + myNodeName + \"/\" + myDeviceName, byteArray, 0, False)\n            elif metric.name == \"output/Device Metric3\" or metric.alias == AliasMap.Device_Metric3:\n                # This is a metric we declared in our DBIRTH message and we're emulating an output.\n  
              # So, on incoming 'writes' to the output we must publish a DDATA with the new output\n                # value.  If this were a real output we'd write to the output and then read it back\n                # before publishing a DDATA message.\n\n                # We know this is a Boolean because of how we declared it in the DBIRTH\n                newValue = metric.boolean_value\n                print( \"CMD message for output/Device Metric3 - New Value: %r\" % newValue)\n\n                # Create the DDATA payload - use the alias because this isn't the DBIRTH\n                payload = sparkplug.getDdataPayload()\n                addMetric(payload, None, AliasMap.Device_Metric3, MetricDataType.Boolean, newValue)\n\n                # Publish a message data\n                byteArray = bytearray(payload.SerializeToString())\n                client.publish(\"spBv1.0/\" + myGroupId + \"/DDATA/\" + myNodeName + \"/\" + myDeviceName, byteArray, 0, False)\n            else:\n                print( \"Unknown command: \" + metric.name)\n    else:\n        print( \"Unknown command...\")\n\n    print( \"Done publishing\")\n######################################################################\n\n######################################################################\n# Publish the BIRTH certificates\n######################################################################\ndef publishBirth():\n    publishNodeBirth()\n    publishDeviceBirth()\n######################################################################\n\n######################################################################\n# Publish the NBIRTH certificate\n######################################################################\ndef publishNodeBirth():\n    print( \"Publishing Node Birth\")\n\n    # Create the node birth payload\n    payload = sparkplug.getNodeBirthPayload()\n\n    # Set up the Node Controls\n    addMetric(payload, \"Node Control/Next Server\", AliasMap.Next_Server, 
MetricDataType.Boolean, False)\n    addMetric(payload, \"Node Control/Rebirth\", AliasMap.Rebirth, MetricDataType.Boolean, False)\n    addMetric(payload, \"Node Control/Reboot\", AliasMap.Reboot, MetricDataType.Boolean, False)\n\n    # Add some regular node metrics\n    addMetric(payload, \"Node Metric0\", AliasMap.Node_Metric0, MetricDataType.String, \"hello node\")\n    addMetric(payload, \"Node Metric1\", AliasMap.Node_Metric1, MetricDataType.Boolean, True)\n    addNullMetric(payload, \"Node Metric3\", AliasMap.Node_Metric3, MetricDataType.Int32)\n\n    # Create a DataSet with two rows (0,1,2 and 3,4,5), column headers Int8s, Int16s, Int32s of types Int8, Int16, and Int32, and add it to the payload\n    columns = [\"Int8s\", \"Int16s\", \"Int32s\"]\n    types = [DataSetDataType.Int8, DataSetDataType.Int16, DataSetDataType.Int32]\n    dataset = initDatasetMetric(payload, \"DataSet\", AliasMap.Dataset, columns, types)\n    row = dataset.rows.add()\n    element = row.elements.add()\n    element.int_value = 0\n    element = row.elements.add()\n    element.int_value = 1\n    element = row.elements.add()\n    element.int_value = 2\n    row = dataset.rows.add()\n    element = row.elements.add()\n    element.int_value = 3\n    element = row.elements.add()\n    element.int_value = 4\n    element = row.elements.add()\n    element.int_value = 5\n\n    # Add a metric with a custom property\n    metric = addMetric(payload, \"Node Metric2\", AliasMap.Node_Metric2, MetricDataType.Int16, 13)\n    metric.properties.keys.extend([\"engUnit\"])\n    propertyValue = metric.properties.values.add()\n    propertyValue.type = ParameterDataType.String\n    propertyValue.string_value = \"MyCustomUnits\"\n\n    # Create the UDT definition value which includes two UDT members and a single parameter and add it to the payload\n    template = initTemplateMetric(payload, \"_types_/Custom_Motor\", None, None)    # No alias for Template definitions\n    templateParameter = template.parameters.add()\n  
  templateParameter.name = \"Index\"\n    templateParameter.type = ParameterDataType.String\n    templateParameter.string_value = \"0\"\n    addMetric(template, \"RPMs\", None, MetricDataType.Int32, 0)    # No alias in UDT members\n    addMetric(template, \"AMPs\", None, MetricDataType.Int32, 0)    # No alias in UDT members\n\n    # Publish the node birth certificate\n    byteArray = bytearray(payload.SerializeToString())\n    client.publish(\"spBv1.0/\" + myGroupId + \"/NBIRTH/\" + myNodeName, byteArray, 0, False)\n######################################################################\n\n######################################################################\n# Publish the DBIRTH certificate\n######################################################################\ndef publishDeviceBirth():\n    print( \"Publishing Device Birth\")\n\n    # Get the payload\n    payload = sparkplug.getDeviceBirthPayload()\n\n    # Add some device metrics\n    addMetric(payload, \"input/Device Metric0\", AliasMap.Device_Metric0, MetricDataType.String, \"hello device\")\n    addMetric(payload, \"input/Device Metric1\", AliasMap.Device_Metric1, MetricDataType.Boolean, True)\n    addMetric(payload, \"output/Device Metric2\", AliasMap.Device_Metric2, MetricDataType.Int16, 16)\n    addMetric(payload, \"output/Device Metric3\", AliasMap.Device_Metric3, MetricDataType.Boolean, True)\n\n    # Create the UDT definition value which includes two UDT members and a single parameter and add it to the payload\n    template = initTemplateMetric(payload, \"My_Custom_Motor\", AliasMap.My_Custom_Motor, \"Custom_Motor\")\n    templateParameter = template.parameters.add()\n    templateParameter.name = \"Index\"\n    templateParameter.type = ParameterDataType.String\n    templateParameter.string_value = \"1\"\n    addMetric(template, \"RPMs\", None, MetricDataType.Int32, 123)    # No alias in UDT members\n    addMetric(template, \"AMPs\", None, MetricDataType.Int32, 456)    # No alias in UDT members\n\n    # 
Publish the initial data with the Device BIRTH certificate\n    totalByteArray = bytearray(payload.SerializeToString())\n    client.publish(\"spBv1.0/\" + myGroupId + \"/DBIRTH/\" + myNodeName + \"/\" + myDeviceName, totalByteArray, 0, False)\n######################################################################\n\n######################################################################\n# Main Application\n######################################################################\nprint(\"Starting main application\")\n\n# Create the node death payload\ndeathPayload = sparkplug.getNodeDeathPayload()\n\n# Start of main program - Set up the MQTT client connection\nclient = mqtt.Client(serverUrl, 1883, 60)\nclient.on_connect = on_connect\nclient.on_message = on_message\nclient.username_pw_set(myUsername, myPassword)\ndeathByteArray = bytearray(deathPayload.SerializeToString())\nclient.will_set(\"spBv1.0/\" + myGroupId + \"/NDEATH/\" + myNodeName, deathByteArray, 0, False)\nclient.connect(serverUrl, 1883, 60)\n\n# Short delay to allow connect callback to occur\ntime.sleep(.1)\nclient.loop()\n\n# Publish the birth certificates\npublishBirth()\n\nwhile True:\n    # Periodically publish some new data\n    payload = sparkplug.getDdataPayload()\n\n    # Add some random data to the inputs\n    addMetric(payload, None, AliasMap.Device_Metric0, MetricDataType.String, ''.join(random.choice(string.ascii_lowercase) for i in range(12)))\n\n    # Note this data we're setting to STALE via the propertyset as an example\n    metric = addMetric(payload, None, AliasMap.Device_Metric1, MetricDataType.Boolean, random.choice([True, False]))\n    metric.properties.keys.extend([\"Quality\"])\n    propertyValue = metric.properties.values.add()\n    propertyValue.type = ParameterDataType.Int32\n    propertyValue.int_value = 500\n\n    # Publish a message data\n    byteArray = bytearray(payload.SerializeToString())\n    client.publish(\"spBv1.0/\" + myGroupId + \"/DDATA/\" + myNodeName + \"/\" + 
myDeviceName, byteArray, 0, False)\n\n    # Sit and wait for inbound or outbound events\n    for _ in range(5):\n        time.sleep(.1)\n        client.loop()\n######################################################################\n"
  },
  {
    "path": "python/examples/example_datatype.py",
    "content": "#!/usr/bin/python\n#/********************************************************************************\n# * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n# *\n# * This program and the accompanying materials are made available under the\n# * terms of the Eclipse Public License 2.0 which is available at\n# * http://www.eclipse.org/legal/epl-2.0.\n# *\n# * SPDX-License-Identifier: EPL-2.0\n# *\n# * Contributors:\n# *   Cirrus Link Solutions - initial implementation\n# ********************************************************************************/\nimport sys\nsys.path.insert(0, \"../core/\")\n#print(sys.path)\n\nimport paho.mqtt.client as mqtt\nimport sparkplug_b as sparkplug\nimport time\nimport random\nimport string\n\nfrom sparkplug_b import *\n\n# Application Variables\nserverUrl = \"localhost\"\nmyGroupId = \"Sparkplug B Devices\"\nmyNodeName = \"Python Edge Node 1\"\nmyDeviceName = \"Emulated Device\"\npublishPeriod = 5000\nmyUsername = \"admin\"\nmyPassword = \"changeme\"\n\n######################################################################\n# The callback for when the client receives a CONNACK response from the server.\n######################################################################\ndef on_connect(client, userdata, flags, rc):\n    if rc == 0:\n        print(\"Connected with result code \"+str(rc))\n    else:\n        print(\"Failed to connect with result code \"+str(rc))\n        sys.exit()\n\n    global myGroupId\n    global myNodeName\n\n    # Subscribing in on_connect() means that if we lose the connection and\n    # reconnect then subscriptions will be renewed.\n    client.subscribe(\"spBv1.0/\" + myGroupId + \"/NCMD/\" + myNodeName + \"/#\")\n    client.subscribe(\"spBv1.0/\" + myGroupId + \"/DCMD/\" + myNodeName + \"/#\")\n######################################################################\n\n######################################################################\n# The callback for when a PUBLISH message is 
received from the server.\n######################################################################\ndef on_message(client, userdata, msg):\n    print(\"Message arrived: \" + msg.topic)\n    tokens = msg.topic.split(\"/\")\n\n    if tokens[0] == \"spBv1.0\" and tokens[1] == myGroupId and (tokens[2] == \"NCMD\" or tokens[2] == \"DCMD\") and tokens[3] == myNodeName:\n        inboundPayload = sparkplug_b_pb2.Payload()\n        inboundPayload.ParseFromString(msg.payload)\n        for metric in inboundPayload.metrics:\n            if metric.name == \"Node Control/Next Server\":\n                # 'Node Control/Next Server' is an NCMD used to tell the device/client application to\n                # disconnect from the current MQTT server and connect to the next MQTT server in the\n                # list of available servers.  This is used for clients that have a pool of MQTT servers\n                # to connect to.\n                print( \"'Node Control/Next Server' is not implemented in this example\")\n            elif metric.name == \"Node Control/Rebirth\":\n                # 'Node Control/Rebirth' is an NCMD used to tell the device/client application to resend\n                # its full NBIRTH and DBIRTH again.  MQTT Engine will send this NCMD to a device/client\n                # application if it receives an NDATA or DDATA with a metric that was not published in the\n                # original NBIRTH or DBIRTH.  
This is why the application must send all known metrics in\n                # its original NBIRTH and DBIRTH messages.\n                publishBirth()\n            elif metric.name == \"Node Control/Reboot\":\n                # 'Node Control/Reboot' is an NCMD used to tell a device/client application to reboot\n                # This can be used for devices that need a full application reset via a soft reboot.\n                # In this case, we fake a full reboot with a republishing of the NBIRTH and DBIRTH\n                # messages.\n                publishBirth()\n            elif metric.name == \"output/Device Metric2\":\n                # This is a metric we declared in our DBIRTH message and we're emulating an output.\n                # So, on incoming 'writes' to the output we must publish a DDATA with the new output\n                # value.  If this were a real output we'd write to the output and then read it back\n                # before publishing a DDATA message.\n\n                # We know this is an Int16 because of how we declared it in the DBIRTH\n                newValue = metric.int_value\n                print( \"CMD message for output/Device Metric2 - New Value: {}\".format(newValue))\n\n                # Create the DDATA payload - this example doesn't use aliases, so publish by metric name\n                payload = sparkplug.getDdataPayload()\n                addMetric(payload, \"output/Device Metric2\", None, MetricDataType.Int16, newValue)\n\n                # Publish a message data\n                byteArray = bytearray(payload.SerializeToString())\n                client.publish(\"spBv1.0/\" + myGroupId + \"/DDATA/\" + myNodeName + \"/\" + myDeviceName, byteArray, 0, False)\n            elif metric.name == \"output/Device Metric3\":\n                # This is a metric we declared in our DBIRTH message and we're emulating an output.\n                # So, on incoming 'writes' to the output we must publish a DDATA with the new output\n                # value.  
If this were a real output we'd write to the output and then read it back\n                # before publishing a DDATA message.\n\n                # We know this is a Boolean because of how we declared it in the DBIRTH\n                newValue = metric.boolean_value\n                print( \"CMD message for output/Device Metric3 - New Value: %r\" % newValue)\n\n                # Create the DDATA payload - this example doesn't use aliases, so publish by metric name\n                payload = sparkplug.getDdataPayload()\n                addMetric(payload, \"output/Device Metric3\", None, MetricDataType.Boolean, newValue)\n\n                # Publish a message data\n                byteArray = bytearray(payload.SerializeToString())\n                client.publish(\"spBv1.0/\" + myGroupId + \"/DDATA/\" + myNodeName + \"/\" + myDeviceName, byteArray, 0, False)\n            else:\n                print( \"Unknown command: \" + metric.name)\n    else:\n        print( \"Unknown command...\")\n\n    print( \"Done publishing\")\n######################################################################\n\n######################################################################\n# Publish the BIRTH certificates\n######################################################################\ndef publishBirth():\n    publishNodeBirth()\n    publishDeviceBirth()\n######################################################################\n\n######################################################################\n# Publish the NBIRTH certificate\n######################################################################\ndef publishNodeBirth():\n    print( \"Publishing Node Birth\")\n\n    # Create the node birth payload\n    payload = sparkplug.getNodeBirthPayload()\n\n    # Set up the Node Controls\n    addMetric(payload, \"Node Control/Next Server\", None, MetricDataType.Boolean, False)\n    addMetric(payload, \"Node Control/Rebirth\", None, MetricDataType.Boolean, False)\n    addMetric(payload, \"Node Control/Reboot\", None, 
MetricDataType.Boolean, False)\n\n    # Publish the node birth certificate\n    byteArray = bytearray(payload.SerializeToString())\n    client.publish(\"spBv1.0/\" + myGroupId + \"/NBIRTH/\" + myNodeName, byteArray, 0, False)\n######################################################################\n\n######################################################################\n# Publish the DBIRTH certificate\n######################################################################\ndef publishDeviceBirth():\n    print( \"Publishing Device Birth\")\n\n    # Get the payload\n    payload = sparkplug.getDeviceBirthPayload()\n\n    # Add some device metrics\n    addMetric(payload, \"Int8_Min\", None, MetricDataType.Int8, -128)\n    addMetric(payload, \"Int8_Max\", None, MetricDataType.Int8, 127)\n    addMetric(payload, \"Int16_Min\", None, MetricDataType.Int16, -32768)\n    addMetric(payload, \"Int16_Max\", None, MetricDataType.Int16, 32767)\n    addMetric(payload, \"Int32_Min\", None, MetricDataType.Int32, -2147483648)\n    addMetric(payload, \"Int32_Max\", None, MetricDataType.Int32, 2147483647)\n    addMetric(payload, \"Int64_Min\", None, MetricDataType.Int64, -9223372036854775808)\n    addMetric(payload, \"Int64_Max\", None, MetricDataType.Int64, 9223372036854775807)\n\n    addMetric(payload, \"UInt8_Min\", None, MetricDataType.UInt8, 0)\n    addMetric(payload, \"UInt8_Max\", None, MetricDataType.UInt8, 255)\n    addMetric(payload, \"UInt16_Min\", None, MetricDataType.UInt16, 0)\n    addMetric(payload, \"UInt16_Max\", None, MetricDataType.UInt16, 65535)\n    addMetric(payload, \"UInt32_Min\", None, MetricDataType.UInt32, 0)\n    addMetric(payload, \"UInt32_Max\", None, MetricDataType.UInt32, 4294967295)\n    addMetric(payload, \"UInt64_Min\", None, MetricDataType.UInt64, 0)\n    addMetric(payload, \"UInt64_Max\", None, MetricDataType.UInt64, 18446744073709551615)\n\n    # Publish the initial data with the Device BIRTH certificate\n    totalByteArray = 
bytearray(payload.SerializeToString())\n    client.publish(\"spBv1.0/\" + myGroupId + \"/DBIRTH/\" + myNodeName + \"/\" + myDeviceName, totalByteArray, 0, False)\n######################################################################\n\n######################################################################\n# Main Application\n######################################################################\nprint(\"Starting main application\")\n\n# Create the node death payload\ndeathPayload = sparkplug.getNodeDeathPayload()\n\n# Start of main program - Set up the MQTT client connection\nclient = mqtt.Client(serverUrl, 1883, 60)\nclient.on_connect = on_connect\nclient.on_message = on_message\nclient.username_pw_set(myUsername, myPassword)\ndeathByteArray = bytearray(deathPayload.SerializeToString())\nclient.will_set(\"spBv1.0/\" + myGroupId + \"/NDEATH/\" + myNodeName, deathByteArray, 0, False)\nclient.connect(serverUrl, 1883, 60)\n\n# Short delay to allow connect callback to occur\ntime.sleep(.1)\nclient.loop()\n\n# Publish the birth certificates\npublishBirth()\n\nwhile True:\n    # Periodically publish some new data\n    payload = sparkplug.getDdataPayload()\n\n    # Add some random data to the inputs\n    addMetric(payload, None, None, MetricDataType.String, ''.join(random.choice(string.ascii_lowercase) for i in range(12)))\n\n    # Note this data we're setting to STALE via the propertyset as an example\n    metric = addMetric(payload, None, 102, MetricDataType.Boolean, random.choice([True, False]))\n    metric.properties.keys.extend([\"Quality\"])\n    propertyValue = metric.properties.values.add()\n    propertyValue.type = ParameterDataType.Int32\n    propertyValue.int_value = 500\n\n    # Publish a message data\n    byteArray = bytearray(payload.SerializeToString())\n    # client.publish(\"spBv1.0/\" + myGroupId + \"/DDATA/\" + myNodeName + \"/\" + myDeviceName, byteArray, 0, False)\n\n    # Sit and wait for inbound or outbound events\n    for _ in range(5):\n        
time.sleep(.1)\n        client.loop()\n######################################################################\n"
  },
  {
    "path": "python/examples/example_raspberry_pi.py",
    "content": "#!/usr/bin/python\n#/********************************************************************************\n# * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n# *\n# * This program and the accompanying materials are made available under the\n# * terms of the Eclipse Public License 2.0 which is available at\n# * http://www.eclipse.org/legal/epl-2.0.\n# *\n# * SPDX-License-Identifier: EPL-2.0\n# *\n# * Contributors:\n# *   Cirrus Link Solutions - initial implementation\n# ********************************************************************************/\nimport sys\nsys.path.insert(0, \"client_lib\")\n\nimport paho.mqtt.client as mqtt\nimport pibrella\nimport sparkplug_b as sparkplug\nimport time\nimport random\nimport subprocess\n\nfrom sparkplug_b import *\nfrom threading import Lock\n\nserverUrl = \"192.168.1.53\"\nmyGroupId = \"Sparkplug B Devices\"\nmyNodeName = \"Python Raspberry Pi\"\nmySubNodeName = \"Pibrella\"\nmyUsername = \"admin\"\nmyPassword = \"changeme\"\nlock = Lock()\n\n######################################################################\n# Button press event handler\n######################################################################\ndef button_changed(pin):\n    outboundPayload = sparkplug.getDdataPayload()\n    buttonValue = pin.read()\n    if buttonValue == 1:\n        print(\"You pressed the button!\")\n    else:\n        print(\"You released the button!\")\n    addMetric(outboundPayload, \"button\", None, MetricDataType.Boolean, buttonValue);\n    byteArray = bytearray(outboundPayload.SerializeToString())\n    client.publish(\"spBv1.0/\" + myGroupId + \"/DDATA/\" + myNodeName + \"/\" + mySubNodeName, byteArray, 0, False)\n\n######################################################################\n# Input change event handler\n######################################################################\ndef input_a_changed(pin):\n    input_changed(\"Inputs/a\", pin)\ndef input_b_changed(pin):\n    input_changed(\"Inputs/b\", 
pin)\ndef input_c_changed(pin):\n    input_changed(\"Inputs/c\", pin)\ndef input_d_changed(pin):\n    input_changed(\"Inputs/d\", pin)\ndef input_changed(name, pin):\n    lock.acquire()\n    try:\n        # Lock the block around the callback handler to prevent improper access based on debounce\n        outboundPayload = sparkplug.getDdataPayload()\n        addMetric(outboundPayload, name, None, MetricDataType.Boolean, pin.read())\n        byteArray = bytearray(outboundPayload.SerializeToString())\n        client.publish(\"spBv1.0/\" + myGroupId + \"/DDATA/\" + myNodeName + \"/\" + mySubNodeName, byteArray, 0, False)\n    finally:\n        lock.release()\n######################################################################\n\n######################################################################\n# The callback for when the client receives a CONNACK response from the server.\n######################################################################\ndef on_connect(client, userdata, flags, rc):\n    global myGroupId\n    global myNodeName\n    print(\"Connected with result code \"+str(rc))\n\n    # Subscribing in on_connect() means that if we lose the connection and\n    # reconnect then subscriptions will be renewed.\n    client.subscribe(\"spBv1.0/\" + myGroupId + \"/NCMD/\" + myNodeName + \"/#\")\n    client.subscribe(\"spBv1.0/\" + myGroupId + \"/DCMD/\" + myNodeName + \"/#\")\n######################################################################\n\n######################################################################\n# The callback for when a PUBLISH message is received from the server.\n######################################################################\ndef on_message(client, userdata, msg):\n    print(\"Message arrived: \" + msg.topic)\n    tokens = msg.topic.split(\"/\")\n\n    if tokens[0] == \"spBv1.0\" and tokens[1] == myGroupId and tokens[2] == \"DCMD\" and tokens[3] == myNodeName:\n        inboundPayload = sparkplug_b_pb2.Payload()\n        
inboundPayload.ParseFromString(msg.payload)\n        outboundPayload = sparkplug.getDdataPayload()\n\n        for metric in inboundPayload.metrics:\n            print(\"Tag Name: \" + metric.name)\n            if metric.name == \"Outputs/e\":\n                pibrella.output.e.write(metric.boolean_value)\n                addMetric(outboundPayload, \"Outputs/e\", None, MetricDataType.Boolean, pibrella.output.e.read())\n            elif metric.name == \"Outputs/f\":\n                pibrella.output.f.write(metric.boolean_value)\n                addMetric(outboundPayload, \"Outputs/f\", None, MetricDataType.Boolean, pibrella.output.f.read())\n            elif metric.name == \"Outputs/g\":\n                pibrella.output.g.write(metric.boolean_value)\n                addMetric(outboundPayload, \"Outputs/g\", None, MetricDataType.Boolean, pibrella.output.g.read())\n            elif metric.name == \"Outputs/h\":\n                pibrella.output.h.write(metric.boolean_value)\n                addMetric(outboundPayload, \"Outputs/h\", None, MetricDataType.Boolean, pibrella.output.h.read())\n            elif metric.name == \"Outputs/LEDs/green\":\n                if metric.boolean_value:\n                    pibrella.light.green.on()\n                else:\n                    pibrella.light.green.off()\n                addMetric(outboundPayload, \"Outputs/LEDs/green\", None, MetricDataType.Boolean, pibrella.light.green.read())\n            elif metric.name == \"Outputs/LEDs/red\":\n                if metric.boolean_value:\n                    pibrella.light.red.on()\n                else:\n                    pibrella.light.red.off()\n                addMetric(outboundPayload, \"Outputs/LEDs/red\", None, MetricDataType.Boolean, pibrella.light.red.read())\n            elif metric.name == \"Outputs/LEDs/yellow\":\n                if metric.boolean_value:\n                    pibrella.light.yellow.on()\n                else:\n                    pibrella.light.yellow.off()\n   
             addMetric(outboundPayload, \"Outputs/LEDs/yellow\", None, MetricDataType.Boolean, pibrella.light.yellow.read())\n            elif metric.name == \"buzzer_fail\":\n                pibrella.buzzer.fail()\n            elif metric.name == \"buzzer_success\":\n                pibrella.buzzer.success()\n\n        byteArray = bytearray(outboundPayload.SerializeToString())\n        client.publish(\"spBv1.0/\" + myGroupId + \"/DDATA/\" + myNodeName + \"/\" + mySubNodeName, byteArray, 0, False)\n    elif tokens[0] == \"spBv1.0\" and tokens[1] == myGroupId and tokens[2] == \"NCMD\" and tokens[3] == myNodeName:\n        inboundPayload = sparkplug_b_pb2.Payload()\n        inboundPayload.ParseFromString(msg.payload)\n        for metric in inboundPayload.metrics:\n            if metric.name == \"Node Control/Next Server\":\n                publishBirths()\n            if metric.name == \"Node Control/Rebirth\":\n                publishBirths()\n            if metric.name == \"Node Control/Reboot\":\n                publishBirths()\n    else:\n        print(\"Unknown command...\")\n\n    print(\"done publishing\")\n######################################################################\n\n######################################################################\n# Publish the Birth certificate\n######################################################################\ndef publishBirths():\n    print(\"Publishing Birth\")\n\n    # Create the NBIRTH payload\n    payload = sparkplug.getNodeBirthPayload()\n\n    # Add the Node Controls\n    addMetric(payload, \"Node Control/Next Server\", None, MetricDataType.Boolean, False)\n    addMetric(payload, \"Node Control/Rebirth\", None, MetricDataType.Boolean, False)\n    addMetric(payload, \"Node Control/Reboot\", None, MetricDataType.Boolean, False)\n\n    # Set up the device Parameters\n    p = subprocess.Popen('uname -a', shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)\n    for line in p.stdout.readlines():\n        
unameOutput = line\n    retVal = p.wait()\n    p = subprocess.Popen('cat /proc/cpuinfo | grep Hardware', shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)\n    for line in p.stdout.readlines():\n        hardwareOutput = line\n    retVal = p.wait()\n    p = subprocess.Popen('cat /proc/cpuinfo | grep Revision', shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)\n    for line in p.stdout.readlines():\n        revisionOutput = line\n    retVal = p.wait()\n    p = subprocess.Popen('cat /proc/cpuinfo | grep Serial', shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)\n    for line in p.stdout.readlines():\n        serialOutput = line\n    retVal = p.wait()\n    addMetric(payload, \"Parameters/sw_version\", None, MetricDataType.String, unameOutput)\n    addMetric(payload, \"Parameters/hw_version\", None, MetricDataType.String, hardwareOutput)\n    addMetric(payload, \"Parameters/hw_revision\", None, MetricDataType.String, revisionOutput)\n    addMetric(payload, \"Parameters/hw_serial\", None, MetricDataType.String, serialOutput)\n\n    # Publish the NBIRTH certificate\n    byteArray = bytearray(payload.SerializeToString())\n    client.publish(\"spBv1.0/\" + myGroupId + \"/NBIRTH/\" + myNodeName, byteArray, 0, False)\n\n    # Set up the DBIRTH with the input metrics\n    payload = sparkplug.getDeviceBirthPayload()\n\n    addMetric(payload, \"Inputs/a\", None, MetricDataType.Boolean, pibrella.input.a.read())\n    addMetric(payload, \"Inputs/b\", None, MetricDataType.Boolean, pibrella.input.b.read())\n    addMetric(payload, \"Inputs/c\", None, MetricDataType.Boolean, pibrella.input.c.read())\n    addMetric(payload, \"Inputs/d\", None, MetricDataType.Boolean, pibrella.input.d.read())\n\n    # Set up the output states on first run so Ignition and MQTT Engine are aware of them\n    addMetric(payload, \"Outputs/e\", None, MetricDataType.Boolean, pibrella.output.e.read())\n    addMetric(payload, 
\"Outputs/f\", None, MetricDataType.Boolean, pibrella.output.f.read())\n    addMetric(payload, \"Outputs/g\", None, MetricDataType.Boolean, pibrella.output.g.read())\n    addMetric(payload, \"Outputs/h\", None, MetricDataType.Boolean, pibrella.output.h.read())\n    addMetric(payload, \"Outputs/LEDs/green\", None, MetricDataType.Boolean, pibrella.light.green.read())\n    addMetric(payload, \"Outputs/LEDs/red\", None, MetricDataType.Boolean, pibrella.light.red.read())\n    addMetric(payload, \"Outputs/LEDs/yellow\", None, MetricDataType.Boolean, pibrella.light.yellow.read())\n    addMetric(payload, \"button\", None, MetricDataType.Boolean, pibrella.button.read())\n    addMetric(payload, \"buzzer_fail\", None, MetricDataType.Boolean, 0)\n    addMetric(payload, \"buzzer_success\", None, MetricDataType.Boolean, 0)\n\n    # Publish the initial data with the DBIRTH certificate\n    totalByteArray = bytearray(payload.SerializeToString())\n    client.publish(\"spBv1.0/\" + myGroupId + \"/DBIRTH/\" + myNodeName + \"/\" + mySubNodeName, totalByteArray, 0, False)\n######################################################################\n\n# Create the NDEATH payload\ndeathPayload = sparkplug.getNodeDeathPayload()\n\n# Start of main program - Set up the MQTT client connection\nclient = mqtt.Client(serverUrl, 1883, 60)\nclient.on_connect = on_connect\nclient.on_message = on_message\nclient.username_pw_set(myUsername, myPassword)\ndeathByteArray = bytearray(deathPayload.SerializeToString())\nclient.will_set(\"spBv1.0/\" + myGroupId + \"/NDEATH/\" + myNodeName, deathByteArray, 0, False)\nclient.connect(serverUrl, 1883, 60)\n\n# Short delay to allow connect callback to occur\ntime.sleep(.1)\nclient.loop()\n\npublishBirths()\n\n# Set up the button press event handler\npibrella.button.changed(button_changed)\npibrella.input.a.changed(input_a_changed)\npibrella.input.b.changed(input_b_changed)\npibrella.input.c.changed(input_c_changed)\npibrella.input.d.changed(input_d_changed)\n\n# Sit 
and wait for inbound or outbound events\nwhile True:\n    time.sleep(.1)\n    client.loop()\n\n"
  },
  {
    "path": "python/examples/example_simple.py",
    "content": "#!/usr/bin/python\n#/********************************************************************************\n# * Copyright (c) 2014, 2018 Cirrus Link Solutions and others\n# *\n# * This program and the accompanying materials are made available under the\n# * terms of the Eclipse Public License 2.0 which is available at\n# * http://www.eclipse.org/legal/epl-2.0.\n# *\n# * SPDX-License-Identifier: EPL-2.0\n# *\n# * Contributors:\n# *   Cirrus Link Solutions - initial implementation\n# ********************************************************************************/\nimport sys\nsys.path.insert(0, \"../core/\")\n#print(sys.path)\n\nimport paho.mqtt.client as mqtt\nimport sparkplug_b as sparkplug\nimport time\nimport random\nimport string\n\nfrom sparkplug_b import *\n\n# Application Variables\nserverUrl = \"localhost\"\nmyGroupId = \"Sparkplug B Devices\"\nmyNodeName = \"Python Edge Node 1\"\nmyDeviceName = \"Emulated Device\"\npublishPeriod = 5000\nmyUsername = \"admin\"\nmyPassword = \"changeme\"\n\n######################################################################\n# The callback for when the client receives a CONNACK response from the server.\n######################################################################\ndef on_connect(client, userdata, flags, rc):\n    if rc == 0:\n        print(\"Connected with result code \"+str(rc))\n    else:\n        print(\"Failed to connect with result code \"+str(rc))\n        sys.exit()\n\n    global myGroupId\n    global myNodeName\n\n    # Subscribing in on_connect() means that if we lose the connection and\n    # reconnect then subscriptions will be renewed.\n    client.subscribe(\"spBv1.0/\" + myGroupId + \"/NCMD/\" + myNodeName + \"/#\")\n    client.subscribe(\"spBv1.0/\" + myGroupId + \"/DCMD/\" + myNodeName + \"/#\")\n######################################################################\n\n######################################################################\n# The callback for when a PUBLISH message is 
received from the server.\n######################################################################\ndef on_message(client, userdata, msg):\n    print(\"Message arrived: \" + msg.topic)\n    tokens = msg.topic.split(\"/\")\n\n    if tokens[0] == \"spBv1.0\" and tokens[1] == myGroupId and (tokens[2] == \"NCMD\" or tokens[2] == \"DCMD\") and tokens[3] == myNodeName:\n        inboundPayload = sparkplug_b_pb2.Payload()\n        inboundPayload.ParseFromString(msg.payload)\n        for metric in inboundPayload.metrics:\n            if metric.name == \"Node Control/Next Server\":\n                # 'Node Control/Next Server' is an NCMD used to tell the device/client application to\n                # disconnect from the current MQTT server and connect to the next MQTT server in the\n                # list of available servers.  This is used for clients that have a pool of MQTT servers\n                # to connect to.\n                print( \"'Node Control/Next Server' is not implemented in this example\")\n            elif metric.name == \"Node Control/Rebirth\":\n                # 'Node Control/Rebirth' is an NCMD used to tell the device/client application to resend\n                # its full NBIRTH and DBIRTH again.  MQTT Engine will send this NCMD to a device/client\n                # application if it receives an NDATA or DDATA with a metric that was not published in the\n                # original NBIRTH or DBIRTH.  
This is why the application must send all known metrics in\n                # its original NBIRTH and DBIRTH messages.\n                publishBirth()\n            elif metric.name == \"Node Control/Reboot\":\n                # 'Node Control/Reboot' is an NCMD used to tell a device/client application to reboot\n                # This can be used for devices that need a full application reset via a soft reboot.\n                # In this case, we fake a full reboot with a republishing of the NBIRTH and DBIRTH\n                # messages.\n                publishBirth()\n            elif metric.name == \"output/Device Metric2\":\n                # This is a metric we declared in our DBIRTH message and we're emulating an output.\n                # So, on incoming 'writes' to the output we must publish a DDATA with the new output\n                # value.  If this were a real output we'd write to the output and then read it back\n                # before publishing a DDATA message.\n\n                # We know this is an Int16 because of how we declared it in the DBIRTH\n                newValue = metric.int_value\n                print( \"CMD message for output/Device Metric2 - New Value: {}\".format(newValue))\n\n                # Create the DDATA payload\n                payload = sparkplug.getDdataPayload()\n                addMetric(payload, None, None, MetricDataType.Int16, newValue)\n\n                # Publish the DDATA message\n                byteArray = bytearray(payload.SerializeToString())\n                client.publish(\"spBv1.0/\" + myGroupId + \"/DDATA/\" + myNodeName + \"/\" + myDeviceName, byteArray, 0, False)\n            elif metric.name == \"output/Device Metric3\":\n                # This is a metric we declared in our DBIRTH message and we're emulating an output.\n                # So, on incoming 'writes' to the output we must publish a DDATA with the new output\n                # value.  
If this were a real output we'd write to the output and then read it back\n                # before publishing a DDATA message.\n\n                # We know this is a Boolean because of how we declared it in the DBIRTH\n                newValue = metric.boolean_value\n                print( \"CMD message for output/Device Metric3 - New Value: %r\" % newValue)\n\n                # Create the DDATA payload\n                payload = sparkplug.getDdataPayload()\n                addMetric(payload, None, None, MetricDataType.Boolean, newValue)\n\n                # Publish the DDATA message\n                byteArray = bytearray(payload.SerializeToString())\n                client.publish(\"spBv1.0/\" + myGroupId + \"/DDATA/\" + myNodeName + \"/\" + myDeviceName, byteArray, 0, False)\n            else:\n                print( \"Unknown command: \" + metric.name)\n    else:\n        print( \"Unknown topic: \" + msg.topic)\n\n    print( \"Done processing message\")\n######################################################################\n\n######################################################################\n# Publish the BIRTH certificates\n######################################################################\ndef publishBirth():\n    publishNodeBirth()\n    publishDeviceBirth()\n######################################################################\n\n######################################################################\n# Publish the NBIRTH certificate\n######################################################################\ndef publishNodeBirth():\n    print( \"Publishing Node Birth\")\n\n    # Create the node birth payload\n    payload = sparkplug.getNodeBirthPayload()\n\n    # Set up the Node Controls\n    addMetric(payload, \"Node Control/Next Server\", None, MetricDataType.Boolean, False)\n    addMetric(payload, \"Node Control/Rebirth\", None, MetricDataType.Boolean, False)\n    addMetric(payload, \"Node Control/Reboot\", None, MetricDataType.Boolean, False)\n\n    # Add some 
regular node metrics\n    addMetric(payload, \"Node Metric0\", None, MetricDataType.String, \"hello node\")\n    addMetric(payload, \"Node Metric1\", None, MetricDataType.Boolean, True)\n    addNullMetric(payload, \"Node Metric3\", None, MetricDataType.Int32)\n\n    # Create a DataSet with columns Int8s, Int16s, and Int32s and two rows of values (0, 1, 2 and 3, 4, 5) and add it to the payload\n    columns = [\"Int8s\", \"Int16s\", \"Int32s\"]\n    types = [DataSetDataType.Int8, DataSetDataType.Int16, DataSetDataType.Int32]\n    dataset = initDatasetMetric(payload, \"DataSet\", None, columns, types)\n    row = dataset.rows.add()\n    element = row.elements.add()\n    element.int_value = 0\n    element = row.elements.add()\n    element.int_value = 1\n    element = row.elements.add()\n    element.int_value = 2\n    row = dataset.rows.add()\n    element = row.elements.add()\n    element.int_value = 3\n    element = row.elements.add()\n    element.int_value = 4\n    element = row.elements.add()\n    element.int_value = 5\n\n    # Add a metric with a custom property\n    metric = addMetric(payload, \"Node Metric2\", None, MetricDataType.Int16, 13)\n    metric.properties.keys.extend([\"engUnit\"])\n    propertyValue = metric.properties.values.add()\n    propertyValue.type = ParameterDataType.String\n    propertyValue.string_value = \"MyCustomUnits\"\n\n    # Create the UDT definition value which includes two UDT members and a single parameter and add it to the payload\n    template = initTemplateMetric(payload, \"_types_/Custom_Motor\", None, None)\n    templateParameter = template.parameters.add()\n    templateParameter.name = \"Index\"\n    templateParameter.type = ParameterDataType.String\n    templateParameter.string_value = \"0\"\n    addMetric(template, \"RPMs\", None, MetricDataType.Int32, 0)\n    addMetric(template, \"AMPs\", None, MetricDataType.Int32, 0)\n\n    # Publish the node birth certificate\n    byteArray = 
bytearray(payload.SerializeToString())\n    client.publish(\"spBv1.0/\" + myGroupId + \"/NBIRTH/\" + myNodeName, byteArray, 0, False)\n######################################################################\n\n######################################################################\n# Publish the DBIRTH certificate\n######################################################################\ndef publishDeviceBirth():\n    print( \"Publishing Device Birth\")\n\n    # Get the payload\n    payload = sparkplug.getDeviceBirthPayload()\n\n    # Add some device metrics\n    addMetric(payload, \"input/Device Metric0\", None, MetricDataType.String, \"hello device\")\n    addMetric(payload, \"input/Device Metric1\", None, MetricDataType.Boolean, True)\n    addMetric(payload, \"output/Device Metric2\", None, MetricDataType.Int16, 16)\n    addMetric(payload, \"output/Device Metric3\", None, MetricDataType.Boolean, True)\n    addMetric(payload, \"DateTime Metric\", None, MetricDataType.DateTime, int(time.time() * 1000))\n\n    # Create an instance of the Custom_Motor UDT with two members and a single parameter and add it to the payload\n    template = initTemplateMetric(payload, \"My_Custom_Motor\", None, \"Custom_Motor\")\n    templateParameter = template.parameters.add()\n    templateParameter.name = \"Index\"\n    templateParameter.type = ParameterDataType.String\n    templateParameter.string_value = \"1\"\n    addMetric(template, \"RPMs\", None, MetricDataType.Int32, 123)\n    addMetric(template, \"AMPs\", None, MetricDataType.Int32, 456)\n\n    # Publish the initial data with the Device BIRTH certificate\n    totalByteArray = bytearray(payload.SerializeToString())\n    client.publish(\"spBv1.0/\" + myGroupId + \"/DBIRTH/\" + myNodeName + \"/\" + myDeviceName, totalByteArray, 0, False)\n######################################################################\n\n######################################################################\n# Main 
Application\n######################################################################\nprint(\"Starting main application\")\n\n# Create the node death payload\ndeathPayload = sparkplug.getNodeDeathPayload()\n\n# Start of main program - Set up the MQTT client connection\n# Note: paho's Client() takes a client_id as its first argument (the broker\n# address and port belong to connect() below), so let paho generate an id\nclient = mqtt.Client()\nclient.on_connect = on_connect\nclient.on_message = on_message\nclient.username_pw_set(myUsername, myPassword)\ndeathByteArray = bytearray(deathPayload.SerializeToString())\nclient.will_set(\"spBv1.0/\" + myGroupId + \"/NDEATH/\" + myNodeName, deathByteArray, 0, False)\nclient.connect(serverUrl, 1883, 60)\n\n# Short delay to allow connect callback to occur\ntime.sleep(.1)\nclient.loop()\n\n# Publish the birth certificates\npublishBirth()\n\nwhile True:\n    # Periodically publish some new data\n    payload = sparkplug.getDdataPayload()\n\n    # Add some random data to the inputs\n    addMetric(payload, None, None, MetricDataType.String, ''.join(random.choice(string.ascii_lowercase) for i in range(12)))\n\n    # As an example, mark this metric's quality as STALE (500) via its property set\n    metric = addMetric(payload, None, None, MetricDataType.Boolean, random.choice([True, False]))\n    metric.properties.keys.extend([\"Quality\"])\n    propertyValue = metric.properties.values.add()\n    propertyValue.type = ParameterDataType.Int32\n    propertyValue.int_value = 500\n\n    # Publish the DDATA message\n    byteArray = bytearray(payload.SerializeToString())\n    client.publish(\"spBv1.0/\" + myGroupId + \"/DDATA/\" + myNodeName + \"/\" + myDeviceName, byteArray, 0, False)\n\n    # Sit and wait for inbound or outbound events\n    for _ in range(5):\n        time.sleep(.1)\n        client.loop()\n######################################################################\n"
  },
  {
    "path": "sparkplug_b/sparkplug_b.json",
    "content": "{\n    \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n    \"title\": \"Sparkplug B Payload\",\n    \"description\": \"A Sparkplug B payload\",\n    \"definitions\" : {\n        \"parameter\" : {\n            \"type\" : \"object\",\n            \"properties\" : {\n                \"name\" : { \"type\" : \"string\" },\n                \"type\" : { \"type\" : \"string\" },\n                \"value\" : { \"type\" : [ \"string\", \"number\", \"integer\", \"boolean\" ] }\n            },\n            \"additionalProperties\" : false\n        },\n        \"template\" : {\n            \"type\" : \"object\",\n            \"properties\" : {\n                \"version\" : { \"type\" : \"string\" },\n                \"reference\" : { \"type\" : \"string\" },\n                \"isDefinition\" : { \"type\" : \"boolean\" },\n                \"parameters\" : {\n                    \"type\" : \"array\",\n                    \"items\" : { \"$ref\" : \"#/definitions/parameter\" }\n                },\n                \"metrics\" : {\n                    \"type\" : \"array\",\n                    \"items\" : { \"$ref\" : \"#/definitions/metric\" }\n                }\n            },\n            \"additionalProperties\" : false\n        },\n        \"dataset\" : {\n            \"type\" : \"object\",\n            \"properties\" : {\n                \"numberOfColumns\" : { \"type\" : \"integer\" },\n                \"columnNames\" : { \n                    \"type\" : \"array\",\n                    \"items\" : { \"type\" : \"string\" }\n                },\n                \"types\" : { \n                    \"type\" : \"array\",\n                    \"items\" : { \"type\" : \"string\" }\n                },\n                \"rows\" : {\n                    \"type\" : \"array\",\n                    \"items\" : {\n                        \"type\" : \"array\",\n                        \"items\" : { \"type\" : [ \"string\", \"number\", \"integer\", \"boolean\" ] 
}\n                    }\n                }\n            },\n            \"additionalProperties\" : false\n        },\n\n        \"property\" : {\n            \"type\" : \"object\",\n            \"properties\" : {\n                \"type\" : { \"type\" : \"string\" },\n                \"value\" : {\n                    \"oneOf\" : [\n                        { \"type\" : [ \"string\", \"number\", \"integer\", \"boolean\", \"null\" ] },\n                        { \"$ref\" : \"#/definitions/propertySet\" },\n                        {\n                            \"type\" : \"array\",\n                            \"items\" : { \"$ref\" : \"#/definitions/propertySet\" }\n                        }\n                    ]\n                }\n            },\n            \"additionalProperties\" : false\n        },\n\n        \"propertySet\" : {\n            \"type\" : \"object\",\n            \"additionalProperties\" : { \"$ref\" : \"#/definitions/property\" }\n        },\n\n        \"metadata\" : {\n            \"type\" : \"object\",\n            \"properties\" : { \n                \"contentType\" : { \"type\" : \"string\" },\n                \"isMultiPart\" : { \"type\" : \"boolean\" },\n                \"seq\" : { \"type\" : \"integer\" },\n                \"size\" : { \"type\" : \"integer\" },\n                \"fileName\" : { \"type\" : \"string\" },\n                \"fileType\" : { \"type\" : \"string\" },\n                \"md5\" : { \"type\" : \"string\" },\n                \"description\" : { \"type\" : \"string\" }\n            }\n        },\n\n        \"metric\" : {\n            \"type\" : \"object\",\n            \"properties\" : { \n                \"name\" : { \"type\" : \"string\" },\n                \"alias\" : { \"type\" : \"integer\" },\n                \"timestamp\" : { \"type\" : \"integer\" },\n                \"datatype\" : { \"type\" : \"integer\" },\n                \"isHistorical\" : { \"type\" : \"boolean\" },\n                \"isTransient\" : { 
\"type\" : \"boolean\" },\n                \"metadata\" : { \"$ref\" : \"#/definitions/metadata\" },\n                \"properties\" : { \"$ref\" : \"#/definitions/propertySet\" },\n                \"value\" : {\n                    \"oneOf\" : [\n                        { \"type\" : [ \"string\", \"number\", \"integer\", \"boolean\", \"null\" ] },\n                        { \"$ref\" : \"#/definitions/dataset\" },\n                        { \"$ref\" : \"#/definitions/template\" }\n                    ]\n                }\n            }\n        }\n    },\n\n    \"type\": \"object\",\n    \"properties\": {\n        \"timestamp\" : { \n            \"description\" : \"A timestamp in milliseconds\",\n            \"type\" : \"integer\" \n        },\n        \"seq\" : { \n            \"description\" : \"A sequence number\",\n            \"type\" : \"integer\"\n        },\n        \"uuid\" : {\n            \"description\" : \"A unique identifier\",\n            \"type\" : \"string\"\n        },\n        \"body\" : {\n            \"description\" : \"A UTF-8 encoded string representing a byte array\",\n            \"type\" : \"string\"\n        },\n        \"metrics\" : {\n            \"description\" : \"An array of metrics\",\n            \"type\" : \"array\",\n            \"items\" : { \"$ref\" : \"#/definitions/metric\" }\n        }\n    }\n}\n"
  },
  {
    "path": "sparkplug_b/sparkplug_b.proto",
    "content": "// * Copyright (c) 2015, 2018 Cirrus Link Solutions and others\n// *\n// * This program and the accompanying materials are made available under the\n// * terms of the Eclipse Public License 2.0 which is available at\n// * http://www.eclipse.org/legal/epl-2.0.\n// *\n// * SPDX-License-Identifier: EPL-2.0\n// *\n// * Contributors:\n// *   Cirrus Link Solutions - initial implementation\n\n//\n// To compile:\n// cd client_libraries/java\n// protoc --proto_path=../../ --java_out=src/main/java ../../sparkplug_b.proto\n//\n\nsyntax = \"proto2\";\n\npackage org.eclipse.tahu.protobuf;\n\noption java_package         = \"org.eclipse.tahu.protobuf\";\noption java_outer_classname = \"SparkplugBProto\";\n\nenum DataType {\n    // Indexes of Data Types\n\n    // Unknown placeholder for future expansion.\n    Unknown         = 0;\n\n    // Basic Types\n    Int8            = 1;\n    Int16           = 2;\n    Int32           = 3;\n    Int64           = 4;\n    UInt8           = 5;\n    UInt16          = 6;\n    UInt32          = 7;\n    UInt64          = 8;\n    Float           = 9;\n    Double          = 10;\n    Boolean         = 11;\n    String          = 12;\n    DateTime        = 13;\n    Text            = 14;\n\n    // Additional Metric Types\n    UUID            = 15;\n    DataSet         = 16;\n    Bytes           = 17;\n    File            = 18;\n    Template        = 19;\n\n    // Additional PropertyValue Types\n    PropertySet     = 20;\n    PropertySetList = 21;\n\n    // Array Types\n    Int8Array = 22;\n    Int16Array = 23;\n    Int32Array = 24;\n    Int64Array = 25;\n    UInt8Array = 26;\n    UInt16Array = 27;\n    UInt32Array = 28;\n    UInt64Array = 29;\n    FloatArray = 30;\n    DoubleArray = 31;\n    BooleanArray = 32;\n    StringArray = 33;\n    DateTimeArray = 34;\n}\n\nmessage Payload {\n\n    message Template {\n\n        message Parameter {\n            optional string name        = 1;\n            optional uint32 type        = 2;\n\n          
  oneof value {\n                uint32 int_value        = 3;\n                uint64 long_value       = 4;\n                float  float_value      = 5;\n                double double_value     = 6;\n                bool   boolean_value    = 7;\n                string string_value     = 8;\n                ParameterValueExtension extension_value = 9;\n            }\n\n            message ParameterValueExtension {\n                extensions              1 to max;\n            }\n        }\n\n        optional string version         = 1;          // The version of the Template to prevent mismatches\n        repeated Metric metrics         = 2;          // Each metric includes a name, datatype, and optionally a value\n        repeated Parameter parameters   = 3;\n        optional string template_ref    = 4;          // MUST be a reference to a template definition if this is an instance (i.e. the name of the template definition) - MUST be omitted for template definitions\n        optional bool is_definition     = 5;\n        extensions                      6 to max;\n    }\n\n    message DataSet {\n\n        message DataSetValue {\n\n            oneof value {\n                uint32 int_value                        = 1;\n                uint64 long_value                       = 2;\n                float  float_value                      = 3;\n                double double_value                     = 4;\n                bool   boolean_value                    = 5;\n                string string_value                     = 6;\n                DataSetValueExtension extension_value   = 7;\n            }\n\n            message DataSetValueExtension {\n                extensions  1 to max;\n            }\n        }\n\n        message Row {\n            repeated DataSetValue elements  = 1;\n            extensions                      2 to max;   // For third party extensions\n        }\n\n        optional uint64   num_of_columns    = 1;\n        repeated string   columns     
      = 2;\n        repeated uint32   types             = 3;\n        repeated Row      rows              = 4;\n        extensions                          5 to max;   // For third party extensions\n    }\n\n    message PropertyValue {\n\n        optional uint32     type                    = 1;\n        optional bool       is_null                 = 2;\n\n        oneof value {\n            uint32          int_value               = 3;\n            uint64          long_value              = 4;\n            float           float_value             = 5;\n            double          double_value            = 6;\n            bool            boolean_value           = 7;\n            string          string_value            = 8;\n            PropertySet     propertyset_value       = 9;\n            PropertySetList propertysets_value      = 10;      // List of Property Values\n            PropertyValueExtension extension_value  = 11;\n        }\n\n        message PropertyValueExtension {\n            extensions                             1 to max;\n        }\n    }\n\n    message PropertySet {\n        repeated string        keys     = 1;         // Names of the properties\n        repeated PropertyValue values   = 2;\n        extensions                      3 to max;\n    }\n\n    message PropertySetList {\n        repeated PropertySet propertyset = 1;\n        extensions                       2 to max;\n    }\n\n    message MetaData {\n        // Bytes specific metadata\n        optional bool   is_multi_part   = 1;\n\n        // General metadata\n        optional string content_type    = 2;        // Content/Media type\n        optional uint64 size            = 3;        // File size, String size, Multi-part size, etc\n        optional uint64 seq             = 4;        // Sequence number for multi-part messages\n\n        // File metadata\n        optional string file_name       = 5;        // File name\n        optional string file_type       = 6;        // File type (i.e. 
xml, json, txt, cpp, etc)\n        optional string md5             = 7;        // md5 of data\n\n        // Catchalls and future expansion\n        optional string description     = 8;        // Could be anything such as json or xml of custom properties\n        extensions                      9 to max;\n    }\n\n    message Metric {\n\n        optional string   name          = 1;        // Metric name - should only be included on birth\n        optional uint64   alias         = 2;        // Metric alias - tied to name on birth and included in all later DATA messages\n        optional uint64   timestamp     = 3;        // Timestamp associated with data acquisition time\n        optional uint32   datatype      = 4;        // DataType of the metric/tag value\n        optional bool     is_historical = 5;        // If this is historical data and should not update real time tag\n        optional bool     is_transient  = 6;        // Tells consuming clients such as MQTT Engine to not store this as a tag\n        optional bool     is_null       = 7;        // If this is null - explicitly say so rather than using -1, false, etc for some datatypes.\n        optional MetaData metadata      = 8;        // Metadata for the payload\n        optional PropertySet properties = 9;\n\n        oneof value {\n            uint32   int_value                      = 10;\n            uint64   long_value                     = 11;\n            float    float_value                    = 12;\n            double   double_value                   = 13;\n            bool     boolean_value                  = 14;\n            string   string_value                   = 15;\n            bytes    bytes_value                    = 16;       // Bytes, File\n            DataSet  dataset_value                  = 17;\n            Template template_value                 = 18;\n            MetricValueExtension extension_value    = 19;\n        }\n\n        message MetricValueExtension {\n            extensions  
1 to max;\n        }\n    }\n\n    optional uint64   timestamp     = 1;        // Timestamp at message sending time\n    repeated Metric   metrics       = 2;        // Repeated forever - no limit in Google Protobufs\n    optional uint64   seq           = 3;        // Sequence number\n    optional string   uuid          = 4;        // UUID to track message type in terms of schema definitions\n    optional bytes    body          = 5;        // To optionally bypass the whole definition above\n    extensions                      6 to max;   // For third party extensions\n}\n"
  },
  {
    "path": "sparkplug_b/sparkplug_b_c_sharp.proto",
    "content": "// * Copyright (c) 2015, 2018 Cirrus Link Solutions and others\n// *\n// * This program and the accompanying materials are made available under the\n// * terms of the Eclipse Public License 2.0 which is available at\n// * http://www.eclipse.org/legal/epl-2.0.\n// *\n// * SPDX-License-Identifier: EPL-2.0\n// *\n// * Contributors:\n// *   Cirrus Link Solutions - initial implementation\n//\n// To compile:\n// cd client_libraries/c_sharp\n// protoc --proto_path=../../ --csharp_out=src --csharp_opt=base_namespace=Org.Eclipse.Tahu.Protobuf ../../sparkplug_b/sparkplug_b_c_sharp.proto\n//\n\nsyntax = \"proto3\";\n\nimport \"google/protobuf/any.proto\";\n\npackage org.eclipse.tahu.protobuf;\n\noption java_package         = \"org.eclipse.tahu.protobuf\";\noption java_outer_classname = \"SparkplugBProto\";\n\nenum DataType {\n    // Indexes of Data Types\n\n    // Unknown placeholder for future expansion.\n    Unknown         = 0;\n\n    // Basic Types\n    Int8            = 1;\n    Int16           = 2;\n    Int32           = 3;\n    Int64           = 4;\n    UInt8           = 5;\n    UInt16          = 6;\n    UInt32          = 7;\n    UInt64          = 8;\n    Float           = 9;\n    Double          = 10;\n    Boolean         = 11;\n    String          = 12;\n    DateTime        = 13;\n    Text            = 14;\n\n    // Additional Metric Types\n    UUID            = 15;\n    DataSet         = 16;\n    Bytes           = 17;\n    File            = 18;\n    Template        = 19;\n\n    // Additional PropertyValue Types\n    PropertySet     = 20;\n    PropertySetList = 21;\n\n    // Array Types\n    Int8Array = 22;\n    Int16Array = 23;\n    Int32Array = 24;\n    Int64Array = 25;\n    UInt8Array = 26;\n    UInt16Array = 27;\n    UInt32Array = 28;\n    UInt64Array = 29;\n    FloatArray = 30;\n    DoubleArray = 31;\n    BooleanArray = 32;\n    StringArray = 33;\n    DateTimeArray = 34;\n}\n\nmessage Payload {\n\n    message Template {\n\n        message 
Parameter {\n            string name        = 1;\n            uint32 type        = 2;\n\n            oneof value {\n                uint32 int_value        = 3;\n                uint64 long_value       = 4;\n                float  float_value      = 5;\n                double double_value     = 6;\n                bool   boolean_value    = 7;\n                string string_value     = 8;\n                ParameterValueExtension extension_value = 9;\n            }\n\n            message ParameterValueExtension {\n                repeated google.protobuf.Any extensions = 1;\n            }\n        }\n\n        string version                       = 1;          // The version of the Template to prevent mismatches\n        repeated Metric metrics              = 2;          // Each metric is the name of the metric and the datatype of the member but does not contain a value\n        repeated Parameter parameters        = 3;\n        string template_ref                  = 4;          // Reference to a template if this is extending a Template or an instance - must exist if an instance\n        bool is_definition                   = 5;\n        repeated google.protobuf.Any details = 6;\n    }\n\n    message DataSet {\n\n        message DataSetValue {\n\n            oneof value {\n                uint32 int_value                        = 1;\n                uint64 long_value                       = 2;\n                float  float_value                      = 3;\n                double double_value                     = 4;\n                bool   boolean_value                    = 5;\n                string string_value                     = 6;\n                DataSetValueExtension extension_value   = 7;\n            }\n\n            message DataSetValueExtension {\n                repeated google.protobuf.Any details = 1;\n            }\n        }\n\n        message Row {\n            repeated DataSetValue elements  = 1;\n            repeated google.protobuf.Any details = 
2;\n        }\n\n        uint64 num_of_columns                = 1;\n        repeated string   columns            = 2;\n        repeated uint32   types              = 3;\n        repeated Row      rows               = 4;\n        repeated google.protobuf.Any details = 5;\n    }\n\n    message PropertyValue {\n\n        uint32     type                   = 1;\n        bool       is_null                = 2;\n\n        oneof value {\n            uint32          int_value              = 3;\n            uint64          long_value             = 4;\n            float           float_value            = 5;\n            double          double_value           = 6;\n            bool            boolean_value          = 7;\n            string          string_value           = 8;\n            PropertySet     propertyset_value      = 9;\n            PropertySetList propertysets_value     = 10;      // List of Property Values\n            PropertyValueExtension extension_value = 11;\n        }\n\n        message PropertyValueExtension {\n            repeated google.protobuf.Any details = 1;\n        }\n    }\n\n    message PropertySet {\n        repeated string        keys     = 1;         // Names of the properties\n        repeated PropertyValue values   = 2;\n        repeated google.protobuf.Any details = 3;\n    }\n\n    message PropertySetList {\n        repeated PropertySet propertyset = 1;\n        repeated google.protobuf.Any details = 2;\n    }\n\n    message MetaData {\n        // Bytes specific metadata\n        bool   is_multi_part   = 1;\n\n        // General metadata\n        string content_type    = 2;        // Content/Media type\n        uint64 size            = 3;        // File size, String size, Multi-part size, etc\n        uint64 seq             = 4;        // Sequence number for multi-part messages\n\n        // File metadata\n        string file_name       = 5;        // File name\n        string file_type       = 6;        // File type (i.e. 
xml, json, txt, cpp, etc)\n        string md5             = 7;        // md5 of data\n\n        // Catchalls and future expansion\n        string description     = 8;        // Could be anything such as json or xml of custom properties\n        repeated google.protobuf.Any details = 9;\n    }\n\n    message Metric {\n\n        string   name          = 1;        // Metric name - should only be included on birth\n        uint64   alias         = 2;        // Metric alias - tied to name on birth and included in all later DATA messages\n        uint64   timestamp     = 3;        // Timestamp associated with data acquisition time\n        uint32   datatype      = 4;        // DataType of the metric/tag value\n        bool     is_historical = 5;        // If this is historical data and should not update real time tag\n        bool     is_transient  = 6;        // Tells consuming clients such as MQTT Engine to not store this as a tag\n        bool     is_null       = 7;        // If this is null - explicitly say so rather than using -1, false, etc for some datatypes.\n        MetaData metadata      = 8;        // Metadata for the payload\n        PropertySet properties = 9;\n\n        oneof value {\n            uint32   int_value                      = 10;\n            uint64   long_value                     = 11;\n            float    float_value                    = 12;\n            double   double_value                   = 13;\n            bool     boolean_value                  = 14;\n            string   string_value                   = 15;\n            bytes    bytes_value                    = 16;       // Bytes, File\n            DataSet  dataset_value                  = 17;\n            Template template_value                 = 18;\n            MetricValueExtension extension_value    = 19;\n        }\n\n        message MetricValueExtension {\n            repeated google.protobuf.Any details = 1;\n        }\n    }\n\n    uint64   timestamp      = 1;        // 
Timestamp at message sending time\n    repeated Metric metrics = 2;        // Repeated forever - no limit in Google Protobufs\n    uint64   seq            = 3;        // Sequence number\n    string   uuid           = 4;        // UUID to track message type in terms of schema definitions\n    bytes    body           = 5;        // To optionally bypass the whole definition above\n    repeated google.protobuf.Any details = 6;\n}\n"
  }
]