Gateway V1, add batch write

commit 04bdb5f52b
2025-08-13 18:13:09 +08:00
1377 changed files with 186064 additions and 0 deletions

Binary file not shown.


gateway/api_server.py (new file, 1489 lines)

File diff suppressed because it is too large

File diff suppressed because it is too large

@@ -0,0 +1,467 @@
([('main.exe',
'D:\\Projects\\gateway\\gateway\\build\\main\\main.exe',
'EXECUTABLE'),
('python313.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\python313.dll',
'BINARY'),
('select.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\select.pyd',
'EXTENSION'),
('_socket.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_socket.pyd',
'EXTENSION'),
('_lzma.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_lzma.pyd',
'EXTENSION'),
('_bz2.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_bz2.pyd',
'EXTENSION'),
('_decimal.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_decimal.pyd',
'EXTENSION'),
('_hashlib.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_hashlib.pyd',
'EXTENSION'),
('unicodedata.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\unicodedata.pyd',
'EXTENSION'),
('_ctypes.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_ctypes.pyd',
'EXTENSION'),
('_queue.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_queue.pyd',
'EXTENSION'),
('_overlapped.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_overlapped.pyd',
'EXTENSION'),
('_ssl.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_ssl.pyd',
'EXTENSION'),
('_asyncio.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_asyncio.pyd',
'EXTENSION'),
('_multiprocessing.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_multiprocessing.pyd',
'EXTENSION'),
('pyexpat.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\pyexpat.pyd',
'EXTENSION'),
('_wmi.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_wmi.pyd',
'EXTENSION'),
('rpds\\rpds.cp313-win_amd64.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\rpds\\rpds.cp313-win_amd64.pyd',
'EXTENSION'),
('charset_normalizer\\md__mypyc.cp313-win_amd64.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\charset_normalizer\\md__mypyc.cp313-win_amd64.pyd',
'EXTENSION'),
('charset_normalizer\\md.cp313-win_amd64.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\charset_normalizer\\md.cp313-win_amd64.pyd',
'EXTENSION'),
('_uuid.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\DLLs\\_uuid.pyd',
'EXTENSION'),
('markupsafe\\_speedups.cp313-win_amd64.pyd',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\markupsafe\\_speedups.cp313-win_amd64.pyd',
'EXTENSION'),
('api-ms-win-crt-conio-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-crt-conio-l1-1-0.dll',
'BINARY'),
('api-ms-win-crt-runtime-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-crt-runtime-l1-1-0.dll',
'BINARY'),
('api-ms-win-crt-string-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-crt-string-l1-1-0.dll',
'BINARY'),
('api-ms-win-crt-heap-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-crt-heap-l1-1-0.dll',
'BINARY'),
('api-ms-win-crt-filesystem-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-crt-filesystem-l1-1-0.dll',
'BINARY'),
('api-ms-win-crt-time-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-crt-time-l1-1-0.dll',
'BINARY'),
('api-ms-win-crt-stdio-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-crt-stdio-l1-1-0.dll',
'BINARY'),
('api-ms-win-crt-locale-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-crt-locale-l1-1-0.dll',
'BINARY'),
('api-ms-win-crt-process-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-crt-process-l1-1-0.dll',
'BINARY'),
('zlib.dll', 'D:\\ProgramFiles\\Anaconda\\envs\\ota\\zlib.dll', 'BINARY'),
('api-ms-win-crt-math-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-crt-math-l1-1-0.dll',
'BINARY'),
('api-ms-win-crt-convert-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-crt-convert-l1-1-0.dll',
'BINARY'),
('api-ms-win-crt-environment-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-crt-environment-l1-1-0.dll',
'BINARY'),
('VCRUNTIME140.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\VCRUNTIME140.dll',
'BINARY'),
('liblzma.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Library\\bin\\liblzma.dll',
'BINARY'),
('LIBBZ2.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Library\\bin\\LIBBZ2.dll',
'BINARY'),
('libmpdec-4.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Library\\bin\\libmpdec-4.dll',
'BINARY'),
('libcrypto-3-x64.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Library\\bin\\libcrypto-3-x64.dll',
'BINARY'),
('ffi.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Library\\bin\\ffi.dll',
'BINARY'),
('libssl-3-x64.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Library\\bin\\libssl-3-x64.dll',
'BINARY'),
('libexpat.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Library\\bin\\libexpat.dll',
'BINARY'),
('api-ms-win-core-synch-l1-2-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-synch-l1-2-0.dll',
'BINARY'),
('ucrtbase.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\ucrtbase.dll',
'BINARY'),
('api-ms-win-crt-utility-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Library\\bin\\api-ms-win-crt-utility-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-handle-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-handle-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-file-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-file-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-file-l1-2-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-file-l1-2-0.dll',
'BINARY'),
('api-ms-win-core-processenvironment-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-processenvironment-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-heap-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-heap-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-sysinfo-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-sysinfo-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-datetime-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-datetime-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-interlocked-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-interlocked-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-processthreads-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-processthreads-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-fibers-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-fibers-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-profile-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-profile-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-processthreads-l1-1-1.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-processthreads-l1-1-1.dll',
'BINARY'),
('api-ms-win-core-util-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-util-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-debug-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-debug-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-timezone-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-timezone-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-synch-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-synch-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-console-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-console-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-errorhandling-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-errorhandling-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-string-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-string-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-rtlsupport-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-rtlsupport-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-libraryloader-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-libraryloader-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-localization-l1-2-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-localization-l1-2-0.dll',
'BINARY'),
('api-ms-win-core-file-l2-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-file-l2-1-0.dll',
'BINARY'),
('api-ms-win-core-memory-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-memory-l1-1-0.dll',
'BINARY'),
('api-ms-win-core-namedpipe-l1-1-0.dll',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\api-ms-win-core-namedpipe-l1-1-0.dll',
'BINARY'),
('jsonschema-4.25.0.dist-info\\METADATA',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema-4.25.0.dist-info\\METADATA',
'DATA'),
('jsonschema-4.25.0.dist-info\\REQUESTED',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema-4.25.0.dist-info\\REQUESTED',
'DATA'),
('jsonschema-4.25.0.dist-info\\RECORD',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema-4.25.0.dist-info\\RECORD',
'DATA'),
('jsonschema-4.25.0.dist-info\\INSTALLER',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema-4.25.0.dist-info\\INSTALLER',
'DATA'),
('jsonschema-4.25.0.dist-info\\WHEEL',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema-4.25.0.dist-info\\WHEEL',
'DATA'),
('jsonschema-4.25.0.dist-info\\licenses\\COPYING',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema-4.25.0.dist-info\\licenses\\COPYING',
'DATA'),
('jsonschema\\benchmarks\\issue232\\issue.json',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema\\benchmarks\\issue232\\issue.json',
'DATA'),
('jsonschema-4.25.0.dist-info\\entry_points.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema-4.25.0.dist-info\\entry_points.txt',
'DATA'),
('setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\METADATA',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\METADATA',
'DATA'),
('setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\top_level.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\top_level.txt',
'DATA'),
('setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\INSTALLER',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\INSTALLER',
'DATA'),
('setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\REQUESTED',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\REQUESTED',
'DATA'),
('setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\LICENSE',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\LICENSE',
'DATA'),
('setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\RECORD',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\RECORD',
'DATA'),
('setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\WHEEL',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\setuptools\\_vendor\\importlib_metadata-8.0.0.dist-info\\WHEEL',
'DATA'),
('setuptools\\_vendor\\jaraco\\text\\Lorem ipsum.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\setuptools\\_vendor\\jaraco\\text\\Lorem '
'ipsum.txt',
'DATA'),
('certifi\\cacert.pem',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\certifi\\cacert.pem',
'DATA'),
('certifi\\py.typed',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\certifi\\py.typed',
'DATA'),
('jsonschema_specifications\\schemas\\draft201909\\vocabularies\\meta-data',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft201909\\vocabularies\\meta-data',
'DATA'),
('jsonschema_specifications\\schemas\\draft202012\\metaschema.json',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft202012\\metaschema.json',
'DATA'),
('jsonschema_specifications\\schemas\\draft7\\metaschema.json',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft7\\metaschema.json',
'DATA'),
('jsonschema_specifications\\schemas\\draft202012\\vocabularies\\validation',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft202012\\vocabularies\\validation',
'DATA'),
('jsonschema_specifications\\schemas\\draft201909\\metaschema.json',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft201909\\metaschema.json',
'DATA'),
('jsonschema_specifications\\schemas\\draft202012\\vocabularies\\content',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft202012\\vocabularies\\content',
'DATA'),
('jsonschema_specifications\\schemas\\draft202012\\vocabularies\\format',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft202012\\vocabularies\\format',
'DATA'),
('jsonschema_specifications\\schemas\\draft202012\\vocabularies\\format-assertion',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft202012\\vocabularies\\format-assertion',
'DATA'),
('jsonschema_specifications\\schemas\\draft201909\\vocabularies\\core',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft201909\\vocabularies\\core',
'DATA'),
('jsonschema_specifications\\schemas\\draft4\\metaschema.json',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft4\\metaschema.json',
'DATA'),
('jsonschema_specifications\\schemas\\draft202012\\vocabularies\\format-annotation',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft202012\\vocabularies\\format-annotation',
'DATA'),
('jsonschema_specifications\\schemas\\draft201909\\vocabularies\\validation',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft201909\\vocabularies\\validation',
'DATA'),
('jsonschema_specifications\\schemas\\draft6\\metaschema.json',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft6\\metaschema.json',
'DATA'),
('jsonschema_specifications\\schemas\\draft201909\\vocabularies\\applicator',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft201909\\vocabularies\\applicator',
'DATA'),
('jsonschema_specifications\\schemas\\draft202012\\vocabularies\\meta-data',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft202012\\vocabularies\\meta-data',
'DATA'),
('jsonschema_specifications\\schemas\\draft202012\\vocabularies\\core',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft202012\\vocabularies\\core',
'DATA'),
('jsonschema_specifications\\schemas\\draft202012\\vocabularies\\unevaluated',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft202012\\vocabularies\\unevaluated',
'DATA'),
('jsonschema_specifications\\schemas\\draft201909\\vocabularies\\content',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft201909\\vocabularies\\content',
'DATA'),
('jsonschema_specifications\\schemas\\draft202012\\vocabularies\\applicator',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft202012\\vocabularies\\applicator',
'DATA'),
('jsonschema_specifications\\schemas\\draft3\\metaschema.json',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\jsonschema_specifications\\schemas\\draft3\\metaschema.json',
'DATA'),
('click-8.2.1.dist-info\\RECORD',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\click-8.2.1.dist-info\\RECORD',
'DATA'),
('click-8.2.1.dist-info\\INSTALLER',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\click-8.2.1.dist-info\\INSTALLER',
'DATA'),
('flask-3.1.1.dist-info\\entry_points.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\flask-3.1.1.dist-info\\entry_points.txt',
'DATA'),
('flask-3.1.1.dist-info\\licenses\\LICENSE.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\flask-3.1.1.dist-info\\licenses\\LICENSE.txt',
'DATA'),
('attrs-25.3.0.dist-info\\INSTALLER',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\attrs-25.3.0.dist-info\\INSTALLER',
'DATA'),
('attrs-25.3.0.dist-info\\RECORD',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\attrs-25.3.0.dist-info\\RECORD',
'DATA'),
('click-8.2.1.dist-info\\licenses\\LICENSE.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\click-8.2.1.dist-info\\licenses\\LICENSE.txt',
'DATA'),
('attrs-25.3.0.dist-info\\WHEEL',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\attrs-25.3.0.dist-info\\WHEEL',
'DATA'),
('python_snap7-2.0.2.dist-info\\INSTALLER',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\python_snap7-2.0.2.dist-info\\INSTALLER',
'DATA'),
('werkzeug-3.1.3.dist-info\\WHEEL',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\werkzeug-3.1.3.dist-info\\WHEEL',
'DATA'),
('attrs-25.3.0.dist-info\\METADATA',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\attrs-25.3.0.dist-info\\METADATA',
'DATA'),
('itsdangerous-2.2.0.dist-info\\WHEEL',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\itsdangerous-2.2.0.dist-info\\WHEEL',
'DATA'),
('wheel-0.45.1.dist-info\\LICENSE.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\wheel-0.45.1.dist-info\\LICENSE.txt',
'DATA'),
('wheel-0.45.1.dist-info\\RECORD',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\wheel-0.45.1.dist-info\\RECORD',
'DATA'),
('werkzeug-3.1.3.dist-info\\INSTALLER',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\werkzeug-3.1.3.dist-info\\INSTALLER',
'DATA'),
('MarkupSafe-3.0.2.dist-info\\INSTALLER',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\MarkupSafe-3.0.2.dist-info\\INSTALLER',
'DATA'),
('python_snap7-2.0.2.dist-info\\entry_points.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\python_snap7-2.0.2.dist-info\\entry_points.txt',
'DATA'),
('flask-3.1.1.dist-info\\RECORD',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\flask-3.1.1.dist-info\\RECORD',
'DATA'),
('python_snap7-2.0.2.dist-info\\METADATA',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\python_snap7-2.0.2.dist-info\\METADATA',
'DATA'),
('click-8.2.1.dist-info\\METADATA',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\click-8.2.1.dist-info\\METADATA',
'DATA'),
('werkzeug-3.1.3.dist-info\\METADATA',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\werkzeug-3.1.3.dist-info\\METADATA',
'DATA'),
('flask-3.1.1.dist-info\\INSTALLER',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\flask-3.1.1.dist-info\\INSTALLER',
'DATA'),
('python_snap7-2.0.2.dist-info\\top_level.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\python_snap7-2.0.2.dist-info\\top_level.txt',
'DATA'),
('MarkupSafe-3.0.2.dist-info\\top_level.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\MarkupSafe-3.0.2.dist-info\\top_level.txt',
'DATA'),
('MarkupSafe-3.0.2.dist-info\\RECORD',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\MarkupSafe-3.0.2.dist-info\\RECORD',
'DATA'),
('werkzeug-3.1.3.dist-info\\RECORD',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\werkzeug-3.1.3.dist-info\\RECORD',
'DATA'),
('python_snap7-2.0.2.dist-info\\RECORD',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\python_snap7-2.0.2.dist-info\\RECORD',
'DATA'),
('flask-3.1.1.dist-info\\WHEEL',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\flask-3.1.1.dist-info\\WHEEL',
'DATA'),
('click-8.2.1.dist-info\\WHEEL',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\click-8.2.1.dist-info\\WHEEL',
'DATA'),
('wheel-0.45.1.dist-info\\METADATA',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\wheel-0.45.1.dist-info\\METADATA',
'DATA'),
('itsdangerous-2.2.0.dist-info\\INSTALLER',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\itsdangerous-2.2.0.dist-info\\INSTALLER',
'DATA'),
('python_snap7-2.0.2.dist-info\\WHEEL',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\python_snap7-2.0.2.dist-info\\WHEEL',
'DATA'),
('itsdangerous-2.2.0.dist-info\\METADATA',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\itsdangerous-2.2.0.dist-info\\METADATA',
'DATA'),
('wheel-0.45.1.dist-info\\WHEEL',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\wheel-0.45.1.dist-info\\WHEEL',
'DATA'),
('attrs-25.3.0.dist-info\\licenses\\LICENSE',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\attrs-25.3.0.dist-info\\licenses\\LICENSE',
'DATA'),
('flask-3.1.1.dist-info\\REQUESTED',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\flask-3.1.1.dist-info\\REQUESTED',
'DATA'),
('python_snap7-2.0.2.dist-info\\LICENSE',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\python_snap7-2.0.2.dist-info\\LICENSE',
'DATA'),
('python_snap7-2.0.2.dist-info\\REQUESTED',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\python_snap7-2.0.2.dist-info\\REQUESTED',
'DATA'),
('MarkupSafe-3.0.2.dist-info\\WHEEL',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\MarkupSafe-3.0.2.dist-info\\WHEEL',
'DATA'),
('flask-3.1.1.dist-info\\METADATA',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\flask-3.1.1.dist-info\\METADATA',
'DATA'),
('wheel-0.45.1.dist-info\\entry_points.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\wheel-0.45.1.dist-info\\entry_points.txt',
'DATA'),
('MarkupSafe-3.0.2.dist-info\\LICENSE.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\MarkupSafe-3.0.2.dist-info\\LICENSE.txt',
'DATA'),
('MarkupSafe-3.0.2.dist-info\\METADATA',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\MarkupSafe-3.0.2.dist-info\\METADATA',
'DATA'),
('itsdangerous-2.2.0.dist-info\\RECORD',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\itsdangerous-2.2.0.dist-info\\RECORD',
'DATA'),
('werkzeug-3.1.3.dist-info\\LICENSE.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\werkzeug-3.1.3.dist-info\\LICENSE.txt',
'DATA'),
('itsdangerous-2.2.0.dist-info\\LICENSE.txt',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\itsdangerous-2.2.0.dist-info\\LICENSE.txt',
'DATA'),
('base_library.zip',
'D:\\Projects\\gateway\\gateway\\build\\main\\base_library.zip',
'DATA')],)
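The dump above is a PyInstaller `*.toc` build file: a Python literal containing `(name, path, typecode)` 3-tuples, here wrapped in an outer tuple. A minimal sketch of inspecting such a file — `summarize_toc` and the shortened `sample` paths are illustrative, not part of this repository:

```python
# Hypothetical helper: summarize a PyInstaller *.toc dump like the one above.
# Assumes the text is a Python literal of (name, path, typecode) 3-tuples,
# possibly wrapped in an outer tuple, as in this commit's build output.
import ast
from collections import Counter

def summarize_toc(text: str) -> Counter:
    """Count TOC entries by typecode (EXECUTABLE, BINARY, EXTENSION, DATA, ...)."""
    parsed = ast.literal_eval(text)
    # This dump wraps the entry list in an outer 1-tuple: "([...],)".
    entries = parsed[0] if isinstance(parsed, tuple) else parsed
    return Counter(typecode for _name, _path, typecode in entries)

# Shortened, made-up paths standing in for the real TOC content.
sample = r"""([('main.exe', 'D:\\build\\main.exe', 'EXECUTABLE'),
  ('python313.dll', 'D:\\env\\python313.dll', 'BINARY'),
  ('_socket.pyd', 'D:\\env\\DLLs\\_socket.pyd', 'EXTENSION')],)"""

print(summarize_toc(sample))
```

Using `ast.literal_eval` rather than `eval` keeps the parse safe: only Python literals are accepted, which is all a TOC file contains.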

@@ -0,0 +1,78 @@
('D:\\Projects\\gateway\\gateway\\build\\main\\main.exe',
True,
False,
True,
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\bootloader\\images\\icon-console.ico',
None,
False,
False,
b'<?xml version="1.0" encoding="UTF-8" standalone="yes"?>\n<assembly xmlns='
b'"urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">\n <trustInfo x'
b'mlns="urn:schemas-microsoft-com:asm.v3">\n <security>\n <requested'
b'Privileges>\n <requestedExecutionLevel level="asInvoker" uiAccess='
b'"false"/>\n </requestedPrivileges>\n </security>\n </trustInfo>\n '
b'<compatibility xmlns="urn:schemas-microsoft-com:compatibility.v1">\n <'
b'application>\n <supportedOS Id="{e2011457-1546-43c5-a5fe-008deee3d3f'
b'0}"/>\n <supportedOS Id="{35138b9a-5d96-4fbd-8e2d-a2440225f93a}"/>\n '
b' <supportedOS Id="{4a2f28e3-53b9-4441-ba9c-d69d4a4a6e38}"/>\n <s'
b'upportedOS Id="{1f676c76-80e1-4239-95bb-83d0f6d0da78}"/>\n <supporte'
b'dOS Id="{8e0f7a12-bfb3-4fe8-b9a5-48fd50a15a9a}"/>\n </application>\n <'
b'/compatibility>\n <application xmlns="urn:schemas-microsoft-com:asm.v3">'
b'\n <windowsSettings>\n <longPathAware xmlns="http://schemas.micros'
b'oft.com/SMI/2016/WindowsSettings">true</longPathAware>\n </windowsSett'
b'ings>\n </application>\n <dependency>\n <dependentAssembly>\n <ass'
b'emblyIdentity type="win32" name="Microsoft.Windows.Common-Controls" version='
b'"6.0.0.0" processorArchitecture="*" publicKeyToken="6595b64144ccf1df" langua'
b'ge="*"/>\n </dependentAssembly>\n </dependency>\n</assembly>',
True,
False,
None,
None,
None,
'D:\\Projects\\gateway\\gateway\\build\\main\\main.pkg',
[('pyi-contents-directory _internal', '', 'OPTION'),
('PYZ-00.pyz',
'D:\\Projects\\gateway\\gateway\\build\\main\\PYZ-00.pyz',
'PYZ'),
('struct',
'D:\\Projects\\gateway\\gateway\\build\\main\\localpycs\\struct.pyc',
'PYMODULE'),
('pyimod01_archive',
'D:\\Projects\\gateway\\gateway\\build\\main\\localpycs\\pyimod01_archive.pyc',
'PYMODULE'),
('pyimod02_importers',
'D:\\Projects\\gateway\\gateway\\build\\main\\localpycs\\pyimod02_importers.pyc',
'PYMODULE'),
('pyimod03_ctypes',
'D:\\Projects\\gateway\\gateway\\build\\main\\localpycs\\pyimod03_ctypes.pyc',
'PYMODULE'),
('pyimod04_pywin32',
'D:\\Projects\\gateway\\gateway\\build\\main\\localpycs\\pyimod04_pywin32.pyc',
'PYMODULE'),
('pyiboot01_bootstrap',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\loader\\pyiboot01_bootstrap.py',
'PYSOURCE'),
('pyi_rth_inspect',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_inspect.py',
'PYSOURCE'),
('pyi_rth_setuptools',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_setuptools.py',
'PYSOURCE'),
('pyi_rth_pkgutil',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_pkgutil.py',
'PYSOURCE'),
('pyi_rth_multiprocessing',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_multiprocessing.py',
'PYSOURCE'),
('pyi_rth_pkgres',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_pkgres.py',
'PYSOURCE'),
('main', 'D:\\Projects\\gateway\\gateway\\main.py', 'PYSOURCE')],
[],
False,
False,
1753498013,
[('run.exe',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\bootloader\\Windows-64bit-intel\\run.exe',
'EXECUTABLE')],
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\python313.dll')

@@ -0,0 +1,56 @@
('D:\\Projects\\gateway\\gateway\\build\\main\\main.pkg',
{'BINARY': True,
'DATA': True,
'EXECUTABLE': True,
'EXTENSION': True,
'PYMODULE': True,
'PYSOURCE': True,
'PYZ': False,
'SPLASH': True,
'SYMLINK': False},
[('pyi-contents-directory _internal', '', 'OPTION'),
('PYZ-00.pyz',
'D:\\Projects\\gateway\\gateway\\build\\main\\PYZ-00.pyz',
'PYZ'),
('struct',
'D:\\Projects\\gateway\\gateway\\build\\main\\localpycs\\struct.pyc',
'PYMODULE'),
('pyimod01_archive',
'D:\\Projects\\gateway\\gateway\\build\\main\\localpycs\\pyimod01_archive.pyc',
'PYMODULE'),
('pyimod02_importers',
'D:\\Projects\\gateway\\gateway\\build\\main\\localpycs\\pyimod02_importers.pyc',
'PYMODULE'),
('pyimod03_ctypes',
'D:\\Projects\\gateway\\gateway\\build\\main\\localpycs\\pyimod03_ctypes.pyc',
'PYMODULE'),
('pyimod04_pywin32',
'D:\\Projects\\gateway\\gateway\\build\\main\\localpycs\\pyimod04_pywin32.pyc',
'PYMODULE'),
('pyiboot01_bootstrap',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\loader\\pyiboot01_bootstrap.py',
'PYSOURCE'),
('pyi_rth_inspect',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_inspect.py',
'PYSOURCE'),
('pyi_rth_setuptools',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_setuptools.py',
'PYSOURCE'),
('pyi_rth_pkgutil',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_pkgutil.py',
'PYSOURCE'),
('pyi_rth_multiprocessing',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_multiprocessing.py',
'PYSOURCE'),
('pyi_rth_pkgres',
'D:\\ProgramFiles\\Anaconda\\envs\\ota\\Lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_pkgres.py',
'PYSOURCE'),
('main', 'D:\\Projects\\gateway\\gateway\\main.py', 'PYSOURCE')],
'python313.dll',
True,
False,
False,
[],
None,
None,
None)

Binary file not shown.

File diff suppressed because it is too large

BIN gateway/build/main/main.exe (new file, not shown)

BIN gateway/build/main/main.pkg (new file, not shown)

@@ -0,0 +1,96 @@
This file lists modules PyInstaller was not able to find. This does not
necessarily mean this module is required for running your program. Python and
Python 3rd-party packages include a lot of conditional or optional modules. For
example the module 'ntpath' only exists on Windows, whereas the module
'posixpath' only exists on Posix systems.
Types of import:
* top-level: imported at the top-level - look at these first
* conditional: imported within an if-statement
* delayed: imported within a function
* optional: imported within a try-except-statement
IMPORTANT: Do NOT post this list to the issue-tracker. Use it as a basis for
tracking down the missing module yourself. Thanks!
missing module named pyimod02_importers - imported by D:\ProgramFiles\Anaconda\envs\ota\Lib\site-packages\PyInstaller\hooks\rthooks\pyi_rth_pkgutil.py (delayed), D:\ProgramFiles\Anaconda\envs\ota\Lib\site-packages\PyInstaller\hooks\rthooks\pyi_rth_pkgres.py (delayed)
missing module named pwd - imported by posixpath (delayed, conditional, optional), shutil (delayed, optional), tarfile (optional), pathlib._local (optional), subprocess (delayed, conditional, optional), http.server (delayed, optional), setuptools._distutils.util (delayed, conditional, optional), netrc (delayed, conditional), getpass (delayed, optional), setuptools._vendor.backports.tarfile (optional), setuptools._distutils.archive_util (optional)
missing module named grp - imported by shutil (delayed, optional), tarfile (optional), pathlib._local (optional), subprocess (delayed, conditional, optional), setuptools._vendor.backports.tarfile (optional), setuptools._distutils.archive_util (optional)
missing module named _posixsubprocess - imported by subprocess (conditional), multiprocessing.util (delayed)
missing module named fcntl - imported by subprocess (optional), _pyrepl.unix_console (top-level)
missing module named posix - imported by posixpath (optional), shutil (conditional), importlib._bootstrap_external (conditional), os (conditional, optional), _pyrepl.unix_console (delayed, optional)
missing module named resource - imported by posix (top-level)
missing module named _manylinux - imported by packaging._manylinux (delayed, optional), setuptools._vendor.packaging._manylinux (delayed, optional), wheel.vendored.packaging._manylinux (delayed, optional)
missing module named _frozen_importlib_external - imported by importlib._bootstrap (delayed), importlib (optional), importlib.abc (optional), zipimport (top-level)
missing module named typing_extensions.TypeAlias - imported by setuptools._vendor.typing_extensions (top-level), setuptools._distutils.compilers.C.base (conditional), setuptools._reqs (conditional), setuptools.warnings (conditional), setuptools._path (conditional), setuptools._distutils.dist (conditional), setuptools.config.setupcfg (conditional), setuptools.config._apply_pyprojecttoml (conditional), setuptools.dist (conditional), pkg_resources (conditional), setuptools.command.bdist_egg (conditional), setuptools.compat.py311 (conditional)
missing module named typing_extensions.Self - imported by setuptools._vendor.typing_extensions (top-level), setuptools.config.expand (conditional), setuptools.config.pyprojecttoml (conditional), setuptools.config._validate_pyproject.error_reporting (conditional), pkg_resources (conditional)
missing module named asyncio.DefaultEventLoopPolicy - imported by asyncio (delayed, conditional), asyncio.events (delayed, conditional)
missing module named _posixshmem - imported by multiprocessing.resource_tracker (conditional), multiprocessing.shared_memory (conditional)
missing module named multiprocessing.set_start_method - imported by multiprocessing (top-level), multiprocessing.spawn (top-level)
missing module named multiprocessing.get_start_method - imported by multiprocessing (top-level), multiprocessing.spawn (top-level)
missing module named multiprocessing.get_context - imported by multiprocessing (top-level), multiprocessing.pool (top-level), multiprocessing.managers (top-level), multiprocessing.sharedctypes (top-level)
missing module named multiprocessing.TimeoutError - imported by multiprocessing (top-level), multiprocessing.pool (top-level)
missing module named _scproxy - imported by urllib.request (conditional)
missing module named termios - imported by getpass (optional), tty (top-level), _pyrepl.pager (delayed, optional), werkzeug._reloader (delayed, optional), click._termui_impl (conditional), _pyrepl.unix_console (top-level), _pyrepl.fancy_termios (top-level), _pyrepl.unix_eventqueue (top-level)
missing module named multiprocessing.BufferTooShort - imported by multiprocessing (top-level), multiprocessing.connection (top-level)
missing module named multiprocessing.AuthenticationError - imported by multiprocessing (top-level), multiprocessing.connection (top-level)
missing module named multiprocessing.Value - imported by multiprocessing (top-level), werkzeug.debug (top-level)
missing module named usercustomize - imported by site (delayed, optional)
missing module named sitecustomize - imported by site (delayed, optional)
missing module named _curses - imported by curses (top-level), curses.has_key (top-level), _pyrepl.curses (optional)
missing module named readline - imported by code (delayed, conditional, optional), flask.cli (delayed, conditional, optional), rlcompleter (optional), site (delayed, optional)
missing module named trove_classifiers - imported by setuptools.config._validate_pyproject.formats (optional)
missing module named typing_extensions.Buffer - imported by setuptools._vendor.typing_extensions (top-level), wheel.wheelfile (conditional)
missing module named typing_extensions.Literal - imported by setuptools._vendor.typing_extensions (top-level), setuptools.config._validate_pyproject.formats (conditional)
missing module named typing_extensions.deprecated - imported by setuptools._vendor.typing_extensions (top-level), setuptools._distutils.sysconfig (conditional), setuptools._distutils.command.bdist (conditional)
missing module named typing_extensions.Unpack - imported by setuptools._vendor.typing_extensions (top-level), setuptools._distutils.util (conditional), setuptools._distutils.compilers.C.base (conditional), setuptools._distutils.cmd (conditional)
missing module named typing_extensions.TypeVarTuple - imported by setuptools._vendor.typing_extensions (top-level), setuptools._distutils.util (conditional), setuptools._distutils.compilers.C.base (conditional), setuptools._distutils.cmd (conditional)
missing module named '_typeshed.importlib' - imported by pkg_resources (conditional)
missing module named _typeshed - imported by werkzeug._internal (conditional), setuptools._distutils.dist (conditional), pkg_resources (conditional), setuptools.glob (conditional), setuptools.compat.py311 (conditional), click.testing (conditional)
missing module named jnius - imported by setuptools._vendor.platformdirs.android (delayed, conditional, optional)
missing module named android - imported by setuptools._vendor.platformdirs.android (delayed, conditional, optional)
missing module named importlib_resources - imported by setuptools._vendor.jaraco.text (optional), jsonschema_specifications._core (optional)
missing module named 'collections.abc' - imported by traceback (top-level), typing (top-level), inspect (top-level), logging (top-level), importlib.resources.readers (top-level), selectors (top-level), tracemalloc (top-level), http.client (top-level), werkzeug.wrappers.request (top-level), werkzeug.datastructures.accept (top-level), werkzeug.datastructures.structures (top-level), markupsafe (top-level), setuptools (top-level), setuptools._distutils.filelist (top-level), setuptools._distutils.util (top-level), setuptools._vendor.jaraco.functools (top-level), setuptools._vendor.more_itertools.more (top-level), setuptools._vendor.more_itertools.recipes (top-level), setuptools._distutils._modified (top-level), setuptools._distutils.compat (top-level), setuptools._distutils.spawn (top-level), setuptools._distutils.compilers.C.base (top-level), setuptools._distutils.fancy_getopt (top-level), setuptools._reqs (top-level), setuptools.discovery (top-level), setuptools.dist (top-level), setuptools._distutils.command.bdist (top-level), setuptools._distutils.core (top-level), setuptools._distutils.cmd (top-level), setuptools._distutils.dist (top-level), configparser (top-level), setuptools._distutils.extension (top-level), setuptools.config.setupcfg (top-level), setuptools.config.expand (top-level), setuptools.config.pyprojecttoml (top-level), setuptools.config._apply_pyprojecttoml (top-level), tomllib._parser (top-level), setuptools._vendor.tomli._parser (top-level), pkg_resources (top-level), setuptools._vendor.platformdirs.windows (conditional), setuptools.command.egg_info (top-level), setuptools._distutils.command.build (top-level), setuptools._distutils.command.sdist (top-level), setuptools.glob (top-level), setuptools.command._requirestxt (top-level), setuptools.command.bdist_wheel (top-level), wheel.cli.convert (top-level), wheel.cli.tags (top-level), setuptools._vendor.typing_extensions (top-level), werkzeug.datastructures.cache_control (top-level), 
werkzeug.datastructures.mixins (top-level), werkzeug.datastructures.auth (top-level), werkzeug.datastructures.csp (top-level), werkzeug.datastructures.etag (top-level), werkzeug.datastructures.file_storage (top-level), werkzeug.datastructures.headers (top-level), werkzeug.datastructures.range (top-level), werkzeug.middleware.shared_data (top-level), flask.app (top-level), click.core (top-level), click.types (top-level), click._compat (top-level), click._winconsole (top-level), click.exceptions (top-level), click.utils (top-level), click.shell_completion (top-level), click.formatting (top-level), click.parser (top-level), click._textwrap (top-level), click.termui (top-level), click._termui_impl (top-level), flask.cli (top-level), blinker.base (top-level), blinker._utilities (top-level), asyncio.base_events (top-level), asyncio.coroutines (top-level), flask.typing (top-level), flask.sessions (top-level), itsdangerous.serializer (top-level), itsdangerous.signer (top-level), itsdangerous.timed (top-level), click.testing (top-level), attr._compat (top-level), attr._make (top-level), referencing._core (top-level), referencing.typing (top-level), referencing.jsonschema (top-level), jsonschema._utils (top-level), jsonschema.exceptions (conditional), jsonschema._types (conditional), jsonschema.validators (top-level), jsonschema._typing (top-level), jsonschema.protocols (conditional), requests.compat (top-level), setuptools._distutils.command.build_ext (top-level), _pyrepl.types (top-level), _pyrepl.readline (top-level), setuptools._distutils.compilers.C.msvc (top-level)
excluded module named _frozen_importlib - imported by importlib (optional), importlib.abc (optional), zipimport (top-level)
missing module named vms_lib - imported by platform (delayed, optional)
missing module named 'java.lang' - imported by platform (delayed, optional)
missing module named java - imported by platform (delayed)
missing module named rpds.List - imported by rpds (top-level), referencing._core (top-level)
missing module named rpds.HashTrieSet - imported by rpds (top-level), referencing._core (top-level)
missing module named rpds.HashTrieMap - imported by rpds (top-level), referencing._core (top-level), jsonschema._types (top-level), jsonschema.validators (top-level)
missing module named annotationlib - imported by attr._compat (conditional)
missing module named simplejson - imported by requests.compat (conditional, optional)
missing module named dummy_threading - imported by requests.cookies (optional)
missing module named zstandard - imported by urllib3.util.request (optional), urllib3.response (optional)
missing module named compression - imported by urllib3.util.request (optional), urllib3.response (optional)
missing module named 'h2.events' - imported by urllib3.http2.connection (top-level)
missing module named 'h2.connection' - imported by urllib3.http2.connection (top-level)
missing module named h2 - imported by urllib3.http2.connection (top-level)
missing module named brotli - imported by urllib3.util.request (optional), urllib3.response (optional)
missing module named brotlicffi - imported by urllib3.util.request (optional), urllib3.response (optional)
missing module named socks - imported by urllib3.contrib.socks (optional)
missing module named cryptography - imported by werkzeug.serving (delayed, conditional, optional), flask.cli (delayed, conditional, optional), urllib3.contrib.pyopenssl (top-level), requests (conditional, optional)
missing module named 'OpenSSL.crypto' - imported by urllib3.contrib.pyopenssl (delayed, conditional)
missing module named 'cryptography.x509' - imported by werkzeug.serving (delayed, conditional, optional), urllib3.contrib.pyopenssl (delayed, optional)
missing module named OpenSSL - imported by urllib3.contrib.pyopenssl (top-level)
missing module named chardet - imported by requests (optional)
missing module named 'pyodide.ffi' - imported by urllib3.contrib.emscripten.fetch (delayed, optional)
missing module named pyodide - imported by urllib3.contrib.emscripten.fetch (top-level)
missing module named js - imported by urllib3.contrib.emscripten.fetch (top-level)
missing module named isoduration - imported by jsonschema._format (top-level)
missing module named uri_template - imported by jsonschema._format (top-level)
missing module named jsonpointer - imported by jsonschema._format (top-level)
missing module named webcolors - imported by jsonschema._format (top-level)
missing module named rfc3339_validator - imported by jsonschema._format (top-level)
missing module named rfc3987_syntax - imported by jsonschema._format (optional)
missing module named rfc3986_validator - imported by jsonschema._format (optional)
missing module named rfc3987 - imported by jsonschema._format (optional)
missing module named fqdn - imported by jsonschema._format (top-level)
missing module named 'watchdog.observers' - imported by werkzeug._reloader (delayed)
missing module named 'watchdog.events' - imported by werkzeug._reloader (delayed)
missing module named watchdog - imported by werkzeug._reloader (delayed)
missing module named 'cryptography.hazmat' - imported by werkzeug.serving (delayed, optional)
missing module named '_typeshed.wsgi' - imported by werkzeug.exceptions (conditional), werkzeug.http (conditional), werkzeug.wsgi (conditional), werkzeug.utils (conditional), werkzeug.wrappers.response (conditional), werkzeug.test (conditional), werkzeug.datastructures.headers (conditional), werkzeug.formparser (conditional), werkzeug.wrappers.request (conditional), werkzeug.serving (conditional), werkzeug.debug (conditional), werkzeug.middleware.shared_data (conditional), werkzeug.local (conditional), werkzeug.routing.exceptions (conditional), werkzeug.routing.map (conditional), flask.typing (conditional), flask.ctx (conditional), flask.testing (conditional), flask.cli (conditional), flask.app (conditional)
missing module named asgiref - imported by flask.app (delayed, optional)
missing module named dotenv - imported by flask.cli (delayed, optional)

File diff suppressed because it is too large

gateway/cache_manager.py Normal file

@@ -0,0 +1,657 @@
import threading
import time
import logging
from snap7.util import set_bool, set_int
class CacheManager:
"""PLC数据缓存管理器"""
def __init__(self, config, plc_manager, app=None):
"""
初始化缓存管理器
Args:
config: 配置对象
plc_manager: PLC管理器实例
app: 主应用程序引用(用于配置重载)
"""
self.plc_manager = plc_manager
self.config = config
self.app = app
self.cache = {}
self.refresh_interval = 1  # refresh once per second
self.running = False
self.lock = threading.Lock()
self.thread = None
self.last_update = {}  # per-area last update time
self.plc_last_connected = {}  # per-PLC last connected time
self.plc_connection_status = {}  # PLC connection status
self.logger = logging.getLogger("CacheManager")
self.init_cache()
def init_cache(self):
"""初始化缓存结构"""
for plc in self.config["plcs"]:
plc_name = plc["name"]
self.cache[plc_name] = {}
self.last_update[plc_name] = {}
self.plc_last_connected[plc_name] = 0  # 0 means never connected
self.plc_connection_status[plc_name] = "never_connected"
for area in plc["areas"]:
name = area["name"]
# Ensure the initial status is disconnected
self.cache[plc_name][name] = {
"data": bytearray(area["size"]),
"db_number": area["db_number"],
"offset": area["offset"],
"size": area["size"],
"type": area["type"],
"structure": area.get("structure", []),
"status": "disconnected" # 初始状态为断开
}
self.last_update[plc_name][name] = 0
def refresh_cache(self):
"""后台线程:定期刷新缓存"""
while self.running:
try:
for plc in self.config["plcs"]:
plc_name = plc["name"]
client = self.plc_manager.get_plc(plc_name)
# Check the PLC connection status
plc_connected = client.connected
# Update the PLC connection status
with self.lock:
if plc_connected:
self.plc_last_connected[plc_name] = time.time()
self.plc_connection_status[plc_name] = "connected"
else:
if self.plc_last_connected[plc_name] == 0:
self.plc_connection_status[plc_name] = "never_connected"
else:
self.plc_connection_status[plc_name] = "disconnected"
# Refresh all readable areas
for area in plc["areas"]:
if area["type"] in ["read", "read_write"]:
name = area["name"]
try:
data = client.read_db(area["db_number"], area["offset"], area["size"])
# Update the area status based on the PLC connection state and the read result
with self.lock:
if plc_connected and data and len(data) == area["size"]:
self.cache[plc_name][name]["data"] = bytearray(data)
self.cache[plc_name][name]["status"] = "connected"
self.last_update[plc_name][name] = time.time()
else:
self.cache[plc_name][name]["status"] = self.plc_connection_status[plc_name]
# If the area had data before, keep the stale data but mark the status
if self.last_update[plc_name][name] > 0:
self.logger.info(f"PLC {plc_name} area {name} disconnected but keeping last valid data")
except Exception as e:
with self.lock:
self.cache[plc_name][name]["status"] = self.plc_connection_status[plc_name]
self.logger.warning(f"Error updating status for {plc_name}/{name}: {e}")
time.sleep(self.refresh_interval)
except Exception as e:
self.logger.error(f"Error in refresh_cache: {e}")
time.sleep(self.refresh_interval)
def start(self):
"""启动缓存刷新线程"""
if self.running:
return
self.running = True
self.thread = threading.Thread(
target=self.refresh_cache,
name="CacheRefreshThread",
daemon=True
)
self.thread.start()
self.logger.info("Cache manager started")
def stop(self):
"""停止缓存刷新线程"""
if not self.running:
return
self.running = False
if self.thread:
# Wait for the thread to exit, with a timeout to avoid blocking forever
self.thread.join(timeout=2.0)
if self.thread.is_alive():
self.logger.warning("Cache refresh thread did not terminate gracefully")
self.thread = None
self.logger.info("Cache manager stopped")
def get_plc_connection_status(self, plc_name):
"""获取PLC连接状态"""
with self.lock:
return self.plc_connection_status.get(plc_name, "unknown")
def get_last_update_time(self, plc_name, area_name):
"""获取区域数据最后更新时间"""
with self.lock:
return self.last_update.get(plc_name, {}).get(area_name, 0)
def get_summary(self):
"""获取缓存摘要信息"""
summary = {}
with self.lock:
for plc_name, areas in self.cache.items():
summary[plc_name] = {}
for area_name, area in areas.items():
last_update = self.last_update[plc_name][area_name]
plc_status = self.plc_connection_status.get(plc_name, "unknown")
# The area status follows the PLC connection status unless valid data exists
area_status = area["status"]
if plc_status == "never_connected":
area_status = "never_connected"
elif plc_status == "disconnected" and self.last_update[plc_name][area_name] == 0:
area_status = "disconnected"
summary[plc_name][area_name] = {
"status": area_status,
"plc_connection_status": plc_status,
"last_update": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(last_update)) if last_update > 0 else "Never",
"size": area["size"],
"type": area["type"]
}
return summary
def get_area_status(self, plc_name, area_name):
"""获取区域状态"""
with self.lock:
area = self.cache.get(plc_name, {}).get(area_name)
if not area:
return {"status": "not_found", "message": "PLC or area not found"}
plc_status = self.plc_connection_status.get(plc_name, "unknown")
last_update = self.last_update.get(plc_name, {}).get(area_name, 0)
# The area status follows the PLC connection status unless valid data exists
area_status = area["status"]
if plc_status == "never_connected":
area_status = "never_connected"
elif plc_status == "disconnected" and last_update == 0:
area_status = "disconnected"
return {
"status": area_status,
"plc_connection_status": plc_status,
"last_update": last_update,
"last_update_formatted": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(last_update)) if last_update > 0 else "Never",
"size": area["size"],
"type": area["type"]
}
def read_area(self, plc_name, area_name, offset, length):
"""单个区域读取"""
with self.lock:
area = self.cache.get(plc_name, {}).get(area_name)
if not area:
return None, "Area not found", "unknown", 0
if offset + length > area["size"]:
return None, "Offset out of bounds", "unknown", 0
client = self.plc_manager.get_plc(plc_name)
plc_status = self.plc_connection_status.get(plc_name, "unknown")
# If the PLC is not connected, return an error immediately
if plc_status != "connected":
return None, f"PLC not connected (status: {plc_status})", plc_status, 0
try:
data = client.read_db(area["db_number"], area["offset"] + offset, length)
# Validate the returned data
if data and len(data) == length:
# Update this slice of the cache
for i in range(length):
area["data"][offset + i] = data[i]
update_time = time.time()
self.last_update[plc_name][area_name] = update_time
area["status"] = "connected"
return data, None, plc_status, update_time
else:
area["status"] = plc_status
return None, "Invalid data returned", plc_status, 0
except Exception as e:
area["status"] = plc_status
self.logger.error(f"Read failed for {plc_name}/{area_name}: {e}")
return None, f"Read failed: {str(e)}", plc_status, 0
def read_area_bool(self, plc_name, area_name, offset, length):
"""单个区域读取"""
with self.lock:
area = self.cache.get(plc_name, {}).get(area_name)
if not area:
return None, "Area not found", "unknown", 0
if offset + length > area["size"]:
return None, "Offset out of bounds", "unknown", 0
client = self.plc_manager.get_plc(plc_name)
plc_status = self.plc_connection_status.get(plc_name, "unknown")
# If the PLC is not connected, return an error immediately
if plc_status != "connected":
return None, f"PLC not connected (status: {plc_status})", plc_status, 0
try:
data = client.read_db_bool(area["db_number"], area["offset"] + offset, length)
# Validate the returned data
if all(isinstance(val, bool) for val in data.values()):
# Update the cached bytes, keyed by bit index
for i, val in data.items():
area["data"][offset + i] = val  # assumes offset + i stays within bounds
update_time = time.time()
self.last_update[plc_name][area_name] = update_time
area["status"] = "connected"
return data, None, plc_status, update_time
else:
area["status"] = plc_status
return None, "Invalid data returned", plc_status, 0
except Exception as e:
area["status"] = plc_status
self.logger.error(f"Read failed for {plc_name}/{area_name}: {e}")
return None, f"Read failed: {str(e)}", plc_status, 0
def write_area(self, plc_name, area_name, offset, data):
"""单个区域写入"""
with self.lock:
area = self.cache.get(plc_name, {}).get(area_name)
if not area:
return False, "Area not found", "unknown", 0
if area["type"] not in ["write", "read_write"]:
plc_status = self.plc_connection_status.get(plc_name, "unknown")
return False, "Area is read-only", plc_status, 0
if offset + len(data) > area["size"]:
plc_status = self.plc_connection_status.get(plc_name, "unknown")
return False, "Offset out of bounds", plc_status, 0
client = self.plc_manager.get_plc(plc_name)
plc_status = self.plc_connection_status.get(plc_name, "unknown")
# If the PLC is not connected, return an error immediately
if plc_status != "connected":
return False, f"PLC not connected (status: {plc_status})", plc_status, 0
try:
self.logger.debug(f"write_area payload: {data!r}")
success = client.write_db(area["db_number"], area["offset"] + offset, data)
if success:
# Update this slice of the cache
for i in range(len(data)):
area["data"][offset + i] = data[i]
update_time = time.time()
self.last_update[plc_name][area_name] = update_time
area["status"] = "connected (last write)"
return True, None, plc_status, update_time
else:
area["status"] = plc_status
return False, "Write failed", plc_status, 0
except Exception as e:
area["status"] = plc_status
self.logger.error(f"Write failed for {plc_name}/{area_name}: {e}")
return False, f"Write failed: {str(e)}", plc_status, 0
def batch_write_area(self, plc_name, area_name, offset, data):
"""单个区域写入"""
with self.lock:
area = self.cache.get(plc_name, {}).get(area_name)
if not area:
return False, "Area not found", "unknown", 0
if area["type"] not in ["write", "read_write"]:
plc_status = self.plc_connection_status.get(plc_name, "unknown")
return False, "Area is read-only", plc_status, 0
if offset + len(data) > area["size"]:
plc_status = self.plc_connection_status.get(plc_name, "unknown")
return False, "Offset out of bounds", plc_status, 0
client = self.plc_manager.get_plc(plc_name)
plc_status = self.plc_connection_status.get(plc_name, "unknown")
# If the PLC is not connected, return an error immediately
if plc_status != "connected":
return False, f"PLC not connected (status: {plc_status})", plc_status, 0
try:
update_time = 0
for i, byte_value in enumerate(data):
# Each input value is packed as a 2-byte INT and written at offset + i * 2
current_offset = offset + (i * 2)
packed = bytearray(2)
set_int(packed, 0, byte_value)
self.logger.debug(f"batch_write_area: db={area['db_number']} offset={current_offset} data={packed!r}")
success = client.batch_write_db(area["db_number"], current_offset, packed)
if success:
# Update this slice of the cache
for j in range(len(packed)):
area["data"][current_offset + j] = packed[j]
update_time = time.time()
self.last_update[plc_name][area_name] = update_time
area["status"] = "connected (last write)"
else:
area["status"] = plc_status
return False, "Write failed", plc_status, 0
return True, None, plc_status, update_time
except Exception as e:
area["status"] = plc_status
self.logger.error(f"Write failed for {plc_name}/{area_name}: {e}")
return False, f"Write failed: {str(e)}", plc_status, 0
def batch_write_bool_area(self, plc_name, area_name, offset, data):
"""单个区域写入"""
with self.lock:
area = self.cache.get(plc_name, {}).get(area_name)
if not area:
return False, "Area not found", "unknown", 0
if area["type"] not in ["write", "read_write"]:
plc_status = self.plc_connection_status.get(plc_name, "unknown")
return False, "Area is read-only", plc_status, 0
if offset + len(data) > area["size"]:
plc_status = self.plc_connection_status.get(plc_name, "unknown")
return False, "Offset out of bounds", plc_status, 0
client = self.plc_manager.get_plc(plc_name)
plc_status = self.plc_connection_status.get(plc_name, "unknown")
# If the PLC is not connected, return an error immediately
if plc_status != "connected":
return False, f"PLC not connected (status: {plc_status})", plc_status, 0
try:
# Pack the booleans into the minimum number of bytes, LSB first
packed = bytearray((len(data) + 7) // 8)
for bit, bit_value in enumerate(data):
set_bool(packed, bit // 8, bit % 8, bit_value)
self.logger.debug(f"batch_write_bool_area: db={area['db_number']} offset={offset} data={packed!r}")
success = client.batch_write_db_bool(area["db_number"], offset, packed)
if success:
# Update this slice of the cache
for j in range(len(packed)):
area["data"][offset + j] = packed[j]
update_time = time.time()
self.last_update[plc_name][area_name] = update_time
area["status"] = "connected (last write)"
else:
area["status"] = plc_status
return False, "Write failed", plc_status, 0
return True, None, plc_status, update_time
except Exception as e:
area["status"] = plc_status
self.logger.error(f"Write failed for {plc_name}/{area_name}: {e}")
return False, f"Write failed: {str(e)}", plc_status, 0
def write_area_bool(self, plc_name, area_name, offset, data):
"""单个区域写入"""
with self.lock:
area = self.cache.get(plc_name, {}).get(area_name)
if not area:
return False, "Area not found", "unknown", 0
if area["type"] not in ["write", "read_write"]:
plc_status = self.plc_connection_status.get(plc_name, "unknown")
return False, "Area is read-only", plc_status, 0
if offset + len(data) > area["size"]:
plc_status = self.plc_connection_status.get(plc_name, "unknown")
return False, "Offset out of bounds", plc_status, 0
client = self.plc_manager.get_plc(plc_name)
plc_status = self.plc_connection_status.get(plc_name, "unknown")
# If the PLC is not connected, return an error immediately
if plc_status != "connected":
return False, f"PLC not connected (status: {plc_status})", plc_status, 0
try:
success = client.write_db_bool(area["db_number"], area["offset"] + offset, data)
if success:
# Update this slice of the cache
for i in range(len(data)):
area["data"][offset + i] = data[i]
update_time = time.time()
self.last_update[plc_name][area_name] = update_time
area["status"] = "connected (last write)"
return True, None, plc_status, update_time
else:
area["status"] = plc_status
return False, "Write failed", plc_status, 0
except Exception as e:
area["status"] = plc_status
self.logger.error(f"Write failed for {plc_name}/{area_name}: {e}")
return False, f"Write failed: {str(e)}", plc_status, 0
def batch_read(self, requests):
"""批量读取"""
results = []
for req in requests:
plc_name = req["plc_name"]
area_name = req["area_name"]
offset = req.get("offset", 0)
length = req.get("length", None)
# Get the PLC connection status
plc_status = self.plc_connection_status.get(plc_name, "unknown")
# If the PLC is not connected, return an error immediately
if plc_status != "connected":
results.append({
"plc_name": plc_name,
"area_name": area_name,
"status": "error",
"plc_connection_status": plc_status,
"last_update": 0,
"last_update_formatted": "N/A",
"message": f"PLC not connected (status: {plc_status})"
})
continue
area = self.cache.get(plc_name, {}).get(area_name)
if not area:
results.append({
"plc_name": plc_name,
"area_name": area_name,
"status": "error",
"plc_connection_status": plc_status,
"last_update": 0,
"last_update_formatted": "N/A",
"message": "Area not found"
})
continue
# If length is not specified, read the whole area
if length is None:
length = area["size"] - offset
data, error, _, update_time = self.read_area(plc_name, area_name, offset, length)
if error:
results.append({
"plc_name": plc_name,
"area_name": area_name,
"status": "error",
"plc_connection_status": plc_status,
"last_update": update_time,
"last_update_formatted": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(update_time)) if update_time > 0 else "Never",
"message": error
})
else:
results.append({
"plc_name": plc_name,
"area_name": area_name,
"status": "success",
"plc_connection_status": plc_status,
"last_update": update_time,
"last_update_formatted": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(update_time)),
"offset": offset,
"length": length,
"data": list(data)
})
return results
def batch_write(self, requests):
"""批量写入"""
results = []
for req in requests:
plc_name = req["plc_name"]
area_name = req["area_name"]
offset = req["offset"]
data = bytes(req["data"])
# Get the PLC connection status
plc_status = self.plc_connection_status.get(plc_name, "unknown")
# If the PLC is not connected, return an error immediately
if plc_status != "connected":
results.append({
"plc_name": plc_name,
"area_name": area_name,
"status": "error",
"plc_connection_status": plc_status,
"last_update": 0,
"last_update_formatted": "N/A",
"message": f"PLC not connected (status: {plc_status})",
"offset": offset
})
continue
success, error, _, update_time = self.batch_write_area(plc_name, area_name, offset, data)
if error:
results.append({
"plc_name": plc_name,
"area_name": area_name,
"status": "error",
"plc_connection_status": plc_status,
"last_update": update_time,
"last_update_formatted": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(update_time)) if update_time > 0 else "Never",
"message": error,
"offset": offset
})
else:
results.append({
"plc_name": plc_name,
"area_name": area_name,
"status": "success",
"plc_connection_status": plc_status,
"last_update": update_time,
"last_update_formatted": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(update_time)),
"offset": offset,
"length": len(data)
})
return results
def batch_write_bool(self, requests):
"""批量写入"""
results = []
for req in requests:
plc_name = req["plc_name"]
area_name = req["area_name"]
offset = req["offset"]
data = bytes(req["data"])
# Get the PLC connection status
plc_status = self.plc_connection_status.get(plc_name, "unknown")
# If the PLC is not connected, return an error immediately
if plc_status != "connected":
results.append({
"plc_name": plc_name,
"area_name": area_name,
"status": "error",
"plc_connection_status": plc_status,
"last_update": 0,
"last_update_formatted": "N/A",
"message": f"PLC not connected (status: {plc_status})",
"offset": offset
})
continue
success, error, _, update_time = self.batch_write_bool_area(plc_name, area_name, offset, data)
if error:
results.append({
"plc_name": plc_name,
"area_name": area_name,
"status": "error",
"plc_connection_status": plc_status,
"last_update": update_time,
"last_update_formatted": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(update_time)) if update_time > 0 else "Never",
"message": error,
"offset": offset
})
else:
results.append({
"plc_name": plc_name,
"area_name": area_name,
"status": "success",
"plc_connection_status": plc_status,
"last_update": update_time,
"last_update_formatted": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(update_time)),
"offset": offset,
"length": len(data)
})
return results
def get_parsed_data(self, plc_name, area_name):
"""获取解析后的数据"""
from data_parser import parse_data
with self.lock:
area = self.cache.get(plc_name, {}).get(area_name)
if not area:
return {"error": "Area not found"}
plc_status = self.plc_connection_status.get(plc_name, "unknown")
last_update = self.last_update.get(plc_name, {}).get(area_name, 0)
# The area status follows the PLC connection status unless valid data exists
area_status = area["status"]
if plc_status == "never_connected":
area_status = "never_connected"
elif plc_status == "disconnected" and last_update == 0:
area_status = "disconnected"
structure = area.get("structure", [])
if structure:
parsed = parse_data(area["data"], structure)
parsed["status"] = area_status
parsed["plc_connection_status"] = plc_status
parsed["last_update"] = last_update
parsed["last_update_formatted"] = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(last_update)) if last_update > 0 else "Never"
return parsed
else:
return {
"raw_data": list(area["data"]),
"status": area_status,
"plc_connection_status": plc_status,
"last_update": last_update,
"last_update_formatted": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(last_update)) if last_update > 0 else "Never"
}
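`batch_write_bool_area` relies on `snap7.util.set_bool` to pack a list of booleans into a byte buffer before issuing a single DB write. A stdlib-only sketch of that packing (assuming snap7's LSB-first bit layout; the helper names here are illustrative, not part of the gateway):

```python
def pack_bools(bits):
    """Pack a list of booleans into bytes, bit 0 of byte 0 first
    (the layout set_bool(buf, i // 8, i % 8, v) produces)."""
    packed = bytearray((len(bits) + 7) // 8)
    for i, bit in enumerate(bits):
        if bit:
            packed[i // 8] |= 1 << (i % 8)
    return bytes(packed)

def unpack_bools(data, count):
    """Inverse of pack_bools: recover `count` booleans from packed bytes."""
    return [bool(data[i // 8] >> (i % 8) & 1) for i in range(count)]

flags = [True, False, True, True, False, False, False, False, True]
packed = pack_bools(flags)  # bits 0, 2, 3 and 8 set -> b'\x0d\x01'
assert unpack_bools(packed, len(flags)) == flags
```

Nine booleans occupy two bytes on the wire instead of nine, which is why the cache update in `batch_write_bool_area` iterates over the packed buffer rather than the original list.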

gateway/config_loader.py Normal file

@@ -0,0 +1,10 @@
import json
import os
def load_config(config_path="config/config.json"):
"""加载配置文件"""
if not os.path.exists(config_path):
raise FileNotFoundError(f"Configuration file not found: {config_path}")
with open(config_path, "r") as f:
return json.load(f)
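The loader above can be exercised end-to-end against a temporary file; a minimal sketch (the sample config content and paths are made up for illustration):

```python
import json
import os
import tempfile

def load_config(config_path="config/config.json"):
    """Same loader as above: fail fast when the file is missing."""
    if not os.path.exists(config_path):
        raise FileNotFoundError(f"Configuration file not found: {config_path}")
    with open(config_path, "r", encoding="utf-8") as f:
        return json.load(f)

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "config.json")
    with open(path, "w", encoding="utf-8") as f:
        json.dump({"plcs": [{"name": "plc1", "ip": "192.168.0.10"}]}, f)
    config = load_config(path)
    assert config["plcs"][0]["name"] == "plc1"
    # A missing file raises FileNotFoundError rather than returning None
    try:
        load_config(os.path.join(tmp, "missing.json"))
    except FileNotFoundError:
        pass
```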

gateway/config_manager.py Normal file

@@ -0,0 +1,87 @@
import json
import os
import logging
from config_validator import ConfigValidator
class ConfigManager:
"""配置文件管理器"""
def __init__(self, config_path="../config/config.json"):
self.config_path = config_path
self.config = None
self.load_config()
def load_config(self):
"""加载配置文件"""
try:
if not os.path.exists(self.config_path):
# Try to restore from the backup
backup_path = self.config_path + ".bak"
if os.path.exists(backup_path):
logging.warning(f"Main config not found, using backup: {backup_path}")
with open(backup_path, 'r') as src, open(self.config_path, 'w') as dst:
config_data = src.read()
dst.write(config_data)
else:
raise FileNotFoundError(f"Configuration file not found: {self.config_path}")
with open(self.config_path, 'r') as f:
self.config = json.load(f)
# Validate the configuration
is_valid, error = ConfigValidator.validate_config(self.config)
if not is_valid:
logging.error(f"Invalid configuration: {error}")
# Try to restore from the backup
backup_path = self.config_path + ".bak"
if os.path.exists(backup_path):
logging.warning("Attempting to load from backup configuration")
with open(backup_path, 'r') as f:
self.config = json.load(f)
is_valid, error = ConfigValidator.validate_config(self.config)
if not is_valid:
raise ValueError(f"Backup config also invalid: {error}")
else:
raise ValueError(f"Invalid configuration: {error}")
return True, None
except Exception as e:
logging.error(f"Failed to load config: {e}")
self.config = {"plcs": []}
return False, str(e)
def get_config(self):
"""获取当前配置"""
return self.config
def validate_config(self, config):
"""验证配置是否有效"""
return ConfigValidator.validate_config(config)
def save_config(self, new_config):
"""保存配置文件"""
try:
# 验证配置
is_valid, error = self.validate_config(new_config)
if not is_valid:
return False, f"Invalid configuration: {error}"
# 备份旧配置
backup_path = self.config_path + ".bak"
if os.path.exists(self.config_path):
with open(self.config_path, 'r') as src, open(backup_path, 'w') as dst:
dst.write(src.read())
# 保存新配置
with open(self.config_path, 'w') as f:
json.dump(new_config, f, indent=2)
self.config = new_config
return True, None
except Exception as e:
logging.error(f"Failed to save config: {e}")
return False, str(e)
def reload_config(self):
"""重新加载配置"""
return self.load_config()
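The backup-then-write order in `save_config` can be sketched in isolation. The `save_with_backup` helper below is invented here for illustration and omits the validation step:

```python
import json
import os
import tempfile

def save_with_backup(config_path, new_config):
    """Sketch of save_config's ordering: preserve the old file, then overwrite.
    Illustrative helper, not part of the gateway."""
    backup_path = config_path + ".bak"
    if os.path.exists(config_path):
        # Preserve the previous config before overwriting it
        with open(config_path, 'r') as src, open(backup_path, 'w') as dst:
            dst.write(src.read())
    with open(config_path, 'w') as f:
        json.dump(new_config, f, indent=2)

d = tempfile.mkdtemp()
path = os.path.join(d, "config.json")
save_with_backup(path, {"plcs": [{"name": "PLC1"}]})   # first save: no backup yet
save_with_backup(path, {"plcs": []})                   # second save: backup holds v1
with open(path + ".bak") as f:
    assert json.load(f) == {"plcs": [{"name": "PLC1"}]}
with open(path) as f:
    assert json.load(f) == {"plcs": []}
```

This ordering means a crash during the final write still leaves a recoverable `.bak`, which is exactly what `load_config` falls back to.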

82
gateway/config_validator.py Normal file
View File

@@ -0,0 +1,82 @@
from jsonschema import Draft7Validator, FormatChecker

class ConfigValidator:
    """Configuration file validator."""

    SCHEMA = {
        "type": "object",
        "properties": {
            "plcs": {
                "type": "array",
                "minItems": 1,
                "items": {
                    "type": "object",
                    "properties": {
                        "name": {"type": "string", "minLength": 1},
                        "ip": {"type": "string", "format": "ipv4"},
                        "rack": {"type": "integer", "minimum": 0},
                        "slot": {"type": "integer", "minimum": 0},
                        "areas": {
                            "type": "array",
                            "minItems": 1,
                            "items": {
                                "type": "object",
                                "properties": {
                                    "name": {"type": "string", "minLength": 1},
                                    "type": {"type": "string", "enum": ["read", "write", "read_write"]},
                                    "db_number": {"type": "integer", "minimum": 1},
                                    "offset": {"type": "integer", "minimum": 0},
                                    "size": {"type": "integer", "minimum": 1},
                                    "structure": {
                                        "type": "array",
                                        "items": {
                                            "type": "object",
                                            "properties": {
                                                "name": {"type": "string"},
                                                "type": {"type": "string", "enum": ["bool", "byte", "int", "dint", "real", "word", "dword"]},
                                                "offset": {"type": "integer", "minimum": 0},
                                                "bit": {"type": "integer", "minimum": 0, "maximum": 7}  # bit index within a byte
                                            },
                                            "required": ["name", "type", "offset"]
                                        }
                                    }
                                },
                                "required": ["name", "type", "db_number", "offset", "size"]
                            }
                        }
                    },
                    "required": ["name", "ip", "rack", "slot", "areas"]
                }
            }
        },
        "required": ["plcs"]
    }

    @staticmethod
    def validate_config(config):
        """Check that a configuration conforms to the schema."""
        try:
            # Enable IPv4 format validation
            validator = Draft7Validator(
                ConfigValidator.SCHEMA,
                format_checker=FormatChecker(["ipv4"])
            )
            validator.validate(config)
            return True, None
        except Exception as e:
            return False, str(e)

    @staticmethod
    def is_valid_ip(ip):
        """Validate a dotted-quad IPv4 address."""
        parts = ip.split('.')
        if len(parts) != 4:
            return False
        for part in parts:
            if not part.isdigit():
                return False
            num = int(part)
            if num < 0 or num > 255:
                return False
        return True
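The dotted-quad fallback can be exercised standalone; the function below is an inlined copy of `ConfigValidator.is_valid_ip` so the checks run without `jsonschema` installed:

```python
def is_valid_ip(ip):
    """Inlined copy of ConfigValidator.is_valid_ip: plain dotted-quad check."""
    parts = ip.split('.')
    if len(parts) != 4:
        return False
    for part in parts:
        if not part.isdigit():   # rejects signs, spaces, and empty octets
            return False
        num = int(part)
        if num < 0 or num > 255:
            return False
    return True

assert is_valid_ip("192.168.0.100")
assert not is_valid_ip("192.168.0")        # too few octets
assert not is_valid_ip("192.168.0.256")    # octet out of range
assert not is_valid_ip("192.168.0.-1")     # sign rejected by isdigit()
```

Note that `isdigit()` already rules out negative octets, so the `num < 0` branch is defensive rather than reachable.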

55
gateway/data_parser.py Normal file
View File

@@ -0,0 +1,55 @@
from struct import unpack

def parse_data(data, structure):
    """Parse a raw byte buffer into named fields using a structure definition."""
    result = {"raw_data": list(data)}
    if not structure:
        return result
    result["parsed"] = {}
    for field in structure:
        offset = field["offset"]
        name = field["name"]
        data_type = field["type"]
        try:
            if data_type == "int":
                if offset + 2 > len(data):
                    raise ValueError("Offset out of bounds")
                val = unpack(">h", data[offset:offset+2])[0]
            elif data_type == "dint":
                if offset + 4 > len(data):
                    raise ValueError("Offset out of bounds")
                val = unpack(">l", data[offset:offset+4])[0]
            elif data_type == "real":
                if offset + 4 > len(data):
                    raise ValueError("Offset out of bounds")
                val = unpack(">f", data[offset:offset+4])[0]
            elif data_type == "bool":
                bit = field.get("bit", 0)
                if offset >= len(data):
                    raise ValueError("Offset out of bounds")
                byte = data[offset]
                val = bool((byte >> bit) & 1)
            elif data_type == "byte":
                if offset >= len(data):
                    raise ValueError("Offset out of bounds")
                val = data[offset]
            elif data_type == "word":
                if offset + 2 > len(data):
                    raise ValueError("Offset out of bounds")
                val = (data[offset] << 8) | data[offset + 1]
            elif data_type == "dword":
                if offset + 4 > len(data):
                    raise ValueError("Offset out of bounds")
                val = (data[offset] << 24) | (data[offset+1] << 16) | (data[offset+2] << 8) | data[offset+3]
            else:
                val = f"Unknown type: {data_type}"
            result["parsed"][name] = val
        except Exception as e:
            result["parsed"][name] = f"Error: {str(e)}"
    return result
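The big-endian layouts the parser decodes can be exercised directly with `struct`. The buffer below mirrors the `DB100_Read` structure from the sample config (the values themselves are invented for illustration):

```python
from struct import pack, unpack

# Build a buffer laid out like DB100_Read in the sample config:
# REAL "temperature" at offset 0, INT "pressure" at offset 4, BOOL "status" at offset 6 bit 0
data = bytearray(8)
data[0:4] = pack(">f", 23.5)   # S7 REAL: big-endian IEEE 754 single
data[4:6] = pack(">h", 1013)   # S7 INT: big-endian signed 16-bit
data[6] |= 1 << 0              # set bit 0 of the status byte

# Decode exactly as parse_data does
assert unpack(">f", data[0:4])[0] == 23.5   # 23.5 is exactly representable in float32
assert unpack(">h", data[4:6])[0] == 1013
assert bool((data[6] >> 0) & 1) is True
```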

66
gateway/dist/config/config.json vendored Normal file
View File

@@ -0,0 +1,66 @@
{
  "plcs": [
    {
      "name": "PLC1",
      "ip": "192.168.0.100",
      "rack": 0,
      "slot": 1,
      "areas": [
        {
          "name": "DB100_Read",
          "type": "read",
          "db_number": 100,
          "offset": 0,
          "size": 4000,
          "structure": [
            {
              "name": "temperature",
              "type": "real",
              "offset": 0
            },
            {
              "name": "pressure",
              "type": "int",
              "offset": 4
            },
            {
              "name": "status",
              "type": "bool",
              "offset": 6,
              "bit": 0
            }
          ]
        },
        {
          "name": "DB100_Write",
          "type": "write",
          "db_number": 100,
          "offset": 4000,
          "size": 5000
        },
        {
          "name": "DB202_Params",
          "type": "read_write",
          "db_number": 202,
          "offset": 0,
          "size": 2000
        }
      ]
    },
    {
      "name": "PLC2",
      "ip": "192.168.0.101",
      "rack": 0,
      "slot": 1,
      "areas": [
        {
          "name": "DB100_Read",
          "type": "read",
          "db_number": 100,
          "offset": 0,
          "size": 4000
        }
      ]
    }
  ]
}
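One property worth checking when editing this file by hand: every field in an area's `structure` must fit inside the area's `size`. A small sketch of that check (the `structure_fits` helper and `TYPE_SIZES` table are invented here, not part of the gateway):

```python
# Byte widths matching the type enum accepted by the schema and parser.
TYPE_SIZES = {"bool": 1, "byte": 1, "int": 2, "word": 2, "dint": 4, "dword": 4, "real": 4}

def structure_fits(area):
    """True if every declared field lies within the area's byte size (sketch helper)."""
    size = area["size"]
    return all(f["offset"] + TYPE_SIZES[f["type"]] <= size
               for f in area.get("structure", []))

area = {
    "size": 4000,
    "structure": [
        {"name": "temperature", "type": "real", "offset": 0},
        {"name": "pressure", "type": "int", "offset": 4},
        {"name": "status", "type": "bool", "offset": 6, "bit": 0},
    ],
}
assert structure_fits(area)
# A 4-byte REAL at offset 2 overruns a 4-byte area
assert not structure_fits({"size": 4, "structure": [{"name": "x", "type": "real", "offset": 2}]})
```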

66
gateway/dist/config/config.json.bak vendored Normal file
View File

@@ -0,0 +1,66 @@
{
  "plcs": [
    {
      "name": "PLC1",
      "ip": "192.168.0.100",
      "rack": 0,
      "slot": 1,
      "areas": [
        {
          "name": "DB100_Read",
          "type": "read",
          "db_number": 100,
          "offset": 0,
          "size": 4000,
          "structure": [
            {
              "name": "temperature",
              "type": "real",
              "offset": 0
            },
            {
              "name": "pressure",
              "type": "int",
              "offset": 4
            },
            {
              "name": "status",
              "type": "bool",
              "offset": 6,
              "bit": 0
            }
          ]
        },
        {
          "name": "DB100_Write",
          "type": "write",
          "db_number": 100,
          "offset": 4000,
          "size": 5000
        },
        {
          "name": "DB202_Params",
          "type": "read_write",
          "db_number": 202,
          "offset": 0,
          "size": 2000
        }
      ]
    },
    {
      "name": "PLC2",
      "ip": "192.168.0.101",
      "rack": 0,
      "slot": 1,
      "areas": [
        {
          "name": "DB100_Read",
          "type": "read",
          "db_number": 100,
          "offset": 0,
          "size": 4000
        }
      ]
    }
  ]
}

BIN
gateway/dist/main/_internal/LIBBZ2.dll vendored Normal file

Binary file not shown.

View File

@@ -0,0 +1 @@
pip

View File

@@ -0,0 +1,28 @@
Copyright 2010 Pallets
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

View File

@@ -0,0 +1,92 @@
Metadata-Version: 2.1
Name: MarkupSafe
Version: 3.0.2
Summary: Safely add untrusted strings to HTML/XML markup.
Maintainer-email: Pallets <contact@palletsprojects.com>
License: Copyright 2010 Pallets
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
Project-URL: Donate, https://palletsprojects.com/donate
Project-URL: Documentation, https://markupsafe.palletsprojects.com/
Project-URL: Changes, https://markupsafe.palletsprojects.com/changes/
Project-URL: Source, https://github.com/pallets/markupsafe/
Project-URL: Chat, https://discord.gg/pallets
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: Web Environment
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
Classifier: Topic :: Text Processing :: Markup :: HTML
Classifier: Typing :: Typed
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE.txt
# MarkupSafe
MarkupSafe implements a text object that escapes characters so it is
safe to use in HTML and XML. Characters that have special meanings are
replaced so that they display as the actual characters. This mitigates
injection attacks, meaning untrusted user input can safely be displayed
on a page.
## Examples
```pycon
>>> from markupsafe import Markup, escape
>>> # escape replaces special characters and wraps in Markup
>>> escape("<script>alert(document.cookie);</script>")
Markup('&lt;script&gt;alert(document.cookie);&lt;/script&gt;')
>>> # wrap in Markup to mark text "safe" and prevent escaping
>>> Markup("<strong>Hello</strong>")
Markup('<strong>Hello</strong>')
>>> escape(Markup("<strong>Hello</strong>"))
Markup('<strong>Hello</strong>')
>>> # Markup is a str subclass
>>> # methods and operators escape their arguments
>>> template = Markup("Hello <em>{name}</em>")
>>> template.format(name='"World"')
Markup('Hello <em>&#34;World&#34;</em>')
```
## Donate
The Pallets organization develops and supports MarkupSafe and other
popular packages. In order to grow the community of contributors and
users, and allow the maintainers to devote more time to the projects,
[please donate today][].
[please donate today]: https://palletsprojects.com/donate

View File

@@ -0,0 +1,14 @@
MarkupSafe-3.0.2.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
MarkupSafe-3.0.2.dist-info/LICENSE.txt,sha256=RjHsDbX9kKVH4zaBcmTGeYIUM4FG-KyUtKV_lu6MnsQ,1503
MarkupSafe-3.0.2.dist-info/METADATA,sha256=nhoabjupBG41j_JxPCJ3ylgrZ6Fx8oMCFbiLF9Kafqc,4067
MarkupSafe-3.0.2.dist-info/RECORD,,
MarkupSafe-3.0.2.dist-info/WHEEL,sha256=-v_yZ08fSknsoT62oIKG9wp1eCBV9_ao2rO4BeIReTY,101
MarkupSafe-3.0.2.dist-info/top_level.txt,sha256=qy0Plje5IJuvsCBjejJyhDCjEAdcDLK_2agVcex8Z6U,11
markupsafe/__init__.py,sha256=pREerPwvinB62tNCMOwqxBS2YHV6R52Wcq1d-rB4Z5o,13609
markupsafe/__pycache__/__init__.cpython-313.pyc,,
markupsafe/__pycache__/_native.cpython-313.pyc,,
markupsafe/_native.py,sha256=2ptkJ40yCcp9kq3L1NqpgjfpZB-obniYKFFKUOkHh4Q,218
markupsafe/_speedups.c,sha256=SglUjn40ti9YgQAO--OgkSyv9tXq9vvaHyVhQows4Ok,4353
markupsafe/_speedups.cp313-win_amd64.pyd,sha256=7MA12j0aUiSeNpFy-98h_pPSqgCpLeRacgp3I-j00Yo,13312
markupsafe/_speedups.pyi,sha256=LSDmXYOefH4HVpAXuL8sl7AttLw0oXh1njVoVZp2wqQ,42
markupsafe/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0

View File

@@ -0,0 +1,5 @@
Wheel-Version: 1.0
Generator: setuptools (75.2.0)
Root-Is-Purelib: false
Tag: cp313-cp313-win_amd64

View File

@@ -0,0 +1 @@
markupsafe

BIN
gateway/dist/main/_internal/_asyncio.pyd vendored Normal file

Binary file not shown.

BIN
gateway/dist/main/_internal/_bz2.pyd vendored Normal file

Binary file not shown.

BIN
gateway/dist/main/_internal/_ctypes.pyd vendored Normal file

Binary file not shown.

BIN
gateway/dist/main/_internal/_decimal.pyd vendored Normal file

Binary file not shown.

BIN
gateway/dist/main/_internal/_hashlib.pyd vendored Normal file

Binary file not shown.

BIN
gateway/dist/main/_internal/_lzma.pyd vendored Normal file

Binary file not shown.

BIN
gateway/dist/main/_internal/_multiprocessing.pyd vendored Normal file

Binary file not shown.

BIN
gateway/dist/main/_internal/_overlapped.pyd vendored Normal file

Binary file not shown.

BIN
gateway/dist/main/_internal/_queue.pyd vendored Normal file

Binary file not shown.

BIN
gateway/dist/main/_internal/_socket.pyd vendored Normal file

Binary file not shown.

BIN
gateway/dist/main/_internal/_ssl.pyd vendored Normal file

Binary file not shown.

BIN
gateway/dist/main/_internal/_uuid.pyd vendored Normal file

Binary file not shown.

BIN
gateway/dist/main/_internal/_wmi.pyd vendored Normal file

Binary file not shown.

View File

@@ -0,0 +1 @@
pip

View File

@@ -0,0 +1,232 @@
Metadata-Version: 2.4
Name: attrs
Version: 25.3.0
Summary: Classes Without Boilerplate
Project-URL: Documentation, https://www.attrs.org/
Project-URL: Changelog, https://www.attrs.org/en/stable/changelog.html
Project-URL: GitHub, https://github.com/python-attrs/attrs
Project-URL: Funding, https://github.com/sponsors/hynek
Project-URL: Tidelift, https://tidelift.com/subscription/pkg/pypi-attrs?utm_source=pypi-attrs&utm_medium=pypi
Author-email: Hynek Schlawack <hs@ox.cx>
License-Expression: MIT
License-File: LICENSE
Keywords: attribute,boilerplate,class
Classifier: Development Status :: 5 - Production/Stable
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Typing :: Typed
Requires-Python: >=3.8
Provides-Extra: benchmark
Requires-Dist: cloudpickle; (platform_python_implementation == 'CPython') and extra == 'benchmark'
Requires-Dist: hypothesis; extra == 'benchmark'
Requires-Dist: mypy>=1.11.1; (platform_python_implementation == 'CPython' and python_version >= '3.10') and extra == 'benchmark'
Requires-Dist: pympler; extra == 'benchmark'
Requires-Dist: pytest-codspeed; extra == 'benchmark'
Requires-Dist: pytest-mypy-plugins; (platform_python_implementation == 'CPython' and python_version >= '3.10') and extra == 'benchmark'
Requires-Dist: pytest-xdist[psutil]; extra == 'benchmark'
Requires-Dist: pytest>=4.3.0; extra == 'benchmark'
Provides-Extra: cov
Requires-Dist: cloudpickle; (platform_python_implementation == 'CPython') and extra == 'cov'
Requires-Dist: coverage[toml]>=5.3; extra == 'cov'
Requires-Dist: hypothesis; extra == 'cov'
Requires-Dist: mypy>=1.11.1; (platform_python_implementation == 'CPython' and python_version >= '3.10') and extra == 'cov'
Requires-Dist: pympler; extra == 'cov'
Requires-Dist: pytest-mypy-plugins; (platform_python_implementation == 'CPython' and python_version >= '3.10') and extra == 'cov'
Requires-Dist: pytest-xdist[psutil]; extra == 'cov'
Requires-Dist: pytest>=4.3.0; extra == 'cov'
Provides-Extra: dev
Requires-Dist: cloudpickle; (platform_python_implementation == 'CPython') and extra == 'dev'
Requires-Dist: hypothesis; extra == 'dev'
Requires-Dist: mypy>=1.11.1; (platform_python_implementation == 'CPython' and python_version >= '3.10') and extra == 'dev'
Requires-Dist: pre-commit-uv; extra == 'dev'
Requires-Dist: pympler; extra == 'dev'
Requires-Dist: pytest-mypy-plugins; (platform_python_implementation == 'CPython' and python_version >= '3.10') and extra == 'dev'
Requires-Dist: pytest-xdist[psutil]; extra == 'dev'
Requires-Dist: pytest>=4.3.0; extra == 'dev'
Provides-Extra: docs
Requires-Dist: cogapp; extra == 'docs'
Requires-Dist: furo; extra == 'docs'
Requires-Dist: myst-parser; extra == 'docs'
Requires-Dist: sphinx; extra == 'docs'
Requires-Dist: sphinx-notfound-page; extra == 'docs'
Requires-Dist: sphinxcontrib-towncrier; extra == 'docs'
Requires-Dist: towncrier; extra == 'docs'
Provides-Extra: tests
Requires-Dist: cloudpickle; (platform_python_implementation == 'CPython') and extra == 'tests'
Requires-Dist: hypothesis; extra == 'tests'
Requires-Dist: mypy>=1.11.1; (platform_python_implementation == 'CPython' and python_version >= '3.10') and extra == 'tests'
Requires-Dist: pympler; extra == 'tests'
Requires-Dist: pytest-mypy-plugins; (platform_python_implementation == 'CPython' and python_version >= '3.10') and extra == 'tests'
Requires-Dist: pytest-xdist[psutil]; extra == 'tests'
Requires-Dist: pytest>=4.3.0; extra == 'tests'
Provides-Extra: tests-mypy
Requires-Dist: mypy>=1.11.1; (platform_python_implementation == 'CPython' and python_version >= '3.10') and extra == 'tests-mypy'
Requires-Dist: pytest-mypy-plugins; (platform_python_implementation == 'CPython' and python_version >= '3.10') and extra == 'tests-mypy'
Description-Content-Type: text/markdown
<p align="center">
<a href="https://www.attrs.org/">
<img src="https://raw.githubusercontent.com/python-attrs/attrs/main/docs/_static/attrs_logo.svg" width="35%" alt="attrs" />
</a>
</p>
*attrs* is the Python package that will bring back the **joy** of **writing classes** by relieving you from the drudgery of implementing object protocols (aka [dunder methods](https://www.attrs.org/en/latest/glossary.html#term-dunder-methods)).
[Trusted by NASA](https://docs.github.com/en/account-and-profile/setting-up-and-managing-your-github-profile/customizing-your-profile/personalizing-your-profile#list-of-qualifying-repositories-for-mars-2020-helicopter-contributor-achievement) for Mars missions since 2020!
Its main goal is to help you to write **concise** and **correct** software without slowing down your code.
## Sponsors
*attrs* would not be possible without our [amazing sponsors](https://github.com/sponsors/hynek).
Especially those generously supporting us at the *The Organization* tier and higher:
<!-- sponsor-break-begin -->
<p align="center">
<!-- [[[cog
import pathlib, tomllib
for sponsor in tomllib.loads(pathlib.Path("pyproject.toml").read_text())["tool"]["sponcon"]["sponsors"]:
print(f'<a href="{sponsor["url"]}"><img title="{sponsor["title"]}" src="https://www.attrs.org/en/25.3.0/_static/sponsors/{sponsor["img"]}" width="190" /></a>')
]]] -->
<a href="https://www.variomedia.de/"><img title="Variomedia AG" src="https://www.attrs.org/en/25.3.0/_static/sponsors/Variomedia.svg" width="190" /></a>
<a href="https://tidelift.com/?utm_source=lifter&utm_medium=referral&utm_campaign=hynek"><img title="Tidelift" src="https://www.attrs.org/en/25.3.0/_static/sponsors/Tidelift.svg" width="190" /></a>
<a href="https://klaviyo.com/"><img title="Klaviyo" src="https://www.attrs.org/en/25.3.0/_static/sponsors/Klaviyo.svg" width="190" /></a>
<a href="https://privacy-solutions.org/"><img title="Privacy Solutions" src="https://www.attrs.org/en/25.3.0/_static/sponsors/Privacy-Solutions.svg" width="190" /></a>
<a href="https://www.emsys-renewables.com/"><img title="emsys renewables" src="https://www.attrs.org/en/25.3.0/_static/sponsors/emsys-renewables.svg" width="190" /></a>
<a href="https://filepreviews.io/"><img title="FilePreviews" src="https://www.attrs.org/en/25.3.0/_static/sponsors/FilePreviews.svg" width="190" /></a>
<a href="https://polar.sh/"><img title="Polar" src="https://www.attrs.org/en/25.3.0/_static/sponsors/Polar.svg" width="190" /></a>
<!-- [[[end]]] -->
</p>
<!-- sponsor-break-end -->
<p align="center">
<strong>Please consider <a href="https://github.com/sponsors/hynek">joining them</a> to help make <em>attrs</em>'s maintenance more sustainable!</strong>
</p>
<!-- teaser-end -->
## Example
*attrs* gives you a class decorator and a way to declaratively define the attributes on that class:
<!-- code-begin -->
```pycon
>>> from attrs import asdict, define, make_class, Factory
>>> @define
... class SomeClass:
... a_number: int = 42
... list_of_numbers: list[int] = Factory(list)
...
... def hard_math(self, another_number):
... return self.a_number + sum(self.list_of_numbers) * another_number
>>> sc = SomeClass(1, [1, 2, 3])
>>> sc
SomeClass(a_number=1, list_of_numbers=[1, 2, 3])
>>> sc.hard_math(3)
19
>>> sc == SomeClass(1, [1, 2, 3])
True
>>> sc != SomeClass(2, [3, 2, 1])
True
>>> asdict(sc)
{'a_number': 1, 'list_of_numbers': [1, 2, 3]}
>>> SomeClass()
SomeClass(a_number=42, list_of_numbers=[])
>>> C = make_class("C", ["a", "b"])
>>> C("foo", "bar")
C(a='foo', b='bar')
```
After *declaring* your attributes, *attrs* gives you:
- a concise and explicit overview of the class's attributes,
- a nice human-readable `__repr__`,
- equality-checking methods,
- an initializer,
- and much more,
*without* writing dull boilerplate code again and again and *without* runtime performance penalties.
---
This example uses *attrs*'s modern APIs that have been introduced in version 20.1.0, and the *attrs* package import name that has been added in version 21.3.0.
The classic APIs (`@attr.s`, `attr.ib`, plus their serious-business aliases) and the `attr` package import name will remain **indefinitely**.
Check out [*On The Core API Names*](https://www.attrs.org/en/latest/names.html) for an in-depth explanation!
### Hate Type Annotations!?
No problem!
Types are entirely **optional** with *attrs*.
Simply assign `attrs.field()` to the attributes instead of annotating them with types:
```python
from attrs import define, field
@define
class SomeClass:
a_number = field(default=42)
list_of_numbers = field(factory=list)
```
## Data Classes
On the tin, *attrs* might remind you of `dataclasses` (and indeed, `dataclasses` [are a descendant](https://hynek.me/articles/import-attrs/) of *attrs*).
In practice it does a lot more and is more flexible.
For instance, it allows you to define [special handling of NumPy arrays for equality checks](https://www.attrs.org/en/stable/comparison.html#customization), allows more ways to [plug into the initialization process](https://www.attrs.org/en/stable/init.html#hooking-yourself-into-initialization), has a replacement for `__init_subclass__`, and allows for stepping through the generated methods using a debugger.
For more details, please refer to our [comparison page](https://www.attrs.org/en/stable/why.html#data-classes), but generally speaking, we are more likely to commit crimes against nature to make things work that one would expect to work, but that are quite complicated in practice.
## Project Information
- [**Changelog**](https://www.attrs.org/en/stable/changelog.html)
- [**Documentation**](https://www.attrs.org/)
- [**PyPI**](https://pypi.org/project/attrs/)
- [**Source Code**](https://github.com/python-attrs/attrs)
- [**Contributing**](https://github.com/python-attrs/attrs/blob/main/.github/CONTRIBUTING.md)
- [**Third-party Extensions**](https://github.com/python-attrs/attrs/wiki/Extensions-to-attrs)
- **Get Help**: use the `python-attrs` tag on [Stack Overflow](https://stackoverflow.com/questions/tagged/python-attrs)
### *attrs* for Enterprise
Available as part of the [Tidelift Subscription](https://tidelift.com/?utm_source=lifter&utm_medium=referral&utm_campaign=hynek).
The maintainers of *attrs* and thousands of other packages are working with Tidelift to deliver commercial support and maintenance for the open source packages you use to build your applications.
Save time, reduce risk, and improve code health, while paying the maintainers of the exact packages you use.
## Release Information
### Changes
- Restore support for generator-based `field_transformer`s.
[#1417](https://github.com/python-attrs/attrs/issues/1417)
---
[Full changelog →](https://www.attrs.org/en/stable/changelog.html)

Some files were not shown because too many files have changed in this diff