+------+------------------------------------------------+-------------------------------------------------------------------------------+-------------------------------------------------------------------------+-----------------------------------+---------------+-----------+------------+--------------+---------------+-------------------+------------------+-------------------+------------------+-------------------+-------------------+--------------------+
|      | mod_name                                       | base_op_type                                                                  | analy_op_type                                                           | shape                             | quant_dtype   |    qscale |     Cosine |           L1 |          Atol |   max_qscale_diff |   base_model_min |   analy_model_min |   base_model_max |   analy_model_max |   base_model_mean |   analy_model_mean |
|------+------------------------------------------------+-------------------------------------------------------------------------------+-------------------------------------------------------------------------+-----------------------------------+---------------+-----------+------------+--------------+---------------+-------------------+------------------+-------------------+------------------+-------------------+-------------------+--------------------|
|    0 | backbone.quant                                 | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([156, 3, 256, 704])    | qint8         | 1.0000000 |  0.6719549 |    0.1176119 |     0.5000000 |         0.5000000 |       -0.8750000 |        -1.0000000 |        0.8281250 |         1.0000000 |        -0.0533092 |         -0.0372840 |
|    1 | backbone.patch_embed.0.0                       | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.ConvReLU2d                         | torch.Size([156, 32, 128, 352])   | qint8         | 1.0000000 |  0.0190377 |    0.1036607 |     2.5832207 |         2.5832207 |       -2.2180142 |         0.0000000 |        2.5832207 |         2.0000000 |         0.0152377 |          0.0139203 |
|    2 | backbone.patch_embed.0.1                       | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 32, 128, 352])   | qint8         | 1.0000000 |  0.0649730 |    0.0668527 |     2.3213718 |         2.3213718 |       -1.3961307 |         0.0000000 |        1.3007953 |         2.0000000 |         0.0062548 |          0.0139203 |
|    3 | backbone.patch_embed.0.2                       | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 32, 128, 352])   | qint8         | 1.0000000 |  0.1046229 |    0.0429288 |     2.0000000 |         2.0000000 |        0.0000000 |         0.0000000 |        1.3007953 |         2.0000000 |         0.0301787 |          0.0139203 |
|    4 | backbone.patch_embed.1.0                       | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.ConvReLU2d                         | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0000000 |    0.0990676 |     2.8748043 |         2.8748043 |       -2.8748043 |         0.0000000 |        1.1927227 |         0.0000000 |        -0.0351011 |          0.0000000 |
|    5 | backbone.patch_embed.1.1                       | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0000000 |    0.0329909 |     0.6311034 |         0.6311034 |       -0.6311034 |         0.0000000 |        0.5390134 |         0.0000000 |         0.0183076 |          0.0000000 |
|    6 | backbone.patch_embed.1.2                       | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0000000 |    0.0256492 |     0.5390134 |         0.5390134 |        0.0000000 |         0.0000000 |        0.5390134 |         0.0000000 |         0.0256492 |          0.0000000 |
|    7 | backbone.stages.0.block.0.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 | -0.2756487 |    0.5332081 |     2.1680529 |         2.1680529 |       -0.4494418 |        -2.0000000 |        0.3971455 |         2.0000000 |         0.0111317 |         -0.0312500 |
|    8 | backbone.stages.0.block.0.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0551005 |    0.5866310 |    13.8283119 |        13.8283119 |      -11.9865141 |        -2.0000000 |       12.8283119 |         2.0000000 |         0.0178088 |         -0.0312500 |
|    9 | backbone.stages.0.block.0.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 | -0.1768321 |    0.4656250 |    21.4366531 |        21.4366531 |      -21.4366531 |        -2.0000000 |        3.0954785 |         2.0000000 |        -0.2300590 |          0.0468750 |
|   10 | backbone.stages.0.block.0.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 | -0.1285779 |    0.1938390 |     3.0360548 |         3.0360548 |       -0.1699712 |         0.0000000 |        3.0924373 |         2.0000000 |        -0.0271705 |          0.0781250 |
|   11 | backbone.stages.0.block.0.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   12 | backbone.stages.0.block.0.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   13 | backbone.stages.0.block.0.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0000000 |    0.0396093 |     0.6609020 |         0.6609020 |       -0.3579977 |         0.0000000 |        0.6609020 |         0.0000000 |         0.0204382 |          0.0000000 |
|   14 | backbone.stages.0.block.0.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0000000 |    0.0396093 |     0.6609020 |         0.6609020 |       -0.3579977 |         0.0000000 |        0.6609020 |         0.0000000 |         0.0204382 |          0.0000000 |
|   15 | backbone.stages.0.block.1.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 | -0.3917590 |    0.3579147 |     2.2383668 |         2.2383668 |       -0.4089391 |        -1.0000000 |        0.2609066 |         2.0000000 |         0.0022961 |          0.0156250 |
|   16 | backbone.stages.0.block.1.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 | -0.0818582 |    0.5338141 |     6.3339558 |         6.3339558 |       -6.2544293 |        -1.0000000 |        5.3428349 |         2.0000000 |        -0.0146429 |          0.0156250 |
|   17 | backbone.stages.0.block.1.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 | -0.1127944 |    0.4992315 |     8.1013803 |         8.1013803 |       -8.0779333 |        -1.0000000 |        2.2208343 |         2.0000000 |        -0.3816353 |          0.0078125 |
|   18 | backbone.stages.0.block.1.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 | -0.1272381 |    0.1467774 |     2.1915612 |         2.1915612 |       -0.1699712 |         0.0000000 |        2.1915612 |         2.0000000 |        -0.0643381 |          0.0390625 |
|   19 | backbone.stages.0.block.1.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   20 | backbone.stages.0.block.1.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   21 | backbone.stages.0.block.1.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0000000 |    0.0465277 |     0.6890951 |         0.6890951 |       -0.4564776 |         0.0000000 |        0.6890951 |         0.0000000 |         0.0064687 |          0.0000000 |
|   22 | backbone.stages.0.block.1.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0000000 |    0.0465277 |     0.6890951 |         0.6890951 |       -0.4564776 |         0.0000000 |        0.6890951 |         0.0000000 |         0.0064687 |          0.0000000 |
|   23 | backbone.stages.0.block.2.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 | -0.3512767 |    0.3668873 |     2.1657467 |         2.1657467 |       -0.4191610 |        -1.0000000 |        0.2589665 |         2.0000000 |        -0.0079984 |         -0.0468750 |
|   24 | backbone.stages.0.block.2.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0442896 |    0.5694529 |     6.8756247 |         6.8756247 |       -6.3365083 |        -1.0000000 |        5.8756247 |         2.0000000 |        -0.0295786 |         -0.0468750 |
|   25 | backbone.stages.0.block.2.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 |  0.0000000 |    0.5911953 |     7.4450231 |         7.4450231 |       -7.4450231 |         0.0000000 |        2.5679297 |         0.0000000 |        -0.5133070 |          0.0000000 |
|   26 | backbone.stages.0.block.2.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 |  0.0000000 |    0.1161063 |     2.5547938 |         2.5547938 |       -0.1699712 |         0.0000000 |        2.5547938 |         0.0000000 |        -0.0663839 |          0.0000000 |
|   27 | backbone.stages.0.block.2.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   28 | backbone.stages.0.block.2.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   29 | backbone.stages.0.block.2.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0000000 |    0.0602058 |     0.9706663 |         0.9706663 |       -0.5889622 |         0.0000000 |        0.9706663 |         0.0000000 |        -0.0000216 |          0.0000000 |
|   30 | backbone.stages.0.block.2.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0000000 |    0.0602058 |     0.9706663 |         0.9706663 |       -0.5889622 |         0.0000000 |        0.9706663 |         0.0000000 |        -0.0000216 |          0.0000000 |
|   31 | backbone.stages.0.block.3.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 | -0.1209666 |    0.0543633 |     1.1248182 |         1.1248182 |       -0.2090371 |         0.0000000 |        0.2662313 |         1.0000000 |         0.0032586 |          0.0156250 |
|   32 | backbone.stages.0.block.3.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0214847 |    0.4624359 |     8.2534819 |         8.2534819 |       -8.2534819 |         0.0000000 |        7.3828678 |         1.0000000 |         0.0267115 |          0.0156250 |
|   33 | backbone.stages.0.block.3.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 |  0.0000000 |    0.5025290 |     6.5794811 |         6.5794811 |       -6.5794811 |         0.0000000 |        6.1738920 |         0.0000000 |        -0.2839048 |          0.0000000 |
|   34 | backbone.stages.0.block.3.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 |  0.0000000 |    0.1644859 |     6.1738920 |         6.1738920 |       -0.1699712 |         0.0000000 |        6.1738920 |         0.0000000 |         0.0002398 |          0.0000000 |
|   35 | backbone.stages.0.block.3.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   36 | backbone.stages.0.block.3.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   37 | backbone.stages.0.block.3.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0000000 |    0.0728699 |     0.9991199 |         0.9991199 |       -0.8629689 |         0.0000000 |        0.9991199 |         0.0000000 |        -0.0013716 |          0.0000000 |
|   38 | backbone.stages.0.block.3.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0000000 |    0.0728699 |     0.9991199 |         0.9991199 |       -0.8629689 |         0.0000000 |        0.9991199 |         0.0000000 |        -0.0013716 |          0.0000000 |
|   39 | backbone.stage_norm.0                          | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.qat.batchnorm.BatchNorm2d                     | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0000000 |    0.0721000 |     2.3697283 |         2.3697283 |       -2.3697283 |         0.0000000 |        2.0236537 |         0.0000000 |         0.0037091 |          0.0000000 |
|   40 | backbone.up                                    | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer            | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer      | torch.Size([156, 64, 128, 352])   | qint8         | 1.0000000 |  0.0000000 |    0.0652005 |     1.9855907 |         1.9855907 |       -1.9855907 |         0.0000000 |        1.5560117 |         0.0000000 |         0.0037091 |          0.0000000 |
|   41 | backbone.downsample_block.0.proj.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.1806653 |     1.5089949 |         1.5089949 |       -1.4930454 |         0.0000000 |        1.5089949 |         0.0000000 |        -0.0049283 |          0.0000000 |
|   42 | backbone.downsample_block.0.proj.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.0497661 |     0.6920130 |         0.6920130 |       -0.5381714 |         0.0000000 |        0.6920130 |         0.0000000 |        -0.0025098 |          0.0000000 |
|   43 | backbone.stages.1.block.0.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 | -0.2614001 |    0.2808220 |     2.2019467 |         2.2019467 |       -0.2246089 |        -2.0000000 |        0.2607562 |         2.0000000 |         0.0049821 |         -0.0156250 |
|   44 | backbone.stages.1.block.0.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 | -0.0111563 |    0.5325462 |     6.7263727 |         6.7263727 |       -5.7263727 |        -2.0000000 |        4.8210330 |         2.0000000 |         0.0034509 |         -0.0156250 |
|   45 | backbone.stages.1.block.0.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 | -0.0825516 |    0.6068912 |     6.8430176 |         6.8430176 |       -5.8430176 |        -1.0000000 |        2.3497322 |         1.0000000 |        -0.5703663 |          0.0000000 |
|   46 | backbone.stages.1.block.0.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 | -0.0300538 |    0.1325744 |     2.3276601 |         2.3276601 |       -0.1699712 |         0.0000000 |        2.3276601 |         1.0000000 |        -0.1070861 |          0.0039062 |
|   47 | backbone.stages.1.block.0.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   48 | backbone.stages.1.block.0.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   49 | backbone.stages.1.block.0.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.0606745 |     0.7585943 |         0.7585943 |       -0.5458556 |         0.0000000 |        0.7585943 |         0.0000000 |         0.0022790 |          0.0000000 |
|   50 | backbone.stages.1.block.0.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.0606745 |     0.7585943 |         0.7585943 |       -0.5458556 |         0.0000000 |        0.7585943 |         0.0000000 |         0.0022790 |          0.0000000 |
|   51 | backbone.stages.1.block.1.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 | -0.3830993 |    0.3036986 |     4.1979327 |         4.1979327 |       -0.3329303 |        -4.0000000 |        0.2972635 |         2.0000000 |         0.0082485 |         -0.0937500 |
|   52 | backbone.stages.1.block.1.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.1743827 |    0.5239408 |     6.0691018 |         6.0691018 |       -6.0691018 |        -4.0000000 |        5.1569872 |         2.0000000 |        -0.0052009 |         -0.0937500 |
|   53 | backbone.stages.1.block.1.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  0.0367219 |    0.6081685 |     6.0286307 |         6.0286307 |       -6.0286307 |        -4.0000000 |        3.8951604 |         0.0000000 |        -0.5546219 |         -0.0195312 |
|   54 | backbone.stages.1.block.1.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.1271580 |     3.8949695 |         3.8949695 |       -0.1699712 |         0.0000000 |        3.8949695 |         0.0000000 |        -0.0992442 |          0.0000000 |
|   55 | backbone.stages.1.block.1.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   56 | backbone.stages.1.block.1.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   57 | backbone.stages.1.block.1.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.0732056 |     0.8074004 |         0.8074004 |       -0.6799405 |         0.0000000 |        0.8074004 |         0.0000000 |        -0.0077038 |          0.0000000 |
|   58 | backbone.stages.1.block.1.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.0732056 |     0.8074004 |         0.8074004 |       -0.6799405 |         0.0000000 |        0.8074004 |         0.0000000 |        -0.0077038 |          0.0000000 |
|   59 | backbone.stages.1.block.2.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 | -0.1703075 |    0.2050557 |     4.1600218 |         4.1600218 |       -0.3097573 |        -4.0000000 |        0.3525213 |         4.0000000 |         0.0014153 |          0.0000000 |
|   60 | backbone.stages.1.block.2.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.2162416 |    0.4717207 |     6.4708037 |         6.4708037 |       -5.8225808 |        -4.0000000 |        6.4708037 |         4.0000000 |         0.0244593 |          0.0000000 |
|   61 | backbone.stages.1.block.2.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  0.0734446 |    0.5736080 |     5.8984413 |         5.8984413 |       -5.8984413 |        -1.0000000 |        3.8019595 |         0.0000000 |        -0.5283288 |         -0.0078125 |
|   62 | backbone.stages.1.block.2.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.1307510 |     3.8016868 |         3.8016868 |       -0.1699712 |         0.0000000 |        3.8016868 |         0.0000000 |        -0.0995122 |          0.0000000 |
|   63 | backbone.stages.1.block.2.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   64 | backbone.stages.1.block.2.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   65 | backbone.stages.1.block.2.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.0888923 |     0.8073820 |         0.8073820 |       -0.6901461 |         0.0000000 |        0.8073820 |         0.0000000 |        -0.0082257 |          0.0000000 |
|   66 | backbone.stages.1.block.2.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.0888923 |     0.8073820 |         0.8073820 |       -0.6901461 |         0.0000000 |        0.8073820 |         0.0000000 |        -0.0082257 |          0.0000000 |
|   67 | backbone.stage_norm.1                          | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.qat.batchnorm.BatchNorm2d                     | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.0389042 |    0.1848188 |     3.9518437 |         3.9518437 |       -3.9518437 |        -1.0000000 |        4.3028646 |         1.0000000 |        -0.0013304 |          0.0078125 |
|   68 | backbone.downsample_block.1.proj.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.2919521 |     2.4785969 |         2.4785969 |       -2.4785969 |         0.0000000 |        2.0095170 |         0.0000000 |        -0.0159891 |          0.0000000 |
|   69 | backbone.downsample_block.1.proj.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.0464505 |     0.8470597 |         0.8470597 |       -0.8470597 |         0.0000000 |        0.6384847 |         0.0000000 |         0.0013419 |          0.0000000 |
|   70 | backbone.stages.2.block.0.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 | -0.2679625 |    0.2309953 |     4.1378498 |         4.1378498 |       -0.2466943 |        -4.0000000 |        0.2272134 |         2.0000000 |         0.0008373 |         -0.0208333 |
|   71 | backbone.stages.2.block.0.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0031275 |    0.4578625 |     5.3186378 |         5.3186378 |       -3.8617766 |        -4.0000000 |        3.3068337 |         2.0000000 |         0.0154946 |         -0.0208333 |
|   72 | backbone.stages.2.block.0.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0988988 |    0.9439476 |     5.4520950 |         5.4520950 |       -5.4520950 |        -3.0000000 |        3.1001818 |         0.0000000 |        -0.9099663 |         -0.0182292 |
|   73 | backbone.stages.2.block.0.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1284204 |     3.0971839 |         3.0971839 |       -0.1699712 |         0.0000000 |        3.0971839 |         0.0000000 |        -0.1016065 |          0.0000000 |
|   74 | backbone.stages.2.block.0.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   75 | backbone.stages.2.block.0.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   76 | backbone.stages.2.block.0.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.0538700 |     0.8790246 |         0.8790246 |       -0.8790246 |         0.0000000 |        0.6665116 |         0.0000000 |        -0.0003883 |          0.0000000 |
|   77 | backbone.stages.2.block.0.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.0538700 |     0.8790246 |         0.8790246 |       -0.8790246 |         0.0000000 |        0.6665116 |         0.0000000 |        -0.0003883 |          0.0000000 |
|   78 | backbone.stages.2.block.1.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 | -0.3239036 |    0.4005856 |     3.0940173 |         3.0940173 |       -0.1997702 |        -3.0000000 |        0.2997577 |         3.0000000 |         0.0010055 |          0.0208333 |
|   79 | backbone.stages.2.block.1.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0289121 |    0.5910801 |     4.2906737 |         4.2906737 |       -3.1562603 |        -3.0000000 |        3.2807672 |         3.0000000 |         0.0030762 |          0.0208333 |
|   80 | backbone.stages.2.block.1.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.1003830 |    0.9229759 |     5.3459892 |         5.3459892 |       -5.3459892 |        -1.0000000 |        3.4909663 |         0.0000000 |        -0.8814355 |         -0.0104167 |
|   81 | backbone.stages.2.block.1.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1298329 |     3.4901264 |         3.4901264 |       -0.1699712 |         0.0000000 |        3.4901264 |         0.0000000 |        -0.0967530 |          0.0000000 |
|   82 | backbone.stages.2.block.1.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   83 | backbone.stages.2.block.1.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   84 | backbone.stages.2.block.1.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.0623792 |     0.8505574 |         0.8505574 |       -0.8505574 |         0.0000000 |        0.7485591 |         0.0000000 |         0.0030583 |          0.0000000 |
|   85 | backbone.stages.2.block.1.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.0623792 |     0.8505574 |         0.8505574 |       -0.8505574 |         0.0000000 |        0.7485591 |         0.0000000 |         0.0030583 |          0.0000000 |
|   86 | backbone.stages.2.block.2.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 | -0.2489251 |    0.3181024 |     3.0877101 |         3.0877101 |       -0.3054737 |        -3.0000000 |        0.4841736 |         3.0000000 |        -0.0025187 |          0.0416667 |
|   87 | backbone.stages.2.block.2.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.1079082 |    0.5063837 |     4.8260169 |         4.8260169 |       -4.6662579 |        -3.0000000 |        4.8260169 |         3.0000000 |        -0.0015780 |          0.0416667 |
|   88 | backbone.stages.2.block.2.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0461203 |    0.8728197 |     5.6172366 |         5.6172366 |       -5.6172366 |        -1.0000000 |        3.8023784 |         0.0000000 |        -0.8235920 |         -0.0026042 |
|   89 | backbone.stages.2.block.2.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1327637 |     3.8021059 |         3.8021059 |       -0.1699712 |         0.0000000 |        3.8021059 |         0.0000000 |        -0.0979074 |          0.0000000 |
|   90 | backbone.stages.2.block.2.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   91 | backbone.stages.2.block.2.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   92 | backbone.stages.2.block.2.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.0763765 |     0.8662520 |         0.8662520 |       -0.8662520 |         0.0000000 |        0.7928482 |         0.0000000 |         0.0058369 |          0.0000000 |
|   93 | backbone.stages.2.block.2.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.0763765 |     0.8662520 |         0.8662520 |       -0.8662520 |         0.0000000 |        0.7928482 |         0.0000000 |         0.0058369 |          0.0000000 |
|   94 | backbone.stages.2.block.3.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 | -0.2644500 |    0.1839022 |     3.1549990 |         3.1549990 |       -0.2293207 |        -3.0000000 |        0.3264680 |         2.0000000 |         0.0033636 |          0.0052083 |
|   95 | backbone.stages.2.block.3.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.1384782 |    0.3896442 |     6.1030192 |         6.1030192 |       -4.2361202 |        -3.0000000 |        6.1030192 |         2.0000000 |        -0.0067929 |          0.0052083 |
|   96 | backbone.stages.2.block.3.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.7492814 |     6.4780712 |         6.4780712 |       -6.4780712 |         0.0000000 |        5.1077442 |         0.0000000 |        -0.6851630 |          0.0000000 |
|   97 | backbone.stages.2.block.3.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1369347 |     5.1077437 |         5.1077437 |       -0.1699712 |         0.0000000 |        5.1077437 |         0.0000000 |        -0.0919522 |          0.0000000 |
|   98 | backbone.stages.2.block.3.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   99 | backbone.stages.2.block.3.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  100 | backbone.stages.2.block.3.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.0887561 |     1.1023049 |         1.1023049 |       -0.8794131 |         0.0000000 |        1.1023049 |         0.0000000 |         0.0053021 |          0.0000000 |
|  101 | backbone.stages.2.block.3.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.0887561 |     1.1023049 |         1.1023049 |       -0.8794131 |         0.0000000 |        1.1023049 |         0.0000000 |         0.0053021 |          0.0000000 |
|  102 | backbone.stages.2.block.4.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 | -0.3201962 |    0.2465177 |     2.2446632 |         2.2446632 |       -0.3093322 |        -2.0000000 |        0.5285835 |         1.0000000 |        -0.0004385 |         -0.0572917 |
|  103 | backbone.stages.2.block.4.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.1606613 |    0.4326464 |     5.6093783 |         5.6093783 |       -5.6093783 |        -2.0000000 |        4.1001759 |         1.0000000 |        -0.0052008 |         -0.0572917 |
|  104 | backbone.stages.2.block.4.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0535659 |    0.8863723 |     6.6591797 |         6.6591797 |       -7.2245617 |        -2.0000000 |        3.8892758 |         0.0000000 |        -0.8441266 |         -0.0104167 |
|  105 | backbone.stages.2.block.4.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1321976 |     3.8890800 |         3.8890800 |       -0.1699712 |         0.0000000 |        3.8890800 |         0.0000000 |        -0.1041702 |          0.0000000 |
|  106 | backbone.stages.2.block.4.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  107 | backbone.stages.2.block.4.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  108 | backbone.stages.2.block.4.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1032738 |     1.1146780 |         1.1146780 |       -0.8993114 |         0.0000000 |        1.1146780 |         0.0000000 |         0.0031444 |          0.0000000 |
|  109 | backbone.stages.2.block.4.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1032738 |     1.1146780 |         1.1146780 |       -0.8993114 |         0.0000000 |        1.1146780 |         0.0000000 |         0.0031444 |          0.0000000 |
|  110 | backbone.stages.2.block.5.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 | -0.4567892 |    0.3469277 |     2.2174454 |         2.2174454 |       -0.5103102 |        -2.0000000 |        0.4267043 |         2.0000000 |         0.0007170 |          0.0208333 |
|  111 | backbone.stages.2.block.5.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.1403103 |    0.5059045 |     4.9481978 |         4.9481978 |       -3.9481978 |        -2.0000000 |        4.1253014 |         2.0000000 |        -0.0201092 |          0.0208333 |
|  112 | backbone.stages.2.block.5.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0030739 |    0.9618232 |     6.2923970 |         6.2923970 |       -6.2923970 |         0.0000000 |        6.2314844 |         1.0000000 |        -0.9076580 |          0.0026042 |
|  113 | backbone.stages.2.block.5.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0327642 |    0.1346247 |     6.2314844 |         6.2314844 |       -0.1699712 |         0.0000000 |        6.2314844 |         1.0000000 |        -0.0953203 |          0.0026042 |
|  114 | backbone.stages.2.block.5.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  115 | backbone.stages.2.block.5.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  116 | backbone.stages.2.block.5.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1236993 |     1.5218859 |         1.5218859 |       -1.1945784 |         0.0000000 |        1.5218859 |         0.0000000 |        -0.0007839 |          0.0000000 |
|  117 | backbone.stages.2.block.5.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1236993 |     1.5218859 |         1.5218859 |       -1.1945784 |         0.0000000 |        1.5218859 |         0.0000000 |        -0.0007839 |          0.0000000 |
|  118 | backbone.stages.2.block.6.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 | -0.0673884 |    0.1071443 |     3.1687362 |         3.1687362 |       -0.3120886 |        -3.0000000 |        0.2583454 |         1.0000000 |        -0.0004483 |         -0.0312500 |
|  119 | backbone.stages.2.block.6.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.2750997 |    0.2753727 |     5.7911067 |         5.7911067 |       -6.7437749 |        -3.0000000 |        4.3009601 |         1.0000000 |        -0.0045506 |         -0.0312500 |
|  120 | backbone.stages.2.block.6.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.6146308 |     6.3237238 |         6.3237238 |       -6.3237238 |         0.0000000 |        5.4138217 |         0.0000000 |        -0.5702111 |          0.0000000 |
|  121 | backbone.stages.2.block.6.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1371761 |     5.4138217 |         5.4138217 |       -0.1699712 |         0.0000000 |        5.4138217 |         0.0000000 |        -0.1064179 |          0.0000000 |
|  122 | backbone.stages.2.block.6.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  123 | backbone.stages.2.block.6.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  124 | backbone.stages.2.block.6.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1338712 |     1.6247715 |         1.6247715 |       -1.6247715 |         0.0000000 |        1.5985045 |         0.0000000 |         0.0022490 |          0.0000000 |
|  125 | backbone.stages.2.block.6.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1338712 |     1.6247715 |         1.6247715 |       -1.6247715 |         0.0000000 |        1.5985045 |         0.0000000 |         0.0022490 |          0.0000000 |
|  126 | backbone.stages.2.block.7.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 | -0.1274763 |    0.0838078 |     2.1310077 |         2.1310077 |       -0.3949572 |        -1.0000000 |        0.3203029 |         2.0000000 |         0.0029081 |         -0.0052083 |
|  127 | backbone.stages.2.block.7.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.1535958 |    0.3057301 |     4.8618140 |         4.8618140 |       -4.8925786 |        -1.0000000 |        4.8618140 |         2.0000000 |        -0.0056151 |         -0.0052083 |
|  128 | backbone.stages.2.block.7.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.7090072 |     6.6784573 |         6.6784573 |       -6.6784573 |         0.0000000 |        5.8539329 |         0.0000000 |        -0.6466579 |          0.0000000 |
|  129 | backbone.stages.2.block.7.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1406102 |     5.8539329 |         5.8539329 |       -0.1699712 |         0.0000000 |        5.8539329 |         0.0000000 |        -0.0961895 |          0.0000000 |
|  130 | backbone.stages.2.block.7.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  131 | backbone.stages.2.block.7.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  132 | backbone.stages.2.block.7.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1442631 |     2.3575978 |         2.3575978 |       -2.1202052 |         0.0000000 |        2.3575978 |         0.0000000 |        -0.0004156 |          0.0000000 |
|  133 | backbone.stages.2.block.7.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.1442631 |     2.3575978 |         2.3575978 |       -2.1202052 |         0.0000000 |        2.3575978 |         0.0000000 |        -0.0004156 |          0.0000000 |
|  134 | backbone.stage_norm.2                          | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.qat.batchnorm.BatchNorm2d                     | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 | -0.0046415 |    0.8456784 |    13.1336031 |        13.1336031 |      -10.5867052 |        -2.0000000 |       12.1336031 |         2.0000000 |         0.0009658 |          0.0156250 |
|  135 | backbone.downsample_block.2.proj.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.5230171 |     7.3689590 |         7.3689590 |       -7.3689590 |         0.0000000 |        4.2946038 |         0.0000000 |        -0.0042220 |          0.0000000 |
|  136 | backbone.downsample_block.2.proj.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.0276852 |     0.3836133 |         0.3836133 |       -0.3727797 |         0.0000000 |        0.3836133 |         0.0000000 |        -0.0014076 |          0.0000000 |
|  137 | backbone.stages.3.block.0.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 | -0.1449913 |    0.0839058 |     1.1926833 |         1.1926833 |       -0.1926833 |        -1.0000000 |        0.2552128 |         1.0000000 |         0.0030581 |          0.0078125 |
|  138 | backbone.stages.3.block.0.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0789777 |    0.3348195 |     3.6894433 |         3.6894433 |       -3.5679040 |        -1.0000000 |        3.6894433 |         1.0000000 |        -0.0117093 |          0.0078125 |
|  139 | backbone.stages.3.block.0.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  0.0141184 |    1.2929723 |     9.0210314 |         9.0210314 |       -8.1437206 |        -1.0000000 |        9.0210314 |         1.0000000 |        -1.2416308 |         -0.0017361 |
|  140 | backbone.stages.3.block.0.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 | -0.0095628 |    0.1147539 |     9.0210314 |         9.0210314 |       -0.1699712 |         0.0000000 |        9.0210314 |         1.0000000 |        -0.0758742 |          0.0008681 |
|  141 | backbone.stages.3.block.0.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  142 | backbone.stages.3.block.0.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  143 | backbone.stages.3.block.0.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.0367304 |     0.9630418 |         0.9630418 |       -0.9630418 |         0.0000000 |        0.7459374 |         0.0000000 |        -0.0011335 |          0.0000000 |
|  144 | backbone.stages.3.block.0.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.0367304 |     0.9630418 |         0.9630418 |       -0.9630418 |         0.0000000 |        0.7459374 |         0.0000000 |        -0.0011335 |          0.0000000 |
|  145 | backbone.stages.3.block.1.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 | -0.1612679 |    0.0904467 |     3.3078842 |         3.3078842 |       -0.2451220 |        -3.0000000 |        0.3078841 |         1.0000000 |        -0.0004142 |         -0.0260417 |
|  146 | backbone.stages.3.block.1.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.1424324 |    0.3319964 |     9.8115616 |         9.8115616 |       -5.1351380 |        -3.0000000 |        9.8115616 |         1.0000000 |         0.0041994 |         -0.0260417 |
|  147 | backbone.stages.3.block.1.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  0.0000000 |    1.2577647 |     7.4743228 |         7.4743228 |       -7.4743228 |         0.0000000 |        3.9670341 |         0.0000000 |        -1.2167437 |          0.0000000 |
|  148 | backbone.stages.3.block.1.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  0.0000000 |    0.1140808 |     3.9668899 |         3.9668899 |       -0.1699712 |         0.0000000 |        3.9668899 |         0.0000000 |        -0.0843448 |          0.0000000 |
|  149 | backbone.stages.3.block.1.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  150 | backbone.stages.3.block.1.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  151 | backbone.stages.3.block.1.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.0461328 |     1.3303739 |         1.3303739 |       -1.3303739 |         0.0000000 |        1.2285583 |         0.0000000 |        -0.0020695 |          0.0000000 |
|  152 | backbone.stages.3.block.1.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.0461328 |     1.3303739 |         1.3303739 |       -1.3303739 |         0.0000000 |        1.2285583 |         0.0000000 |        -0.0020695 |          0.0000000 |
|  153 | backbone.stages.3.block.2.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 | -0.0961628 |    0.1028763 |     1.2703696 |         1.2703696 |       -0.2659134 |        -1.0000000 |        0.3678736 |         1.0000000 |         0.0011593 |         -0.0156250 |
|  154 | backbone.stages.3.block.2.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.2008580 |    0.3146616 |     7.3146706 |         7.3146706 |       -4.5102749 |        -1.0000000 |        7.3146706 |         1.0000000 |         0.0038045 |         -0.0156250 |
|  155 | backbone.stages.3.block.2.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 | -0.0349645 |    1.2284395 |     6.0631108 |         6.0631108 |       -6.0631108 |         0.0000000 |        3.4293721 |         1.0000000 |        -1.1928277 |          0.0017361 |
|  156 | backbone.stages.3.block.2.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 | -0.0276536 |    0.1181985 |     3.4283347 |         3.4283347 |       -0.1699712 |         0.0000000 |        3.4283347 |         1.0000000 |        -0.0921254 |          0.0017361 |
|  157 | backbone.stages.3.block.2.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  158 | backbone.stages.3.block.2.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  159 | backbone.stages.3.block.2.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.0698133 |     1.2768915 |         1.2768915 |       -1.2301657 |         0.0000000 |        1.2768915 |         0.0000000 |        -0.0037980 |          0.0000000 |
|  160 | backbone.stages.3.block.2.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.0698133 |     1.2768915 |         1.2768915 |       -1.2301657 |         0.0000000 |        1.2768915 |         0.0000000 |        -0.0037980 |          0.0000000 |
|  161 | backbone.stages.3.block.3.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 | -0.1440612 |    0.1502416 |     2.0858967 |         2.0858967 |       -0.2924007 |        -1.0000000 |        0.3206407 |         2.0000000 |        -0.0010396 |         -0.0156250 |
|  162 | backbone.stages.3.block.3.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.1406818 |    0.3615084 |     7.0220504 |         7.0220504 |       -3.1507998 |        -1.0000000 |        7.0220504 |         2.0000000 |         0.0079710 |         -0.0156250 |
|  163 | backbone.stages.3.block.3.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  0.0000000 |    1.3235672 |     6.9803343 |         6.9803343 |       -6.9803343 |         0.0000000 |        3.7990394 |         0.0000000 |        -1.2955461 |          0.0000000 |
|  164 | backbone.stages.3.block.3.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  0.0000000 |    0.1092260 |     3.7987635 |         3.7987635 |       -0.1699712 |         0.0000000 |        3.7987635 |         0.0000000 |        -0.0891540 |          0.0000000 |
|  165 | backbone.stages.3.block.3.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  166 | backbone.stages.3.block.3.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  167 | backbone.stages.3.block.3.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.0908518 |     1.3722661 |         1.3722661 |       -1.3089640 |         0.0000000 |        1.3722661 |         0.0000000 |        -0.0068171 |          0.0000000 |
|  168 | backbone.stages.3.block.3.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.0908518 |     1.3722661 |         1.3722661 |       -1.3089640 |         0.0000000 |        1.3722661 |         0.0000000 |        -0.0068171 |          0.0000000 |
|  169 | backbone.stages.3.block.4.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 | -0.2643968 |    0.1568216 |     2.5436821 |         2.5436821 |       -0.4191600 |        -2.0000000 |        0.5436821 |         2.0000000 |        -0.0003240 |          0.0130208 |
|  170 | backbone.stages.3.block.4.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.1801704 |    0.3361461 |     8.8300571 |         8.8300571 |       -8.8300571 |        -2.0000000 |        2.8000822 |         2.0000000 |         0.0019033 |          0.0130208 |
|  171 | backbone.stages.3.block.4.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  0.0000000 |    1.1454672 |     6.7975883 |         6.7975883 |       -6.7975883 |         0.0000000 |        4.8655510 |         0.0000000 |        -1.1082340 |          0.0000000 |
|  172 | backbone.stages.3.block.4.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  0.0000000 |    0.1212620 |     4.8655481 |         4.8655481 |       -0.1699712 |         0.0000000 |        4.8655481 |         0.0000000 |        -0.0946301 |          0.0000000 |
|  173 | backbone.stages.3.block.4.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  174 | backbone.stages.3.block.4.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  175 | backbone.stages.3.block.4.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.1316800 |     2.6959741 |         2.6959741 |       -2.4191155 |         0.0000000 |        2.6959741 |         0.0000000 |        -0.0034095 |          0.0000000 |
|  176 | backbone.stages.3.block.4.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.1316800 |     2.6959741 |         2.6959741 |       -2.4191155 |         0.0000000 |        2.6959741 |         0.0000000 |        -0.0034095 |          0.0000000 |
|  177 | backbone.stages.3.block.5.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 | -0.3326188 |    0.2068362 |     2.2762475 |         2.2762475 |       -0.7093123 |        -1.0000000 |        0.7681467 |         2.0000000 |        -0.0023756 |          0.0182292 |
|  178 | backbone.stages.3.block.5.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.1948171 |    0.3746558 |     8.8965921 |         8.8965921 |       -7.8965921 |        -1.0000000 |        4.5267134 |         2.0000000 |         0.0053668 |          0.0182292 |
|  179 | backbone.stages.3.block.5.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  0.0000000 |    1.6335863 |     8.3486271 |         8.3486271 |       -8.2370605 |         0.0000000 |        8.3486271 |         0.0000000 |        -1.6113414 |          0.0000000 |
|  180 | backbone.stages.3.block.5.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  0.0000000 |    0.0901961 |     8.3486271 |         8.3486271 |       -0.1699712 |         0.0000000 |        8.3486271 |         0.0000000 |        -0.0733183 |          0.0000000 |
|  181 | backbone.stages.3.block.5.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  182 | backbone.stages.3.block.5.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  183 | backbone.stages.3.block.5.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.1849531 |     8.2709074 |         8.2709074 |       -8.2709074 |         0.0000000 |        7.7374482 |         0.0000000 |        -0.0040457 |          0.0000000 |
|  184 | backbone.stages.3.block.5.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.1849531 |     8.2709074 |         8.2709074 |       -8.2709074 |         0.0000000 |        7.7374482 |         0.0000000 |        -0.0040457 |          0.0000000 |
|  185 | backbone.stage_norm.3                          | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.qat.batchnorm.BatchNorm2d                     | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 | -0.0000839 |    0.2541206 |     9.6151495 |         9.6151495 |       -9.4426842 |        -1.0000000 |        9.6151495 |         1.0000000 |        -0.0006760 |          0.0052083 |
|  186 | neck.conv_extract.0.0                          | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  187 | neck.conv_extract.1.0                          | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  188 | neck.conv_extract.2.0                          | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  189 | neck.conv_extract.3.0                          | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.3691624 |     7.7702875 |         7.7702875 |       -7.0596309 |         0.0000000 |        7.7702875 |         0.0000000 |        -0.0114310 |          0.0000000 |
|  190 | neck.upscale.2                                 | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer            | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer      | torch.Size([156, 256, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.3126892 |     6.4649458 |         6.4649458 |       -6.0730400 |         0.0000000 |        6.4649458 |         0.0000000 |        -0.0114310 |          0.0000000 |
|  191 | neck.conv_add.0                                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 256, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    0.5459397 |    12.1734800 |        12.1734800 |      -12.1734800 |         0.0000000 |       12.0644331 |         0.0000000 |        -0.0140016 |          0.0000000 |
|  192 | neck.upscale.1                                 | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer            | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer      | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.4844743 |    12.1734800 |        12.1734800 |      -12.1734800 |         0.0000000 |       11.7007113 |         0.0000000 |        -0.0140016 |          0.0000000 |
|  193 | neck.conv_add.1                                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.4885206 |    12.0857582 |        12.0857582 |      -12.0857582 |         0.0000000 |       11.8623285 |         0.0000000 |        -0.0109680 |          0.0000000 |
|  194 | neck.upscale.0                                 | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer            | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer      | torch.Size([156, 256, 64, 176])   | qint8         | 1.0000000 |  0.0000000 |    0.4758066 |    12.0857582 |        12.0857582 |      -12.0857582 |         0.0000000 |       11.6716518 |         0.0000000 |        -0.0109680 |          0.0000000 |
|  195 | neck.conv_add.2                                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 256, 64, 176])   | qint8         | 1.0000000 |  0.0000000 |    0.4817617 |    12.2847271 |        12.2847271 |      -12.2847271 |         0.0000000 |       11.6905842 |         0.0000000 |        -0.0065705 |          0.0000000 |
|  196 | neck.fpn_conv.0.0                              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 64, 176])   | qint8         | 1.0000000 |  0.0000000 |    0.5533577 |     9.3026438 |         9.3026438 |       -8.1144848 |         0.0000000 |        9.3026438 |         0.0000000 |         0.0575284 |          0.0000000 |
|  197 | neck.fpn_conv.1.0                              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  0.0000000 |    0.5956138 |     8.5757637 |         8.5757637 |       -8.4537172 |         0.0000000 |        8.5757637 |         0.0000000 |         0.0393536 |          0.0000000 |
|  198 | neck.fpn_conv.2.0                              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    2.6014445 |    77.9110489 |        77.9110489 |      -77.9110489 |         0.0000000 |       70.3593140 |         0.0000000 |        -0.0351598 |          0.0000000 |
|  199 | neck.fpn_conv.3.0                              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 8, 22])     | qint8         | 1.0000000 |  0.0000000 |    0.2107436 |     2.5660279 |         2.5660279 |       -2.2936664 |         0.0000000 |        2.5660279 |         0.0000000 |         0.0004191 |          0.0000000 |
|  200 | head                                           | torch.Tensor.float                                                            | torch.Tensor.float                                                      | torch.Size([156, 256, 16, 44])    | qint8         | 1.0000000 |  0.0000000 |    2.6014445 |    77.9110489 |        77.9110489 |      -77.9110489 |         0.0000000 |       70.3593140 |         0.0000000 |        -0.0351598 |          0.0000000 |
|  201 | head                                           | torch.Tensor.sub                                                              | torch.Tensor.sub                                                        | torch.Size([26])                  | torch.float64 |           |  1.0000001 |    0.0000000 |     0.0000000 |                   |       -0.9000001 |        -0.9000001 |        9.1000004 |         9.1000004 |         1.2538462 |          1.2538462 |
|  202 | head                                           | torch.Tensor.to                                                               | torch.Tensor.to                                                         | torch.Size([26])                  | torch.float32 |           |  1.0000001 |    0.0000000 |     0.0000000 |                   |       -0.9000001 |        -0.9000001 |        9.1000004 |         9.1000004 |         1.2538462 |          1.2538462 |
|  203 | head                                           | torch.abs                                                                     | torch.abs                                                               | torch.Size([26])                  | torch.float32 |           |  1.0000001 |    0.0000000 |     0.0000000 |                   |        0.9000001 |         0.9000001 |        9.1000004 |         9.1000004 |         2.3615386 |          2.3615386 |
|  204 | head                                           | torch.Tensor.le                                                               | torch.Tensor.le                                                         | torch.Size([26])                  | torch.bool    |           |  1.0000001 |    0.0000000 |     0.0000000 |                   |        0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |         0.6923077 |          0.6923077 |
|  205 | head                                           | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([26, 4, 4])            | torch.float32 |           |  1.0000000 |    0.0000000 |     0.0000000 |                   |       -0.9999130 |        -0.9999130 |  3591784.0000000 |   3591784.0000000 |    271098.0000000 |     271098.0000000 |
|  206 | head                                           | torch.Tensor.to                                                               | torch.Tensor.to                                                         | torch.Size([26])                  | torch.float32 |           |  1.0000001 |    0.0000000 |     0.0000000 |                   |       -0.9000001 |        -0.9000001 |        9.1000004 |         9.1000004 |         1.2538462 |          1.2538462 |
|  207 | head                                           | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([3, 384, 26])          | torch.float32 |           |  0.0000000 |   12.4773417 |   376.1104126 |                   |     -376.1104126 |         0.0000000 |      113.7904434 |         0.0000000 |         1.9572173 |          0.0000000 |
|  208 | head                                           | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([26, 384, 3])          | torch.float32 |           |  0.0000000 |   12.4773417 |   376.1104126 |                   |     -376.1104126 |         0.0000000 |      113.7904434 |         0.0000000 |         1.9572173 |          0.0000000 |
|  209 | head                                           | torch.Tensor.add                                                              | torch.Tensor.add                                                        | torch.Size([26, 384, 3])          | torch.float32 |           | -0.1717048 |   37.4660225 |   409.1746216 |                   |     -409.1746216 |       -63.9622688 |      121.1069183 |         0.0000000 |         4.4326091 |        -22.9935493 |
|  210 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 384, 3, 1])       | torch.float32 |           | -0.1717048 |   37.4660225 |   409.1746216 |                   |     -409.1746216 |       -63.9622688 |      121.1069183 |         0.0000000 |         4.4326091 |        -22.9935493 |
|  211 | head                                           | torch.matmul                                                                  | torch.matmul                                                            | torch.Size([26, 384, 3, 1])       | torch.float32 |           | -0.1717041 |   37.4756813 |   409.0593262 |                   |     -409.0593262 |       -64.3646317 |      121.1055069 |         0.0000000 |         4.4347663 |        -23.0045090 |
|  212 | head                                           | torch.Tensor.squeeze                                                          | torch.Tensor.squeeze                                                    | torch.Size([26, 384, 3])          | torch.float32 |           | -0.1717041 |   37.4756813 |   409.0593262 |                   |     -409.0593262 |       -64.3646317 |      121.1055069 |         0.0000000 |         4.4347663 |        -23.0045090 |
|  213 | head                                           | torch.Tensor.add                                                              | torch.Tensor.add                                                        | torch.Size([26, 384, 3])          | torch.float32 |           |  0.5359403 |   37.4756813 |   409.0593262 |                   |     -475.4343262 |      -169.7500000 |      112.4251709 |       -11.9215603 |        -9.1052980 |        -36.5445747 |
|  214 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 1])            | torch.bool    |           |  1.0000001 |    0.0000000 |     0.0000000 |                   |        0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |         0.6923077 |          0.6923077 |
|  215 | head                                           | torch.where                                                                   | torch.where                                                             | torch.Size([26, 384, 256])        | torch.float32 |           |  0.0000000 |    0.5149521 |     4.7778430 |                   |       -4.7778430 |         0.0000000 |        3.5256600 |         0.0000000 |        -0.0037713 |          0.0000000 |
|  216 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 1])            | torch.bool    |           |  1.0000001 |    0.0000000 |     0.0000000 |                   |        0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |         0.6923077 |          0.6923077 |
|  217 | head                                           | torch.where                                                                   | torch.where                                                             | torch.Size([26, 384, 11])         | torch.float32 |           | -0.4448636 |   11.4929705 |   140.5460510 |                   |      -88.4849091 |       -51.2515602 |      112.4251709 |         0.0000000 |         1.5151513 |         -4.9116316 |
|  218 | head.instance_bank.anchor_quant_stub           | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 384, 11])         | qint16        | 0.0033570 |  1.0000026 |    0.0008805 |     0.0098839 |         2.9442733 |      -58.0691261 |       -58.0691223 |       98.1392365 |        98.1347351 |         2.8034270 |          2.8033648 |
|  219 | head.instance_bank.instance_feature_quant_stub | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 384, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0000000 |         0.0000000 |        0.0000000 |         0.0000000 |         0.0000000 |          0.0000000 |
|  220 | head.instance_bank.anchor_quant_stub           | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 384, 11])         | qint16        | 0.0033570 | -0.4448638 |   11.4929886 |   140.5462189 |     41866.8020724 |      -88.4849091 |       -51.2510872 |      112.4251709 |         0.0000000 |         1.5151513 |         -4.9116597 |
|  221 | head.instance_bank.instance_feature_quant_stub | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 384, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.5149521 |     4.7778430 |         4.7778430 |       -4.7778430 |         0.0000000 |        3.5256600 |         0.0000000 |        -0.0037713 |          0.0000000 |
|  222 | head                                           | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 384, 11])         | qint16        | 0.0033570 | -0.4522487 |   11.2181511 |   111.2510834 |     33140.1806846 |      -60.0000000 |       -51.2510872 |       60.0000000 |         0.0000000 |         1.2460523 |         -4.9116597 |
|  223 | head                                           | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 384, 11])         | qint16        | 0.0033570 |  1.0000050 |    0.0008700 |     0.0098839 |         2.9442733 |      -58.0691261 |       -58.0691223 |       60.0000000 |        59.9993896 |         2.4505773 |          2.4505117 |
|  224 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 128, 11])         | qint16        | 0.0033570 | -0.4682774 |   11.3962688 |   111.2510834 |     33140.1806846 |      -60.0000000 |       -51.2510872 |       60.0000000 |         0.0000000 |         1.2995486 |         -4.8994370 |
|  225 | head.instance_bank.anchor_cat                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 11])         | qint16        | 0.0033570 |  0.6072873 |    2.8497200 |   111.2510834 |     33140.1806846 |      -60.0000000 |       -58.0691223 |       60.0000000 |        59.9993896 |         2.1628201 |          0.6130243 |
|  226 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 128, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.5149797 |     4.7778430 |         4.7778430 |       -4.7778430 |         0.0000000 |        3.4920952 |         0.0000000 |        -0.0037114 |          0.0000000 |
|  227 | head.instance_bank.feature_cat                 | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.1287449 |     4.7778430 |         4.7778430 |       -4.7778430 |         0.0000000 |        3.4920952 |         0.0000000 |        -0.0009278 |          0.0000000 |
|  228 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 256, 11])         | qint16        | 0.0033570 | -0.4442356 |   11.1290903 |   110.9456024 |     33049.1820731 |      -60.0000000 |       -50.9456024 |       60.0000000 |         0.0000000 |         1.2193040 |         -4.9177713 |
|  229 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 256, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.5149382 |     4.7676134 |         4.7676134 |       -4.7676134 |         0.0000000 |        3.5256600 |         0.0000000 |        -0.0038013 |          0.0000000 |
|  230 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 0.0033570 |  0.6376518 |    8.0702028 |   111.2510834 |     33140.1806846 |      -60.0000000 |       -58.0691223 |       60.0000000 |        59.9993896 |         8.7323294 |          1.2088040 |
|  231 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 | -0.1183562 |   12.9603186 |   162.7348633 |       162.7348633 |      -77.1842422 |         0.0000000 |       61.5234756 |        95.0000000 |        -0.7146500 |          3.4120378 |
|  232 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.2259746 |    7.0530276 |    95.0000000 |        95.0000000 |        0.0000000 |         0.0000000 |       61.5234756 |        95.0000000 |         5.1926417 |          3.4120378 |
|  233 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.2275877 |    2.4392626 |   127.7393265 |       127.7393265 |       -0.9346360 |        -1.0000000 |        3.9732180 |       127.0000000 |         0.0116597 |          1.6693844 |
|  234 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.8956881 |     6.2913299 |         6.2913299 |       -5.5234346 |         0.0000000 |        6.2913299 |         0.0000000 |        -0.2353930 |          0.0000000 |
|  235 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.3301476 |     6.2913299 |         6.2913299 |        0.0000000 |         0.0000000 |        6.2913299 |         0.0000000 |         0.3301475 |          0.0000000 |
|  236 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.6450030 |     6.3199091 |         6.3199091 |       -0.9847037 |         0.0000000 |        6.3199091 |         0.0000000 |         0.0796397 |          0.0000000 |
|  237 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.1708677 |     6.2585297 |         6.2585297 |       -6.2585297 |         0.0000000 |        5.7900782 |         0.0000000 |        -0.0593825 |          0.0000000 |
|  238 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.5557426 |     5.7900782 |         5.7900782 |        0.0000000 |         0.0000000 |        5.7900782 |         0.0000000 |         0.5557426 |          0.0000000 |
|  239 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.7940354 |     5.6570745 |         5.6570745 |       -0.8416602 |         0.0000000 |        5.6570745 |         0.0000000 |         0.0267929 |          0.0000000 |
|  240 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.3268834 |     8.3544235 |         8.3544235 |       -6.7751975 |         0.0000000 |        8.3544235 |         0.0000000 |        -0.2865011 |          0.0000000 |
|  241 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.5201912 |     8.3544235 |         8.3544235 |        0.0000000 |         0.0000000 |        8.3544235 |         0.0000000 |         0.5201911 |          0.0000000 |
|  242 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.6667947 |     7.4341941 |         7.4341941 |       -0.8761432 |         0.0000000 |        7.4341941 |         0.0000000 |         0.0308034 |          0.0000000 |
|  243 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 0.0033570 |  0.9239501 |    0.1576111 |     1.9853590 |       591.4113593 |        0.0000000 |         0.0000000 |        2.0789909 |         2.0779736 |         0.9559245 |          0.7988968 |
|  244 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0978448 |    0.6516879 |     2.4111583 |         2.4111583 |       -2.4111583 |         0.0000000 |        1.2628127 |         2.0000000 |        -0.3624174 |          0.0739746 |
|  245 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3034247 |    0.1629067 |     1.4614688 |         1.4614688 |        0.0000000 |         0.0000000 |        1.2628127 |         2.0000000 |         0.1263638 |          0.0739746 |
|  246 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.1992169 |    8.6190653 |   127.5196686 |       127.5196686 |       -0.7236204 |         0.0000000 |        3.9040172 |       127.0000000 |         0.0220680 |          8.0305176 |
|  247 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4331455 |     2.5721762 |         2.5721762 |       -2.5721762 |         0.0000000 |        1.8647105 |         0.0000000 |        -0.0227550 |          0.0000000 |
|  248 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2051953 |     1.8647105 |         1.8647105 |        0.0000000 |         0.0000000 |        1.8647105 |         0.0000000 |         0.2051953 |          0.0000000 |
|  249 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7951117 |     3.4910169 |         3.4910169 |       -0.9243014 |         0.0000000 |        3.4910169 |         0.0000000 |         0.0102912 |          0.0000000 |
|  250 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7095833 |     2.0579250 |         2.0579250 |       -1.9531238 |         0.0000000 |        2.0579250 |         0.0000000 |        -0.0882557 |          0.0000000 |
|  251 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.3106638 |     2.0579250 |         2.0579250 |        0.0000000 |         0.0000000 |        2.0579250 |         0.0000000 |         0.3106638 |          0.0000000 |
|  252 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7990533 |     3.4191995 |         3.4191995 |       -0.8183190 |         0.0000000 |        3.4191995 |         0.0000000 |         0.0133170 |          0.0000000 |
|  253 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7617829 |     2.9671061 |         2.9671061 |       -2.5885885 |         0.0000000 |        2.9671061 |         0.0000000 |         0.1221219 |          0.0000000 |
|  254 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4419524 |     2.9671061 |         2.9671061 |        0.0000000 |         0.0000000 |        2.9671061 |         0.0000000 |         0.4419524 |          0.0000000 |
|  255 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6868204 |     3.9256110 |         3.9256110 |       -1.1608952 |         0.0000000 |        3.9256110 |         0.0000000 |         0.0276167 |          0.0000000 |
|  256 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 2])          | qint16        | 0.0033570 |  0.8962387 |    0.0882780 |     1.2574540 |       374.5784108 |       -1.2574540 |        -0.1107805 |        1.0775599 |         1.0775921 |         0.2802716 |          0.3676718 |
|  257 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3683985 |    0.4202925 |     1.2668183 |         1.2668183 |       -1.2668183 |         0.0000000 |        1.2641544 |         2.0000000 |        -0.0308702 |          0.0781250 |
|  258 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.5400676 |    0.1849603 |     1.2641544 |         1.2641544 |        0.0000000 |         0.0000000 |        1.2641544 |         2.0000000 |         0.2044621 |          0.0781250 |
|  259 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.4569848 |    8.6532106 |   127.8356094 |       127.8356094 |       -1.1665583 |         0.0000000 |        2.9565046 |       127.0000000 |         0.0080446 |          8.0770264 |
|  260 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.9736328 |     3.4250467 |         3.4250467 |       -3.4250467 |         0.0000000 |        2.2103169 |         0.0000000 |        -0.1784895 |          0.0000000 |
|  261 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.3975716 |     2.2103169 |         2.2103169 |        0.0000000 |         0.0000000 |        2.2103169 |         0.0000000 |         0.3975716 |          0.0000000 |
|  262 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.8084189 |     3.7976856 |         3.7976856 |       -0.9829913 |         0.0000000 |        3.7976856 |         0.0000000 |        -0.0018637 |          0.0000000 |
|  263 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6451470 |     3.4888008 |         3.4888008 |       -3.4888008 |         0.0000000 |        2.0787084 |         0.0000000 |        -0.0854857 |          0.0000000 |
|  264 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2798306 |     2.0787084 |         2.0787084 |        0.0000000 |         0.0000000 |        2.0787084 |         0.0000000 |         0.2798307 |          0.0000000 |
|  265 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7933429 |     3.4681168 |         3.4681168 |       -0.8800967 |         0.0000000 |        3.4681168 |         0.0000000 |         0.0053867 |          0.0000000 |
|  266 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6433896 |     4.9416780 |         4.9416780 |       -4.9416780 |         0.0000000 |        2.6972690 |         0.0000000 |        -0.1562961 |          0.0000000 |
|  267 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2435467 |     2.6972690 |         2.6972690 |        0.0000000 |         0.0000000 |        2.6972690 |         0.0000000 |         0.2435467 |          0.0000000 |
|  268 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6243219 |     3.8050859 |         3.8050859 |       -0.8781686 |         0.0000000 |        3.8050859 |         0.0000000 |         0.0468953 |          0.0000000 |
|  269 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 0.0033570 |  0.0079192 |    2.1623070 |    46.9025307 |     13971.6243099 |      -46.9025307 |        -0.2920577 |       12.7601290 |         0.3524834 |        -1.9447609 |         -0.0050595 |
|  270 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0818831 |    1.7510883 |    31.0304756 |        31.0304756 |      -31.0304756 |         0.0000000 |       28.1223812 |         1.0000000 |        -0.0832179 |          0.2123108 |
|  271 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.2044143 |    0.8013318 |    27.1223812 |        27.1223812 |        0.0000000 |         0.0000000 |       28.1223812 |         1.0000000 |         0.8665385 |          0.2123108 |
|  272 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.6720473 |   27.1748371 |   127.8293304 |       127.8293304 |       -0.9795623 |         0.0000000 |        3.3557878 |       127.0000000 |         0.0330911 |         26.9634724 |
|  273 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    1.2211267 |     3.5685937 |         3.5685937 |       -3.1788363 |         0.0000000 |        3.5685937 |         0.0000000 |        -0.0966753 |          0.0000000 |
|  274 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.5622257 |     3.5685937 |         3.5685937 |        0.0000000 |         0.0000000 |        3.5685937 |         0.0000000 |         0.5622257 |          0.0000000 |
|  275 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.8108401 |     4.2005625 |         4.2005625 |       -0.8997287 |         0.0000000 |        4.2005625 |         0.0000000 |         0.0379029 |          0.0000000 |
|  276 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    1.1525319 |     4.6151385 |         4.6151385 |       -4.6151385 |         0.0000000 |        4.0496030 |         0.0000000 |        -0.1083963 |          0.0000000 |
|  277 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.5220678 |     4.0496030 |         4.0496030 |        0.0000000 |         0.0000000 |        4.0496030 |         0.0000000 |         0.5220677 |          0.0000000 |
|  278 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7803996 |     5.6684661 |         5.6684661 |       -0.8944852 |         0.0000000 |        5.6684661 |         0.0000000 |         0.0214211 |          0.0000000 |
|  279 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.9432135 |     4.4064236 |         4.4064236 |       -4.4064236 |         0.0000000 |        4.0892005 |         0.0000000 |        -0.2835981 |          0.0000000 |
|  280 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3298078 |     4.0892005 |         4.0892005 |        0.0000000 |         0.0000000 |        4.0892005 |         0.0000000 |         0.3298078 |          0.0000000 |
|  281 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.5836998 |     5.2244482 |         5.2244482 |       -0.8005500 |         0.0000000 |        5.2244482 |         0.0000000 |         0.0175311 |          0.0000000 |
|  282 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6432151 |     7.4341941 |         7.4341941 |       -1.1608952 |         0.0000000 |        7.4341941 |         0.0000000 |         0.0290985 |          0.0000000 |
|  283 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 256, 3])          | qint16        | 0.0033570 | -0.5443952 |   31.4343834 |   110.9456024 |     33049.1820731 |      -60.0000000 |       -50.9456024 |       60.0000000 |         0.0000000 |        11.6494894 |        -18.0318298 |
|  284 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 128])        | qint8         | 1.0000000 | -0.4943782 |   18.9544868 |   160.5740814 |       160.5740814 |      -72.6938705 |         0.0000000 |       57.8333549 |        98.0000000 |        -1.8859849 |          9.0049582 |
|  285 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.0508963 |   12.6565971 |    98.0000000 |        98.0000000 |        0.0000000 |         0.0000000 |       57.8333549 |        98.0000000 |         4.4119058 |          9.0049582 |
|  286 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.4441785 |    7.0088205 |   125.6329498 |       125.6329498 |       -0.8286572 |         0.0000000 |        3.9292498 |       127.0000000 |         0.0160960 |          6.4110579 |
|  287 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.5462489 |     6.2223501 |         6.2223501 |       -4.2945523 |         0.0000000 |        6.2223501 |         0.0000000 |        -0.0688666 |          0.0000000 |
|  288 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.7386912 |     6.2223501 |         6.2223501 |        0.0000000 |         0.0000000 |        6.2223501 |         0.0000000 |         0.7386912 |          0.0000000 |
|  289 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.7169854 |     5.8657508 |         5.8657508 |       -0.9423806 |         0.0000000 |        5.8657508 |         0.0000000 |         0.0737280 |          0.0000000 |
|  290 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.5559396 |     5.7557969 |         5.7557969 |       -5.7557969 |         0.0000000 |        5.2824621 |         0.0000000 |         0.1178781 |          0.0000000 |
|  291 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.8369089 |     5.2824621 |         5.2824621 |        0.0000000 |         0.0000000 |        5.2824621 |         0.0000000 |         0.8369089 |          0.0000000 |
|  292 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.8081744 |     3.9166238 |         3.9166238 |       -0.8413311 |         0.0000000 |        3.9166238 |         0.0000000 |         0.0261370 |          0.0000000 |
|  293 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.3960781 |     6.7751975 |         6.7751975 |       -6.7751975 |         0.0000000 |        4.8005762 |         0.0000000 |         0.0286759 |          0.0000000 |
|  294 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.7123770 |     4.8005762 |         4.8005762 |        0.0000000 |         0.0000000 |        4.8005762 |         0.0000000 |         0.7123770 |          0.0000000 |
|  295 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.7605880 |     4.7639360 |         4.7639360 |       -0.8779437 |         0.0000000 |        4.7639360 |         0.0000000 |         0.0510163 |          0.0000000 |
|  296 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 256, 3])          | qint16        | 0.0033570 |  0.0000000 |    0.6182560 |     1.7912309 |       533.5832644 |        0.0000000 |         0.0000000 |        1.7912309 |         0.0000000 |         0.6182561 |          0.0000000 |
|  297 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.1194327 |    0.5345360 |     2.1295295 |         2.1295295 |       -2.1295295 |         0.0000000 |        1.0618346 |         1.0000000 |        -0.2738692 |          0.0625000 |
|  298 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.3267664 |    0.1416978 |     1.0618346 |         1.0618346 |        0.0000000 |         0.0000000 |        1.0618346 |         1.0000000 |         0.1189690 |          0.0625000 |
|  299 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.2700402 |    8.5406227 |   127.4852676 |       127.4852676 |       -0.7236204 |         0.0000000 |        3.7097483 |       127.0000000 |         0.0209044 |          7.9375005 |
|  300 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4956681 |     2.5721762 |         2.5721762 |       -2.5721762 |         0.0000000 |        1.8647105 |         0.0000000 |        -0.0297388 |          0.0000000 |
|  301 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2329647 |     1.8647105 |         1.8647105 |        0.0000000 |         0.0000000 |        1.8647105 |         0.0000000 |         0.2329647 |          0.0000000 |
|  302 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7671599 |     3.4910169 |         3.4910169 |       -0.8993499 |         0.0000000 |        3.4910169 |         0.0000000 |         0.0068144 |          0.0000000 |
|  303 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6807614 |     2.0579250 |         2.0579250 |       -1.8541396 |         0.0000000 |        2.0579250 |         0.0000000 |         0.0119288 |          0.0000000 |
|  304 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.3463451 |     2.0579250 |         2.0579250 |        0.0000000 |         0.0000000 |        2.0579250 |         0.0000000 |         0.3463452 |          0.0000000 |
|  305 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7717237 |     3.3763087 |         3.3763087 |       -0.8183190 |         0.0000000 |        3.3763087 |         0.0000000 |         0.0097693 |          0.0000000 |
|  306 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7053855 |     2.9240520 |         2.9240520 |       -2.5831430 |         0.0000000 |        2.9240520 |         0.0000000 |         0.1781958 |          0.0000000 |
|  307 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4417906 |     2.9240520 |         2.9240520 |        0.0000000 |         0.0000000 |        2.9240520 |         0.0000000 |         0.4417906 |          0.0000000 |
|  308 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6707128 |     3.6101770 |         3.6101770 |       -1.0291806 |         0.0000000 |        3.6101770 |         0.0000000 |         0.0235412 |          0.0000000 |
|  309 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 256, 2])          | qint16        | 0.0033570 |  0.0000000 |    0.3468237 |     1.0031675 |       298.8299218 |       -1.0031675 |         0.0000000 |        0.0657841 |         0.0000000 |        -0.3458020 |          0.0000000 |
|  310 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.1182639 |    0.3926792 |     1.1535835 |         1.1535835 |       -1.1535835 |         0.0000000 |        1.1434097 |         1.0000000 |         0.0983358 |          0.0625000 |
|  311 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.1597144 |    0.2625713 |     1.1434097 |         1.1434097 |        0.0000000 |         0.0000000 |        1.1434097 |         1.0000000 |         0.2284436 |          0.0625000 |
|  312 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0585492 |    8.6710835 |   127.8051682 |       127.8051682 |       -1.1665583 |         0.0000000 |        2.8828382 |       127.0000000 |        -0.0005965 |          7.9375005 |
|  313 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7665399 |     3.4250467 |         3.4250467 |       -3.4250467 |         0.0000000 |        1.7952765 |         0.0000000 |        -0.2327319 |          0.0000000 |
|  314 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2669040 |     1.7952765 |         1.7952765 |        0.0000000 |         0.0000000 |        1.7952765 |         0.0000000 |         0.2669040 |          0.0000000 |
|  315 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7395827 |     3.7843029 |         3.7843029 |       -0.8718229 |         0.0000000 |        3.7843029 |         0.0000000 |         0.0033014 |          0.0000000 |
|  316 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7922223 |     3.6122193 |         3.6122193 |       -3.6122193 |         0.0000000 |        1.5138774 |         0.0000000 |        -0.2461304 |          0.0000000 |
|  317 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2730460 |     1.5138774 |         1.5138774 |        0.0000000 |         0.0000000 |        1.5138774 |         0.0000000 |         0.2730460 |          0.0000000 |
|  318 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7621113 |     2.9326568 |         2.9326568 |       -0.8206952 |         0.0000000 |        2.9326568 |         0.0000000 |         0.0095996 |          0.0000000 |
|  319 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.9819832 |     4.9396334 |         4.9396334 |       -4.9396334 |         0.0000000 |        2.4999142 |         0.0000000 |        -0.5192455 |          0.0000000 |
|  320 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2313690 |     2.4999142 |         2.4999142 |        0.0000000 |         0.0000000 |        2.4999142 |         0.0000000 |         0.2313690 |          0.0000000 |
|  321 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6051936 |     3.6749005 |         3.6749005 |       -0.8781686 |         0.0000000 |        3.6749005 |         0.0000000 |         0.0624293 |          0.0000000 |
|  322 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 256, 3])          | qint16        | 0.0033570 |  0.0000000 |    8.5228167 |    49.5641327 |     14764.4792574 |      -49.5641327 |         0.0000000 |       13.8485432 |         0.0000000 |        -7.5664282 |          0.0000000 |
|  323 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.0638264 |    5.7408938 |    31.3755741 |        31.3755741 |      -31.3755741 |         0.0000000 |       28.4245663 |         1.0000000 |        -0.2935162 |          0.2031250 |
|  324 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.2683408 |    2.6954999 |    27.4245663 |        27.4245663 |        0.0000000 |         0.0000000 |       28.4245663 |         1.0000000 |         2.7518778 |          0.2031250 |
|  325 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.3260643 |   26.2460289 |   127.8264542 |       127.8264542 |       -0.9263648 |         0.0000000 |        3.3083827 |       127.0000000 |         0.0231196 |         25.7968769 |
|  326 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7914613 |     3.2098315 |         3.2098315 |       -3.0925813 |         0.0000000 |        3.2098315 |         0.0000000 |        -0.1659864 |          0.0000000 |
|  327 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3127375 |     3.2098315 |         3.2098315 |        0.0000000 |         0.0000000 |        3.2098315 |         0.0000000 |         0.3127375 |          0.0000000 |
|  328 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7611333 |     4.2276940 |         4.2276940 |       -0.9200273 |         0.0000000 |        4.2276940 |         0.0000000 |         0.0361244 |          0.0000000 |
|  329 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.9681689 |     4.6255884 |         4.6255884 |       -4.6255884 |         0.0000000 |        4.0476789 |         0.0000000 |        -0.1308718 |          0.0000000 |
|  330 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.4186486 |     4.0476789 |         4.0476789 |        0.0000000 |         0.0000000 |        4.0476789 |         0.0000000 |         0.4186486 |          0.0000000 |
|  331 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7582261 |     5.4647732 |         5.4647732 |       -0.8958653 |         0.0000000 |        5.4647732 |         0.0000000 |         0.0203687 |          0.0000000 |
|  332 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.9148182 |     4.1192098 |         4.1192098 |       -4.1192098 |         0.0000000 |        4.0669861 |         0.0000000 |        -0.1962998 |          0.0000000 |
|  333 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3592592 |     4.0669861 |         4.0669861 |        0.0000000 |         0.0000000 |        4.0669861 |         0.0000000 |         0.3592592 |          0.0000000 |
|  334 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.6290504 |     5.2068014 |         5.2068014 |       -0.7984312 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0283586 |          0.0000000 |
|  335 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6970450 |     5.2068014 |         5.2068014 |       -1.0291806 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0433441 |          0.0000000 |
|  336 | head                                           | torch.Tensor.unbind                                                           | torch.Tensor.unbind                                                     | torch.Size([3, 256, 704])         | torch.float32 |           |  1.0006758 |    0.0000000 |     0.0000000 |                   |       -0.5703125 |        -0.5703125 |        0.6093750 |         0.6093750 |        -0.0800466 |         -0.0800466 |
|  337 | head                                           | torch.Tensor.double                                                           | torch.Tensor.double                                                     | torch.Size([156, 4, 4])           | torch.float64 |           |  1.0000005 |    0.0000000 |     0.0000000 |                   |    -2119.9514160 |     -2119.9514160 |     2768.2631836 |      2768.2631836 |        22.1685505 |         22.1685505 |
|  338 | head                                           | torch.matmul                                                                  | torch.matmul                                                            | torch.Size([156, 4, 4])           | torch.float64 |           |  0.9999996 |    0.0000000 |     0.0000000 |                   |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
|  339 | head                                           | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 4, 4])         | torch.float64 |           |  0.9999996 |    0.0000000 |     0.0000000 |                   |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
|  340 | head                                           | torch.Tensor.float                                                            | torch.Tensor.float                                                      | torch.Size([26, 6, 4, 4])         | torch.float32 |           |  0.9999996 |    0.0000000 |     0.0000000 |                   |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
|  341 | head.mat_quant_stub                            | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 6, 4, 4])         | qint16        | 1.0000000 |  0.9977411 |    0.1576173 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3267042 |          0.3229167 |
|  342 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 256, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
|  343 | head.layers.0.query_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.3859800 |     7.4341941 |     16239.9969890 |       -4.7778430 |         0.0000000 |        7.4341941 |         0.0000000 |         0.0140853 |          0.0000000 |
|  344 | head.layers.0.key_cat                          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
|  345 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.3859800 |     7.4341941 |     16239.9969890 |       -4.7778430 |         0.0000000 |        7.4341941 |         0.0000000 |         0.0140853 |          0.0000000 |
|  346 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
|  347 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
|  348 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.3859800 |     7.4341941 |     16239.9969890 |       -4.7778430 |         0.0000000 |        7.4341941 |         0.0000000 |         0.0140853 |          0.0000000 |
|  349 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
|  350 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
|  351 | head.layers.0.attn.q_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.2806580 |    11.6534891 |        11.6534891 |      -10.3831472 |         0.0000000 |       11.6534891 |         0.0000000 |         0.0106833 |          0.0000000 |
|  352 | head.layers.0.attn.k_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.5745622 |    10.3745241 |        10.3745241 |       -7.6022348 |         0.0000000 |       10.3745241 |         0.0000000 |         0.0594729 |          0.0000000 |
|  353 | head.layers.0.attn.v_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1017408 |     1.4133743 |         1.4133743 |       -1.2426922 |         0.0000000 |        1.4133743 |         0.0000000 |         0.0013951 |          0.0000000 |
|  354 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.2806580 |    11.6534891 |        11.6534891 |      -10.3831472 |         0.0000000 |       11.6534891 |         0.0000000 |         0.0106833 |          0.0000000 |
|  355 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.2806580 |    11.6534891 |        11.6534891 |      -10.3831472 |         0.0000000 |       11.6534891 |         0.0000000 |         0.0106833 |          0.0000000 |
|  356 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  357 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  358 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1017408 |     1.4133743 |         1.4133743 |       -1.2426922 |         0.0000000 |        1.4133743 |         0.0000000 |         0.0013951 |          0.0000000 |
|  359 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 256, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1017408 |     1.4133743 |         1.4133743 |       -1.2426922 |         0.0000000 |        1.4133743 |         0.0000000 |         0.0013951 |          0.0000000 |
|  360 | head.layers.0.attn                             | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.0000000 |    0.1600823 |     1.4566861 |        11.6534891 |       -1.2978934 |         0.0000000 |        1.4566861 |         0.0000000 |         0.0013354 |          0.0000000 |
|  361 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  362 | head.layers.0.attn.matmul                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    7.8956761 |    71.9449387 |        71.9449387 |      -70.9854584 |         0.0000000 |       71.9449387 |         0.0000000 |         0.8957291 |          0.0000000 |
|  363 | head.layers.0.attn.softmax                     | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999796 |         0.9999796 |        0.0000000 |         0.0000000 |        0.9999796 |         0.0000000 |         0.0039062 |          0.0000000 |
|  364 | head.layers.0.attn.attention_drop              | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999796 |         0.9999796 |        0.0000000 |         0.0000000 |        0.9999796 |         0.0000000 |         0.0039062 |          0.0000000 |
|  365 | head.layers.0.attn.attn_matmul                 | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.0959115 |     1.1532352 |         1.1532352 |       -1.1532352 |         0.0000000 |        1.1118083 |         0.0000000 |         0.0023849 |          0.0000000 |
|  366 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  367 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  368 | head.layers.0.attn.out_proj                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2117039 |     1.4436820 |         1.4436820 |       -1.4281918 |         0.0000000 |        1.4436820 |         0.0000000 |         0.0447921 |          0.0000000 |
|  369 | head.layers.0.attn                             | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999796 |         0.9999796 |        0.0000000 |         0.0000000 |        0.9999796 |         0.0000000 |         0.0039062 |          0.0000000 |
|  370 | head.layers.0.attn.attn_weights_mean           | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.3683937 |         0.3683937 |        0.0000001 |         0.0000000 |        0.3683937 |         0.0000000 |         0.0039062 |          0.0000000 |
|  371 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  372 | head.layers.0.dropout                          | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2117039 |     1.4436820 |         1.4436820 |       -1.4281918 |         0.0000000 |        1.4436820 |         0.0000000 |         0.0447921 |          0.0000000 |
|  373 | head.layers.0.add                              | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.0000000 |    0.4368049 |     7.5278821 |         7.5278821 |       -4.7735815 |         0.0000000 |        7.5278821 |         0.0000000 |         0.0588774 |          0.0000000 |
|  374 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.0000000 |    0.6509308 |    12.1957655 |      7992.4949192 |       -8.8317614 |         0.0000000 |       12.1957655 |         0.0000000 |        -0.0114790 |          0.0000000 |
|  375 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1508913 |     5.7587562 |     37740.0085082 |       -5.7587562 |         0.0000000 |        5.6021671 |         0.0000000 |        -0.0054155 |          0.0000000 |
|  376 | head.layers.1.query_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6470729 |    12.1957655 |     26641.6497307 |       -8.8317614 |         0.0000000 |       12.1957655 |         0.0000000 |         0.0088097 |          0.0000000 |
|  377 | head.layers.1.key_cat                          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0002719 |  0.0000000 |    0.6470729 |    12.1957655 |     44851.7120843 |       -8.8317614 |         0.0000000 |       12.1957655 |         0.0000000 |         0.0088097 |          0.0000000 |
|  378 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6470729 |    12.1957655 |     26641.6497307 |       -8.8317614 |         0.0000000 |       12.1957655 |         0.0000000 |         0.0088097 |          0.0000000 |
|  379 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0002719 |  0.0000000 |    0.6470729 |    12.1957655 |     44851.7120843 |       -8.8317614 |         0.0000000 |       12.1957655 |         0.0000000 |         0.0088097 |          0.0000000 |
|  380 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1508913 |     5.7587562 |     37740.0085082 |       -5.7587562 |         0.0000000 |        5.6021671 |         0.0000000 |        -0.0054155 |          0.0000000 |
|  381 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6470729 |    12.1957655 |     26641.6497307 |       -8.8317614 |         0.0000000 |       12.1957655 |         0.0000000 |         0.0088097 |          0.0000000 |
|  382 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0002719 |  0.0000000 |    0.6470729 |    12.1957655 |     44851.7120843 |       -8.8317614 |         0.0000000 |       12.1957655 |         0.0000000 |         0.0088097 |          0.0000000 |
|  383 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1508913 |     5.7587562 |     37740.0085082 |       -5.7587562 |         0.0000000 |        5.6021671 |         0.0000000 |        -0.0054155 |          0.0000000 |
|  384 | head.layers.1.attn.q_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.2343470 |     9.9739780 |         9.9739780 |       -9.9739780 |         0.0000000 |        9.1662474 |         0.0000000 |        -0.0038037 |          0.0000000 |
|  385 | head.layers.1.attn.k_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.3051546 |    17.4281483 |        17.4281483 |      -10.8004189 |         0.0000000 |       17.4281483 |         0.0000000 |         0.0952018 |          0.0000000 |
|  386 | head.layers.1.attn.v_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1709637 |     3.5321841 |         3.5321841 |       -2.7347982 |         0.0000000 |        3.5321841 |         0.0000000 |         0.0004952 |          0.0000000 |
|  387 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.2343470 |     9.9739780 |         9.9739780 |       -9.9739780 |         0.0000000 |        9.1662474 |         0.0000000 |        -0.0038037 |          0.0000000 |
|  388 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.2343470 |     9.9739780 |         9.9739780 |       -9.9739780 |         0.0000000 |        9.1662474 |         0.0000000 |        -0.0038037 |          0.0000000 |
|  389 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.3051546 |    17.4281483 |        17.4281483 |      -10.8004189 |         0.0000000 |       17.4281483 |         0.0000000 |         0.0952018 |          0.0000000 |
|  390 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.3051546 |    17.4281483 |        17.4281483 |      -10.8004189 |         0.0000000 |       17.4281483 |         0.0000000 |         0.0952018 |          0.0000000 |
|  391 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1709637 |     3.5321841 |         3.5321841 |       -2.7347982 |         0.0000000 |        3.5321841 |         0.0000000 |         0.0004952 |          0.0000000 |
|  392 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1709637 |     3.5321841 |         3.5321841 |       -2.7347982 |         0.0000000 |        3.5321841 |         0.0000000 |         0.0004952 |          0.0000000 |
|  393 | head.layers.1.attn                             | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.0000000 |    0.1542934 |     1.2467473 |         9.9739780 |       -1.2467473 |         0.0000000 |        1.1457809 |         0.0000000 |        -0.0004755 |          0.0000000 |
|  394 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  395 | head.layers.1.attn.matmul                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    6.1759152 |    42.7535667 |        42.7535667 |      -39.1752815 |         0.0000000 |       42.7535667 |         0.0000000 |         0.1270905 |          0.0000000 |
|  396 | head.layers.1.attn.softmax                     | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9997606 |         0.9997606 |        0.0000000 |         0.0000000 |        0.9997606 |         0.0000000 |         0.0019531 |          0.0000000 |
|  397 | head.layers.1.attn.attention_drop              | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9997606 |         0.9997606 |        0.0000000 |         0.0000000 |        0.9997606 |         0.0000000 |         0.0019531 |          0.0000000 |
|  398 | head.layers.1.attn.attn_matmul                 | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1638033 |     3.0342016 |         3.0342016 |       -2.2279072 |         0.0000000 |        3.0342016 |         0.0000000 |         0.0071389 |          0.0000000 |
|  399 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  400 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  401 | head.layers.1.attn.out_proj                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2915589 |     2.3400450 |         2.3400450 |       -2.3400450 |         0.0000000 |        2.1420734 |         0.0000000 |         0.0163831 |          0.0000000 |
|  402 | head.layers.1.attn                             | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 512])     | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9997606 |         0.9997606 |        0.0000000 |         0.0000000 |        0.9997606 |         0.0000000 |         0.0019531 |          0.0000000 |
|  403 | head.layers.1.attn.attn_weights_mean           | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.1428397 |         0.1428397 |        0.0000001 |         0.0000000 |        0.1428397 |         0.0000000 |         0.0019531 |          0.0000000 |
|  404 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  405 | head.layers.1.dropout                          | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2915589 |     2.3400450 |         2.3400450 |       -2.3400450 |         0.0000000 |        2.1420734 |         0.0000000 |         0.0163831 |          0.0000000 |
|  406 | head.layers.1.add                              | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.0000000 |    0.7330003 |    12.6169729 |        12.6169729 |       -8.8112335 |         0.0000000 |       12.6169729 |         0.0000000 |         0.0251928 |          0.0000000 |
|  407 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.0000000 |    1.5063708 |    27.8301163 |     18238.4667031 |      -27.8301163 |         0.0000000 |       25.7133026 |         0.0000000 |        -0.0422795 |          0.0000000 |
|  408 | head.layers.2                                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4914112 |     6.8519669 |         6.8519669 |       -6.8519669 |         0.0000000 |        5.9203916 |         0.0000000 |         0.0024549 |          0.0000000 |
|  409 | head.layers.3.kps_generator.offset             | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 24])         | qint16        | 1.0000000 |  0.0000000 |    1.2675953 |     6.0831432 |         6.0831432 |       -6.0831432 |         0.0000000 |        4.6373544 |         0.0000000 |        -0.5083886 |          0.0000000 |
|  410 | head.layers.3.kps_generator                    | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.0000000 |    1.2675953 |     6.0831432 |         6.0831432 |       -6.0831432 |         0.0000000 |        4.6373544 |         0.0000000 |        -0.5083886 |          0.0000000 |
|  411 | head.layers.3.kps_generator                    | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 3])       | qint16        | 0.0033570 |  0.6376518 |    8.0702028 |   111.2510834 |     33140.1806846 |      -60.0000000 |       -58.0691223 |       60.0000000 |        59.9993896 |         8.7323294 |          1.2088040 |
|  412 | head.layers.3.kps_generator.keypoints_add      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.6487051 |    8.9215517 |   111.7640457 |       111.7640457 |      -62.5510216 |       -58.0000000 |       63.7318192 |        60.0000000 |         8.2239408 |          1.2368289 |
|  413 | head.layers.3.weight_add                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.8625444 |     7.7171421 |         7.7171421 |       -6.5961156 |         0.0000000 |        7.7171421 |         0.0000000 |         0.0315533 |          0.0000000 |
|  414 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 3, 4])         | qint16        | 1.0000000 |  0.9977293 |    0.2101564 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3522723 |          0.3472222 |
|  415 | head.layers.3                                  | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 6, 12])           | qint16        | 1.0000000 |  0.9977293 |    0.2101564 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3522723 |          0.3472222 |
|  416 | head.layers.3.camera_encoder.0                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.7740716 |     6.5803881 |         6.5803881 |       -5.1108408 |         0.0000000 |        6.5803881 |         0.0000000 |        -0.1209187 |          0.0000000 |
|  417 | head.layers.3.camera_encoder.1                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.3265764 |     6.5803881 |         6.5803881 |        0.0000000 |         0.0000000 |        6.5803881 |         0.0000000 |         0.3265764 |          0.0000000 |
|  418 | head.layers.3.camera_encoder.2                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.8016938 |     4.0335946 |         4.0335946 |       -0.7927587 |         0.0000000 |        4.0335946 |         0.0000000 |         0.0087450 |          0.0000000 |
|  419 | head.layers.3.camera_encoder.3                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    2.2743580 |    27.9104614 |        27.9104614 |       -9.4951468 |         0.0000000 |       27.9104614 |         0.0000000 |         0.1394914 |          0.0000000 |
|  420 | head.layers.3.camera_encoder.4                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    1.2069247 |    27.9104614 |        27.9104614 |        0.0000000 |         0.0000000 |       27.9104614 |         0.0000000 |         1.2069247 |          0.0000000 |
|  421 | head.layers.3.camera_encoder.5                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.4995683 |     7.4379067 |         7.4379067 |       -0.9630507 |         0.0000000 |        7.4379067 |         0.0000000 |         0.0216157 |          0.0000000 |
|  422 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.8625444 |     7.7171421 |         7.7171421 |       -6.5961156 |         0.0000000 |        7.7171421 |         0.0000000 |         0.0315533 |          0.0000000 |
|  423 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 6, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.4995683 |     7.4379067 |         7.4379067 |       -0.9630507 |         0.0000000 |        7.4379067 |         0.0000000 |         0.0216157 |          0.0000000 |
|  424 | head.layers.3.cam_add                          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 6, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.8934079 |    12.4618416 |        12.4618416 |       -4.2509594 |         0.0000000 |       12.4618416 |         0.0000000 |         0.0531690 |          0.0000000 |
|  425 | head.layers.3.weights_fc                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 6, 64])      | qint8         | 1.0000000 |  0.0000000 |    1.6191875 |     9.4322243 |         9.4322243 |       -9.4322243 |         0.0000000 |        7.6857080 |         0.0000000 |        -0.2863957 |          0.0000000 |
|  426 | head.layers.3                                  | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.0000000 |    1.6191875 |     9.4322243 |         9.4322243 |       -9.4322243 |         0.0000000 |        7.6857080 |         0.0000000 |        -0.2863957 |          0.0000000 |
|  427 | head.layers.3.weight_softmax                   | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.0000000 |    0.0208333 |     0.9224828 |         0.9224828 |        0.0000005 |         0.0000000 |        0.9224828 |         0.0000000 |         0.0208333 |          0.0000000 |
|  428 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  0.8216676 |    8.7658691 |   111.7640457 |       111.7640457 |      -62.5510216 |       -58.0000000 |       63.7318192 |        60.0000000 |        11.7259026 |          5.5649042 |
|  429 | head.layers.3                                  | torch.ones_like                                                               | torch.ones_like                                                         | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
|  430 | head.layers.3.point_quant_stub                 | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
|  431 | head.layers.3.point_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 8, 4])       | qint16        | 1.0000000 |  0.6488566 |    6.6911640 |   111.7640457 |       111.7640457 |      -62.5510216 |       -58.0000000 |       63.7318192 |        60.0000000 |         6.4179559 |          1.1776217 |
|  432 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 1, 1, 4, 4])   | qint16        | 1.0000000 |  0.9977411 |    0.1576173 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3267042 |          0.3229167 |
|  433 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 512, 8, 1, 4]) | qint16        | 1.0000000 |  0.6488566 |    6.6911640 |   111.7640457 |       111.7640457 |      -62.5510216 |       -58.0000000 |       63.7318192 |        60.0000000 |         6.4179559 |          1.1776217 |
|  434 | head.layers.3.point_matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 4, 4]) | qint16        | 1.0000000 |  0.0585227 |    5.1514692 |   796.0960083 |       796.0960083 |     -366.0960083 |      -180.0000000 |      280.7984619 |       440.0000000 |        -0.1977498 |          1.5117548 |
|  435 | head.layers.3.point_sum                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 6, 512, 8, 4])    | qint16        | 0.0244144 |  0.0475493 |   15.9793186 |   829.9739990 |     33995.2162742 |     -322.9931030 |      -180.0076294 |      351.5519714 |       518.0010376 |        -0.7909992 |          6.0472698 |
|  436 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.7157552 |   11.1865139 |   123.3548965 |      5052.5394669 |      -73.4068222 |       -63.9902344 |       73.8754425 |        61.9882507 |        -0.9596712 |          1.5886035 |
|  437 | head.layers.3                                  | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.8145121 |    6.0112057 |    56.1454048 |      2299.6806909 |        0.0100000 |         0.0000000 |       73.8754425 |        61.9882507 |        11.2169704 |         12.4062328 |
|  438 | head.layers.3.reciprocal_op                    | horizon_plugin_pytorch.nn.reciprocal.Reciprocal                               | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.7888864 |   46.6549377 |    99.9816895 |     32761.5000992 |        0.0135363 |         0.0152590 |      100.0000000 |         1.2787061 |        47.0878754 |          0.6503971 |
|  439 | head.layers.3                                  | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.7888864 |   46.6549377 |    99.9816895 |     32761.5000992 |        0.0135363 |         0.0152590 |      100.0000000 |         1.2787061 |        47.0878754 |          0.6503971 |
|  440 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0244144 | -0.0646392 |   26.3648834 |   829.9739990 |     33995.2162742 |     -322.9931030 |      -180.0076294 |      351.5519714 |       518.0010376 |        -1.6021627 |         10.7997427 |
|  441 | head.layers.3.point_mul                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.4435586 | 1424.6093750 | 35145.1953125 | 115162018.7670476 |   -32299.3105469 |       -10.0001526 |    35155.1953125 |         9.9998474 |       622.5297852 |          2.1499586 |
|  442 | head.layers.3                                  | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.4547172 |    0.6131608 |     2.1998703 |      7208.4251644 |       -1.1000000 |        -1.0998703 |        1.1000000 |         1.0998703 |         0.0951098 |          0.2038586 |
|  443 | head.layers.3                                  | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([156, 512, 8, 2])      | qint16        | 0.0003052 |  0.4547172 |    0.6131608 |     2.1998703 |      7208.4251644 |       -1.1000000 |        -1.0998703 |        1.1000000 |         1.0998703 |         0.0951098 |          0.2038586 |
|  444 | head.layers.3                                  | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer            | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer      | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.0000000 |    0.5016719 |    53.8489227 |        53.8489227 |      -53.8489227 |         0.0000000 |       49.5139618 |         0.0000000 |         0.0121814 |          0.0000000 |
|  445 | head.layers.3.feat_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.0000000 |    0.5016719 |    53.8489227 |        53.8489227 |      -53.8489227 |         0.0000000 |       49.5139618 |         0.0000000 |         0.0121814 |          0.0000000 |
|  446 | head.layers.3                                  | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 256, 512, 8])  | qint8         | 1.0000000 |  0.0000000 |    0.5016719 |    53.8489227 |        53.8489227 |      -53.8489227 |         0.0000000 |       49.5139618 |         0.0000000 |         0.0121814 |          0.0000000 |
|  447 | head.layers.3                                  | torch.Tensor.permute                                                          | torch.Tensor.permute                                                    | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.0000000 |    0.5016719 |    53.8489227 |        53.8489227 |      -53.8489227 |         0.0000000 |       49.5139618 |         0.0000000 |         0.0121814 |          0.0000000 |
|  448 | head.layers.3                                  | torch.Tensor.contiguous                                                       | torch.Tensor.contiguous                                                 | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.0000000 |    0.5016720 |    53.8489227 |        53.8489227 |      -53.8489227 |         0.0000000 |       49.5139618 |         0.0000000 |         0.0121814 |          0.0000000 |
|  449 | head.layers.3                                  | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.0000000 |    0.5016720 |    53.8489227 |        53.8489227 |      -53.8489227 |         0.0000000 |       49.5139618 |         0.0000000 |         0.0121814 |          0.0000000 |
|  450 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 48, 8, 1])   | qint8         | 1.0000000 |  0.0000000 |    0.0208333 |     0.9224828 |         0.9224828 |        0.0000005 |         0.0000000 |        0.9224828 |         0.0000000 |         0.0208333 |          0.0000000 |
|  451 | head.layers.3                                  | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.0000000 |    0.5016720 |    53.8489227 |        53.8489227 |      -53.8489227 |         0.0000000 |       49.5139618 |         0.0000000 |         0.0121814 |          0.0000000 |
|  452 | head.layers.3.feat_mul                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.0000000 |    0.0097255 |     8.1612844 |         8.1612844 |       -8.1612844 |         0.0000000 |        8.1404810 |         0.0000000 |         0.0000361 |          0.0000000 |
|  453 | head.layers.3                                  | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.0000000 |    0.0097255 |     8.1612844 |         8.1612844 |       -8.1612844 |         0.0000000 |        8.1404810 |         0.0000000 |         0.0000361 |          0.0000000 |
|  454 | head.layers.3.feat_sum                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3620962 |     8.4357128 |         8.4357128 |       -8.4357128 |         0.0000000 |        8.4316349 |         0.0000000 |         0.0017325 |          0.0000000 |
|  455 | head.layers.3.output_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4741955 |     9.7919588 |         9.7919588 |       -9.6286917 |         0.0000000 |        9.7919588 |         0.0000000 |        -0.0023195 |          0.0000000 |
|  456 | head.layers.3.proj_drop                        | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4741955 |     9.7919588 |         9.7919588 |       -9.6286917 |         0.0000000 |        9.7919588 |         0.0000000 |        -0.0023195 |          0.0000000 |
|  457 | head.layers.3.residual_op                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.4828034 |     9.7919588 |         9.7919588 |       -9.6286917 |         0.0000000 |        9.7919588 |         0.0000000 |         0.0000677 |          0.0000000 |
|  458 | head.layers.4.pre_norm                         | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.5543298 |     8.9640045 |         8.9640045 |       -8.9640045 |         0.0000000 |        8.5441151 |         0.0000000 |         0.0038249 |          0.0000000 |
|  459 | head.layers.4.layers.0.0                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0000000 |    2.9239919 |    18.9969559 |        18.9969559 |      -18.9969559 |         0.0000000 |       13.7888508 |         0.0000000 |        -2.5107088 |          0.0000000 |
|  460 | head.layers.4.layers.0.2                       | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0000000 |    0.2066415 |    13.7888508 |        13.7888508 |        0.0000000 |         0.0000000 |       13.7888508 |         0.0000000 |         0.2066415 |          0.0000000 |
|  461 | head.layers.4.layers.1                         | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.5826797 |    31.9613380 |        31.9613380 |      -31.9613380 |         0.0000000 |       31.1244659 |         0.0000000 |        -0.0097152 |          0.0000000 |
|  462 | head.layers.4.layers.2                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.5826797 |    31.9613380 |        31.9613380 |      -31.9613380 |         0.0000000 |       31.1244659 |         0.0000000 |        -0.0097152 |          0.0000000 |
|  463 | head.layers.4.identity_fc                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  464 | head.layers.4.short_add                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.linear.LinearAdd                          | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    3.1663830 |    38.6006165 |        38.6006165 |      -38.4510803 |         0.0000000 |       38.6006165 |         0.0000000 |         0.0691997 |          0.0000000 |
|  465 | head.layers.5                                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.7236657 |     4.1468334 |         4.1468334 |       -4.1468334 |         0.0000000 |        3.7271025 |         0.0000000 |        -0.0035535 |          0.0000000 |
|  466 | head.layers.6.add1                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.9032249 |     8.0562115 |         8.0562115 |       -4.4454021 |         0.0000000 |        8.0562115 |         0.0000000 |         0.0255449 |          0.0000000 |
|  467 | head.layers.6.layers.0                         | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.5011439 |    13.1194830 |        13.1194830 |       -9.5059967 |         0.0000000 |       13.1194830 |         0.0000000 |        -0.6346456 |          0.0000000 |
|  468 | head.layers.6.layers.1                         | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4332492 |    13.1194830 |        13.1194830 |        0.0000000 |         0.0000000 |       13.1194830 |         0.0000000 |         0.4332492 |          0.0000000 |
|  469 | head.layers.6.layers.2                         | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.3399650 |    15.6678905 |        15.6678905 |      -15.6678905 |         0.0000000 |        9.3536186 |         0.0000000 |        -0.6303101 |          0.0000000 |
|  470 | head.layers.6.layers.3                         | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3548274 |     9.3536186 |         9.3536186 |        0.0000000 |         0.0000000 |        9.3536186 |         0.0000000 |         0.3548274 |          0.0000000 |
|  471 | head.layers.6.layers.4                         | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6864012 |     7.3110342 |         7.3110342 |       -0.7876508 |         0.0000000 |        7.3110342 |         0.0000000 |         0.0417728 |          0.0000000 |
|  472 | head.layers.6.layers.5                         | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.3448861 |    11.0122004 |        11.0122004 |      -11.0122004 |         0.0000000 |        8.7753201 |         0.0000000 |        -0.5691777 |          0.0000000 |
|  473 | head.layers.6.layers.6                         | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3878541 |     8.7753201 |         8.7753201 |        0.0000000 |         0.0000000 |        8.7753201 |         0.0000000 |         0.3878541 |          0.0000000 |
|  474 | head.layers.6.layers.7                         | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.6496054 |    18.9131355 |        18.9131355 |      -10.2166042 |         0.0000000 |       18.9131355 |         0.0000000 |        -0.3308689 |          0.0000000 |
|  475 | head.layers.6.layers.8                         | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6593682 |    18.9131355 |        18.9131355 |        0.0000000 |         0.0000000 |       18.9131355 |         0.0000000 |         0.6593682 |          0.0000000 |
|  476 | head.layers.6.layers.9                         | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6645655 |    11.3135853 |        11.3135853 |       -0.8749418 |         0.0000000 |       11.3135853 |         0.0000000 |         0.0229872 |          0.0000000 |
|  477 | head.layers.6.layers.10                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.0000000 |    2.0317330 |    35.9097328 |        35.9097328 |      -35.9097328 |         0.0000000 |       12.1763744 |         0.0000000 |        -1.1236478 |          0.0000000 |
|  478 | head.layers.6.layers.11.scale_quant_stub       | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([11])                  | qint16        | 1.0000000 |  0.9428862 |    0.2346313 |     0.3893506 |         0.3893506 |        0.1426757 |         0.0000000 |        1.2617201 |         1.0000000 |         0.6749083 |          0.5454546 |
|  479 | head.layers.6.layers.11.mul                    | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.0000000 |    2.1591465 |    45.3080292 |        45.3080292 |      -45.3080292 |         0.0000000 |       15.3631763 |         0.0000000 |        -1.2075341 |          0.0000000 |
|  480 | head.layers.6.add2                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.5427043 |    4.9220486 |   112.3172073 |       112.3172073 |      -59.8177528 |       -58.0000000 |       62.9555511 |        60.0000000 |         0.9552861 |          0.6780485 |
|  481 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.5427043 |    4.9220486 |   112.3172073 |                   |      -59.8177528 |       -58.0000000 |       62.9555511 |        60.0000000 |         0.9552861 |          0.6780485 |
|  482 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.6404526 |    8.6006489 |   112.3172073 |       112.3172073 |      -59.8177528 |       -58.0000000 |       62.9555511 |        60.0000000 |         8.4762030 |          1.2368289 |
|  483 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 | -0.1040450 |   12.8208942 |   161.2517700 |       161.2517700 |      -75.6317444 |         0.0000000 |       60.2171745 |        95.0000000 |        -0.7006063 |          3.4413030 |
|  484 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.2288687 |    6.9881616 |    95.0000000 |        95.0000000 |        0.0000000 |         0.0000000 |       60.2171745 |        95.0000000 |         5.1321268 |          3.4413030 |
|  485 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.1462452 |    2.3930755 |   127.7486420 |       127.7486420 |       -0.9147352 |        -1.0000000 |        3.9519279 |       127.0000000 |         0.0109702 |          1.5994991 |
|  486 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.8756202 |     6.3197436 |         6.3197436 |       -5.5547028 |         0.0000000 |        6.3197436 |         0.0000000 |        -0.2358995 |          0.0000000 |
|  487 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.3198603 |     6.3197436 |         6.3197436 |        0.0000000 |         0.0000000 |        6.3197436 |         0.0000000 |         0.3198604 |          0.0000000 |
|  488 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.6462111 |     6.3536572 |         6.3536572 |       -0.9798773 |         0.0000000 |        6.3536572 |         0.0000000 |         0.0792969 |          0.0000000 |
|  489 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.1519662 |     6.1781960 |         6.1781960 |       -6.1781960 |         0.0000000 |        5.7477298 |         0.0000000 |        -0.0609129 |          0.0000000 |
|  490 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.5455267 |     5.7477298 |         5.7477298 |        0.0000000 |         0.0000000 |        5.7477298 |         0.0000000 |         0.5455267 |          0.0000000 |
|  491 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.7970374 |     5.6968312 |         5.6968312 |       -0.8669773 |         0.0000000 |        5.6968312 |         0.0000000 |         0.0265058 |          0.0000000 |
|  492 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.3151101 |     8.3907242 |         8.3907242 |       -5.9531503 |         0.0000000 |        8.3907242 |         0.0000000 |        -0.2907874 |          0.0000000 |
|  493 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.5121614 |     8.3907242 |         8.3907242 |        0.0000000 |         0.0000000 |        8.3907242 |         0.0000000 |         0.5121614 |          0.0000000 |
|  494 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.6672958 |     7.4383578 |         7.4383578 |       -0.8777751 |         0.0000000 |        7.4383578 |         0.0000000 |         0.0305445 |          0.0000000 |
|  495 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.8651512 |    0.5088918 |     2.0714471 |         2.0714471 |       -0.3782175 |         0.0000000 |        2.5641122 |         2.0000000 |         0.9051737 |          0.9993490 |
|  496 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0936550 |    0.6481394 |     2.8749139 |         2.8749139 |       -2.8749139 |         0.0000000 |        1.5615941 |         1.0000000 |        -0.3507633 |          0.0858765 |
|  497 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.2807618 |    0.1692309 |     1.5615941 |         1.5615941 |        0.0000000 |         0.0000000 |        1.5615941 |         1.0000000 |         0.1281453 |          0.0858765 |
|  498 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.1690635 |   11.4948034 |   127.6216431 |       127.6216431 |       -0.7128552 |         0.0000000 |        4.0014601 |       127.0000000 |         0.0219103 |         10.9063110 |
|  499 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4307660 |     1.8896893 |         1.8896893 |       -1.8896893 |         0.0000000 |        1.7037756 |         0.0000000 |        -0.0162454 |          0.0000000 |
|  500 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2072603 |     1.7037756 |         1.7037756 |        0.0000000 |         0.0000000 |        1.7037756 |         0.0000000 |         0.2072603 |          0.0000000 |
|  501 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7787688 |     3.5817695 |         3.5817695 |       -0.9293023 |         0.0000000 |        3.5817695 |         0.0000000 |         0.0094467 |          0.0000000 |
|  502 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6946103 |     2.3198621 |         2.3198621 |       -2.1402876 |         0.0000000 |        2.3198621 |         0.0000000 |        -0.0638946 |          0.0000000 |
|  503 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.3153578 |     2.3198621 |         2.3198621 |        0.0000000 |         0.0000000 |        2.3198621 |         0.0000000 |         0.3153578 |          0.0000000 |
|  504 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7780361 |     3.5813935 |         3.5813935 |       -0.8479678 |         0.0000000 |        3.5813935 |         0.0000000 |         0.0121773 |          0.0000000 |
|  505 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7413274 |     3.0191188 |         3.0191188 |       -2.6576526 |         0.0000000 |        3.0191188 |         0.0000000 |         0.1494321 |          0.0000000 |
|  506 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4453798 |     3.0191188 |         3.0191188 |        0.0000000 |         0.0000000 |        3.0191188 |         0.0000000 |         0.4453798 |          0.0000000 |
|  507 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7326021 |     3.9499559 |         3.9499559 |       -1.2935432 |         0.0000000 |        3.9499559 |         0.0000000 |         0.0308888 |          0.0000000 |
|  508 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 2])          | qint16        | 1.0000000 | -0.7444325 |    0.9184290 |     2.4207721 |         2.4207721 |       -1.6905025 |         0.0000000 |        0.2028949 |         1.0000000 |        -0.5234910 |          0.3750000 |
|  509 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0177093 |    0.4831460 |     2.1875129 |         2.1875129 |       -1.4613452 |         0.0000000 |        1.5436790 |         2.0000000 |         0.1357533 |          0.0859375 |
|  510 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0441071 |    0.3464116 |     2.0000000 |         2.0000000 |        0.0000000 |         0.0000000 |        1.5436790 |         2.0000000 |         0.2724878 |          0.0859375 |
|  511 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 | -0.1316783 |    8.7045212 |   127.9786377 |       127.9786377 |       -1.2713966 |         0.0000000 |        3.0787067 |       127.0000000 |        -0.0005696 |          7.9375005 |
|  512 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6432512 |     3.4558949 |         3.4558949 |       -3.4558949 |         0.0000000 |        1.9219804 |         0.0000000 |        -0.1682311 |          0.0000000 |
|  513 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2375100 |     1.9219804 |         1.9219804 |        0.0000000 |         0.0000000 |        1.9219804 |         0.0000000 |         0.2375100 |          0.0000000 |
|  514 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7764836 |     4.1334162 |         4.1334162 |       -0.9142925 |         0.0000000 |        4.1334162 |         0.0000000 |         0.0113746 |          0.0000000 |
|  515 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.8527501 |     3.7721257 |         3.7721257 |       -3.7721257 |         0.0000000 |        2.5250242 |         0.0000000 |        -0.2285971 |          0.0000000 |
|  516 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.3120765 |     2.5250242 |         2.5250242 |        0.0000000 |         0.0000000 |        2.5250242 |         0.0000000 |         0.3120765 |          0.0000000 |
|  517 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.8243697 |     3.8073251 |         3.8073251 |       -0.9129892 |         0.0000000 |        3.8073251 |         0.0000000 |         0.0151282 |          0.0000000 |
|  518 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.9784603 |     5.1872139 |         5.1872139 |       -5.1872139 |         0.0000000 |        2.7184572 |         0.0000000 |        -0.4139445 |          0.0000000 |
|  519 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2822579 |     2.7184572 |         2.7184572 |        0.0000000 |         0.0000000 |        2.7184572 |         0.0000000 |         0.2822579 |          0.0000000 |
|  520 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7091149 |     4.9776297 |         4.9776297 |       -1.0180539 |         0.0000000 |        4.9776297 |         0.0000000 |         0.0956924 |          0.0000000 |
|  521 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.0000000 |    8.3256855 |    45.5036354 |        45.5036354 |      -45.5036354 |         0.0000000 |       15.4541864 |         0.0000000 |        -5.5296664 |          0.0000000 |
|  522 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0536251 |    5.9902072 |    29.3458691 |        29.3458691 |      -29.3458691 |         0.0000000 |       26.6175117 |         1.0000000 |        -0.1328447 |          0.2031250 |
|  523 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.2706777 |    2.9143872 |    25.6175117 |        25.6175117 |        0.0000000 |         0.0000000 |       26.6175117 |         1.0000000 |         2.9429758 |          0.2031250 |
|  524 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.1350131 |   26.3583660 |   127.8707657 |       127.8707657 |       -1.0093508 |         0.0000000 |        3.5560648 |       127.0000000 |         0.0149166 |         25.7968769 |
|  525 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.5645382 |     3.5557196 |         3.5557196 |       -3.2268422 |         0.0000000 |        3.5557196 |         0.0000000 |        -0.1265522 |          0.0000000 |
|  526 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.2189930 |     3.5557196 |         3.5557196 |        0.0000000 |         0.0000000 |        3.5557196 |         0.0000000 |         0.2189931 |          0.0000000 |
|  527 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7533725 |     4.2762866 |         4.2762866 |       -0.9126957 |         0.0000000 |        4.2762866 |         0.0000000 |         0.0331671 |          0.0000000 |
|  528 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.8680617 |     4.6173363 |         4.6173363 |       -4.6173363 |         0.0000000 |        4.1411872 |         0.0000000 |        -0.1057384 |          0.0000000 |
|  529 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3811617 |     4.1411872 |         4.1411872 |        0.0000000 |         0.0000000 |        4.1411872 |         0.0000000 |         0.3811617 |          0.0000000 |
|  530 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7779561 |     5.6286812 |         5.6286812 |       -0.8936968 |         0.0000000 |        5.6286812 |         0.0000000 |         0.0200932 |          0.0000000 |
|  531 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.8537802 |     4.9881468 |         4.9881468 |       -4.9881468 |         0.0000000 |        4.8272099 |         0.0000000 |        -0.1925995 |          0.0000000 |
|  532 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3305904 |     4.8272099 |         4.8272099 |        0.0000000 |         0.0000000 |        4.8272099 |         0.0000000 |         0.3305904 |          0.0000000 |
|  533 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.6106034 |     5.6603370 |         5.6603370 |       -0.7790579 |         0.0000000 |        5.6603370 |         0.0000000 |         0.0285032 |          0.0000000 |
|  534 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6665133 |     7.4383578 |         7.4383578 |       -1.2935432 |         0.0000000 |        7.4383578 |         0.0000000 |         0.0382207 |          0.0000000 |
|  535 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 256, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
|  536 | head.layers.7.query_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6950895 |     7.4383578 |     16249.0926835 |       -4.1468334 |         0.0000000 |        7.4383578 |         0.0000000 |         0.0173336 |          0.0000000 |
|  537 | head.layers.7.key_cat                          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
|  538 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6950895 |     7.4383578 |     16249.0926835 |       -4.1468334 |         0.0000000 |        7.4383578 |         0.0000000 |         0.0173336 |          0.0000000 |
|  539 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
|  540 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
|  541 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6950895 |     7.4383578 |     16249.0926835 |       -4.1468334 |         0.0000000 |        7.4383578 |         0.0000000 |         0.0173336 |          0.0000000 |
|  542 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
|  543 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
|  544 | head.layers.7.attn.q_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.7033368 |    10.4631891 |        10.4631891 |      -10.4631891 |         0.0000000 |        9.0081711 |         0.0000000 |         0.0032096 |          0.0000000 |
|  545 | head.layers.7.attn.k_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.8622130 |     9.1868200 |         9.1868200 |       -8.4902906 |         0.0000000 |        9.1868200 |         0.0000000 |        -0.0419200 |          0.0000000 |
|  546 | head.layers.7.attn.v_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.0796017 |     2.2832065 |         2.2832065 |       -1.4546176 |         0.0000000 |        2.2832065 |         0.0000000 |        -0.0036280 |          0.0000000 |
|  547 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.7033368 |    10.4631891 |        10.4631891 |      -10.4631891 |         0.0000000 |        9.0081711 |         0.0000000 |         0.0032096 |          0.0000000 |
|  548 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.7033368 |    10.4631891 |        10.4631891 |      -10.4631891 |         0.0000000 |        9.0081711 |         0.0000000 |         0.0032096 |          0.0000000 |
|  549 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  550 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  551 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.0796017 |     2.2832065 |         2.2832065 |       -1.4546176 |         0.0000000 |        2.2832065 |         0.0000000 |        -0.0036280 |          0.0000000 |
|  552 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 256, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.0796017 |     2.2832065 |         2.2832065 |       -1.4546176 |         0.0000000 |        2.2832065 |         0.0000000 |        -0.0036280 |          0.0000000 |
|  553 | head.layers.7.attn                             | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.0000000 |    0.2129171 |     1.3078986 |        10.4631891 |       -1.3078986 |         0.0000000 |        1.1260214 |         0.0000000 |         0.0004012 |          0.0000000 |
|  554 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  555 | head.layers.7.attn.matmul                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |   15.9519968 |   107.3746719 |       107.3746719 |     -100.3855515 |         0.0000000 |      107.3746719 |         0.0000000 |        -4.5826025 |          0.0000000 |
|  556 | head.layers.7.attn.softmax                     | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999999 |         0.9999999 |        0.0000000 |         0.0000000 |        0.9999999 |         0.0000000 |         0.0039062 |          0.0000000 |
|  557 | head.layers.7.attn.attention_drop              | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999999 |         0.9999999 |        0.0000000 |         0.0000000 |        0.9999999 |         0.0000000 |         0.0039062 |          0.0000000 |
|  558 | head.layers.7.attn.attn_matmul                 | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.0635710 |     1.6071935 |         1.6071935 |       -1.0297741 |         0.0000000 |        1.6071935 |         0.0000000 |         0.0005415 |          0.0000000 |
|  559 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  560 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  561 | head.layers.7.attn.out_proj                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1164553 |     0.9301038 |         0.9301038 |       -0.9070358 |         0.0000000 |        0.9301038 |         0.0000000 |         0.0191230 |          0.0000000 |
|  562 | head.layers.7.attn                             | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999999 |         0.9999999 |        0.0000000 |         0.0000000 |        0.9999999 |         0.0000000 |         0.0039062 |          0.0000000 |
|  563 | head.layers.7.attn.attn_weights_mean           | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.7596642 |         0.7596642 |        0.0000000 |         0.0000000 |        0.7596642 |         0.0000000 |         0.0039062 |          0.0000000 |
|  564 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  565 | head.layers.7.dropout                          | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1164553 |     0.9301038 |         0.9301038 |       -0.9070358 |         0.0000000 |        0.9301038 |         0.0000000 |         0.0191230 |          0.0000000 |
|  566 | head.layers.7.add                              | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.0000000 |    0.6690827 |     7.3896914 |         7.3896914 |       -4.3068266 |         0.0000000 |        7.3896914 |         0.0000000 |         0.0364566 |          0.0000000 |
|  567 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.0000000 |    0.6674927 |     9.2265730 |      6046.6346107 |       -8.8109894 |         0.0000000 |        9.2265730 |         0.0000000 |        -0.0027992 |          0.0000000 |
|  568 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1814153 |     4.4241567 |     28993.7107161 |       -4.4241567 |         0.0000000 |        3.2139957 |         0.0000000 |         0.0006922 |          0.0000000 |
|  569 | head.layers.8.query_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6670030 |     9.2265730 |     20155.4487023 |       -8.8109894 |         0.0000000 |        9.2265730 |         0.0000000 |         0.0177108 |          0.0000000 |
|  570 | head.layers.8.key_cat                          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0002277 |  0.0000000 |    0.6670030 |     9.2265730 |     40517.5179723 |       -8.8109894 |         0.0000000 |        9.2265730 |         0.0000000 |         0.0177108 |          0.0000000 |
|  571 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6670030 |     9.2265730 |     20155.4487023 |       -8.8109894 |         0.0000000 |        9.2265730 |         0.0000000 |         0.0177108 |          0.0000000 |
|  572 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0002277 |  0.0000000 |    0.6670030 |     9.2265730 |     40517.5179723 |       -8.8109894 |         0.0000000 |        9.2265730 |         0.0000000 |         0.0177108 |          0.0000000 |
|  573 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1814153 |     4.4241567 |     28993.7107161 |       -4.4241567 |         0.0000000 |        3.2139957 |         0.0000000 |         0.0006922 |          0.0000000 |
|  574 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6670030 |     9.2265730 |     20155.4487023 |       -8.8109894 |         0.0000000 |        9.2265730 |         0.0000000 |         0.0177108 |          0.0000000 |
|  575 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0002277 |  0.0000000 |    0.6670030 |     9.2265730 |     40517.5179723 |       -8.8109894 |         0.0000000 |        9.2265730 |         0.0000000 |         0.0177108 |          0.0000000 |
|  576 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1814153 |     4.4241567 |     28993.7107161 |       -4.4241567 |         0.0000000 |        3.2139957 |         0.0000000 |         0.0006922 |          0.0000000 |
|  577 | head.layers.8.attn.q_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.0767403 |     7.9901357 |         7.9901357 |       -7.9901357 |         0.0000000 |        7.0327268 |         0.0000000 |        -0.0113954 |          0.0000000 |
|  578 | head.layers.8.attn.k_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.0781386 |     9.4287710 |         9.4287710 |       -9.4287710 |         0.0000000 |        6.8479962 |         0.0000000 |        -0.0343830 |          0.0000000 |
|  579 | head.layers.8.attn.v_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1793489 |     3.6170430 |         3.6170430 |       -2.6817710 |         0.0000000 |        3.6170430 |         0.0000000 |        -0.0060015 |          0.0000000 |
|  580 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.0767403 |     7.9901357 |         7.9901357 |       -7.9901357 |         0.0000000 |        7.0327268 |         0.0000000 |        -0.0113954 |          0.0000000 |
|  581 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.0767403 |     7.9901357 |         7.9901357 |       -7.9901357 |         0.0000000 |        7.0327268 |         0.0000000 |        -0.0113954 |          0.0000000 |
|  582 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.0781386 |     9.4287710 |         9.4287710 |       -9.4287710 |         0.0000000 |        6.8479962 |         0.0000000 |        -0.0343830 |          0.0000000 |
|  583 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.0781386 |     9.4287710 |         9.4287710 |       -9.4287710 |         0.0000000 |        6.8479962 |         0.0000000 |        -0.0343830 |          0.0000000 |
|  584 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1793489 |     3.6170430 |         3.6170430 |       -2.6817710 |         0.0000000 |        3.6170430 |         0.0000000 |        -0.0060015 |          0.0000000 |
|  585 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1793489 |     3.6170430 |         3.6170430 |       -2.6817710 |         0.0000000 |        3.6170430 |         0.0000000 |        -0.0060015 |          0.0000000 |
|  586 | head.layers.8.attn                             | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.0000000 |    0.1345925 |     0.9987670 |         7.9901357 |       -0.9987670 |         0.0000000 |        0.8790908 |         0.0000000 |        -0.0014244 |          0.0000000 |
|  587 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  588 | head.layers.8.attn.matmul                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    3.9475808 |    36.8060303 |        36.8060303 |      -36.7864990 |         0.0000000 |       36.8060303 |         0.0000000 |        -0.4264499 |          0.0000000 |
|  589 | head.layers.8.attn.softmax                     | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9602864 |         0.9602864 |        0.0000000 |         0.0000000 |        0.9602864 |         0.0000000 |         0.0019531 |          0.0000000 |
|  590 | head.layers.8.attn.attention_drop              | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9602864 |         0.9602864 |        0.0000000 |         0.0000000 |        0.9602864 |         0.0000000 |         0.0019531 |          0.0000000 |
|  591 | head.layers.8.attn.attn_matmul                 | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1302873 |     2.9772983 |         2.9772983 |       -2.3082454 |         0.0000000 |        2.9772983 |         0.0000000 |        -0.0026151 |          0.0000000 |
|  592 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  593 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  594 | head.layers.8.attn.out_proj                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2144276 |     1.5473903 |         1.5473903 |       -1.5473903 |         0.0000000 |        1.4779103 |         0.0000000 |         0.0052419 |          0.0000000 |
|  595 | head.layers.8.attn                             | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 512])     | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9602864 |         0.9602864 |        0.0000000 |         0.0000000 |        0.9602864 |         0.0000000 |         0.0019531 |          0.0000000 |
|  596 | head.layers.8.attn.attn_weights_mean           | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.1394550 |         0.1394550 |        0.0000004 |         0.0000000 |        0.1394550 |         0.0000000 |         0.0019531 |          0.0000000 |
|  597 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  598 | head.layers.8.dropout                          | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2144276 |     1.5473903 |         1.5473903 |       -1.5473903 |         0.0000000 |        1.4779103 |         0.0000000 |         0.0052419 |          0.0000000 |
|  599 | head.layers.8.add                              | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.0000000 |    0.7110018 |     9.4065533 |         9.4065533 |       -8.4865198 |         0.0000000 |        9.4065533 |         0.0000000 |         0.0229526 |          0.0000000 |
|  600 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.0000000 |    1.3774306 |    30.0017700 |     19661.6599869 |      -30.0017700 |         0.0000000 |       24.0435677 |         0.0000000 |        -0.0154488 |          0.0000000 |
|  601 | head.layers.9                                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4983643 |     6.9199944 |         6.9199944 |       -6.9199944 |         0.0000000 |        6.1704583 |         0.0000000 |         0.0012878 |          0.0000000 |
|  602 | head.layers.10.kps_generator.offset            | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 24])         | qint16        | 1.0000000 |  0.0000000 |    1.0919281 |     8.1859026 |         8.1859026 |       -8.1859026 |         0.0000000 |        3.6012852 |         0.0000000 |        -0.5216796 |          0.0000000 |
|  603 | head.layers.10.kps_generator                   | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.0000000 |    1.0919281 |     8.1859026 |         8.1859026 |       -8.1859026 |         0.0000000 |        3.6012852 |         0.0000000 |        -0.5216796 |          0.0000000 |
|  604 | head.layers.10.kps_generator                   | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 3])       | qint16        | 1.0000000 |  0.6404526 |    8.6006489 |   112.3172073 |       112.3172073 |      -59.8177528 |       -58.0000000 |       62.9555511 |        60.0000000 |         8.4762030 |          1.2368289 |
|  605 | head.layers.10.kps_generator.keypoints_add     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.6453708 |    9.0443525 |   113.4920959 |       113.4920959 |      -64.9907684 |       -58.0000000 |       65.1730957 |        60.0000000 |         7.9545217 |          1.2368289 |
|  606 | head.layers.10.weight_add                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.8830388 |     7.7612200 |         7.7612200 |       -7.3417168 |         0.0000000 |        7.7612200 |         0.0000000 |         0.0395085 |          0.0000000 |
|  607 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 3, 4])         | qint16        | 1.0000000 |  0.9977293 |    0.2101564 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3522723 |          0.3472222 |
|  608 | head.layers.10                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 6, 12])           | qint16        | 1.0000000 |  0.9977293 |    0.2101564 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3522723 |          0.3472222 |
|  609 | head.layers.10.camera_encoder.0                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.7379591 |     6.5923667 |         6.5923667 |       -6.5923667 |         0.0000000 |        5.3010874 |         0.0000000 |        -0.0614436 |          0.0000000 |
|  610 | head.layers.10.camera_encoder.1                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.3382578 |     5.3010874 |         5.3010874 |        0.0000000 |         0.0000000 |        5.3010874 |         0.0000000 |         0.3382578 |          0.0000000 |
|  611 | head.layers.10.camera_encoder.2                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.7935766 |     5.0326171 |         5.0326171 |       -0.8844092 |         0.0000000 |        5.0326171 |         0.0000000 |         0.0176572 |          0.0000000 |
|  612 | head.layers.10.camera_encoder.3                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    2.2369249 |    24.4463539 |        24.4463539 |      -13.8601160 |         0.0000000 |       24.4463539 |         0.0000000 |        -0.0114386 |          0.0000000 |
|  613 | head.layers.10.camera_encoder.4                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    1.1127431 |    24.4463539 |        24.4463539 |        0.0000000 |         0.0000000 |       24.4463539 |         0.0000000 |         1.1127431 |          0.0000000 |
|  614 | head.layers.10.camera_encoder.5                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.4867007 |     7.1053290 |         7.1053290 |       -1.0743425 |         0.0000000 |        7.1053290 |         0.0000000 |         0.0232620 |          0.0000000 |
|  615 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.8830388 |     7.7612200 |         7.7612200 |       -7.3417168 |         0.0000000 |        7.7612200 |         0.0000000 |         0.0395085 |          0.0000000 |
|  616 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 6, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.4867007 |     7.1053290 |         7.1053290 |       -1.0743425 |         0.0000000 |        7.1053290 |         0.0000000 |         0.0232620 |          0.0000000 |
|  617 | head.layers.10.cam_add                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 6, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.9927106 |    11.3114300 |        11.3114300 |       -7.2501507 |         0.0000000 |       11.3114300 |         0.0000000 |         0.0627705 |          0.0000000 |
|  618 | head.layers.10.weights_fc                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 6, 64])      | qint8         | 1.0000000 |  0.0000000 |    1.9632803 |    10.0503626 |        10.0503626 |      -10.0503626 |         0.0000000 |        7.8620095 |         0.0000000 |        -0.3185916 |          0.0000000 |
|  619 | head.layers.10                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.0000000 |    1.9632803 |    10.0503626 |        10.0503626 |      -10.0503626 |         0.0000000 |        7.8620095 |         0.0000000 |        -0.3185916 |          0.0000000 |
|  620 | head.layers.10.weight_softmax                  | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.0000000 |    0.0208333 |     0.8265363 |         0.8265363 |        0.0000002 |         0.0000000 |        0.8265363 |         0.0000000 |         0.0208333 |          0.0000000 |
|  621 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  0.8201931 |    9.3903160 |   113.4920959 |       113.4920959 |      -64.9907684 |       -58.0000000 |       65.1730957 |        60.0000000 |        10.4436817 |          5.5649042 |
|  622 | head.layers.10                                 | torch.ones_like                                                               | torch.ones_like                                                         | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
|  623 | head.layers.10.point_quant_stub                | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
|  624 | head.layers.10.point_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 8, 4])       | qint16        | 1.0000000 |  0.6455227 |    6.7832646 |   113.4920959 |       113.4920959 |      -64.9907684 |       -58.0000000 |       65.1730957 |        60.0000000 |         6.2158923 |          1.1776217 |
|  625 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 1, 1, 4, 4])   | qint16        | 1.0000000 |  0.9977411 |    0.1576173 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3267042 |          0.3229167 |
|  626 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 512, 8, 1, 4]) | qint16        | 1.0000000 |  0.6455227 |    6.7832646 |   113.4920959 |       113.4920959 |      -64.9907684 |       -58.0000000 |       65.1730957 |        60.0000000 |         6.2158923 |          1.1776217 |
|  627 | head.layers.10.point_matmul                    | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 4, 4]) | qint16        | 1.0000000 |  0.0575049 |    5.1351309 |   795.0856934 |       795.0856934 |     -365.0857239 |      -180.0000000 |      261.4109192 |       440.0000000 |        -0.2334914 |          1.5117548 |
|  628 | head.layers.10.point_sum                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 6, 512, 8, 4])    | qint16        | 0.0244144 |  0.0480059 |   15.9973516 |   828.4863281 |     33934.2822039 |     -320.4954834 |      -180.0076294 |      332.1846313 |       518.0010376 |        -0.9339653 |          6.0472698 |
|  629 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.7156898 |   11.3438387 |   125.4575958 |      5138.6647152 |      -74.2145386 |       -63.9902344 |       73.6929550 |        61.9882507 |        -0.9966986 |          1.5886035 |
|  630 | head.layers.10                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.8144572 |    6.0587049 |    57.6181145 |      2360.0019580 |        0.0100000 |         0.0000000 |       73.6929550 |        61.9882507 |        11.1826534 |         12.4062328 |
|  631 | head.layers.10.reciprocal_op                   | horizon_plugin_pytorch.nn.reciprocal.Reciprocal                               | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.7923284 |   46.6978798 |    99.9816895 |     32761.5000992 |        0.0135698 |         0.0152590 |      100.0000000 |         1.2787061 |        47.1339645 |          0.6503971 |
|  632 | head.layers.10                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.7923284 |   46.6978798 |    99.9816895 |     32761.5000992 |        0.0135698 |         0.0152590 |      100.0000000 |         1.2787061 |        47.1339645 |          0.6503971 |
|  633 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0244144 | -0.0645772 |   26.3222904 |   828.4863281 |     33934.2822039 |     -320.4954834 |      -180.0076294 |      332.1846313 |       518.0010376 |        -1.8695812 |         10.7997427 |
|  634 | head.layers.10.point_mul                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.4491310 | 1420.5218506 | 33208.4648438 | 108815837.2020935 |   -32049.5488281 |       -10.0001526 |    33218.4648438 |         9.9998474 |       606.5057983 |          2.1499586 |
|  635 | head.layers.10                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.4593751 |    0.6124087 |     2.1998703 |      7208.4251644 |       -1.1000000 |        -1.0998703 |        1.1000000 |         1.0998703 |         0.0981337 |          0.2038586 |
|  636 | head.layers.10                                 | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([156, 512, 8, 2])      | qint16        | 0.0003052 |  0.4593751 |    0.6124087 |     2.1998703 |      7208.4251644 |       -1.1000000 |        -1.0998703 |        1.1000000 |         1.0998703 |         0.0981337 |          0.2038586 |
|  637 | head.layers.10                                 | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer            | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer      | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.0000000 |    0.4866624 |    56.9694252 |        56.9694252 |      -56.9694252 |         0.0000000 |       53.4973030 |         0.0000000 |         0.0192764 |          0.0000000 |
|  638 | head.layers.10.feat_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.0000000 |    0.4866624 |    56.9694252 |        56.9694252 |      -56.9694252 |         0.0000000 |       53.4973030 |         0.0000000 |         0.0192764 |          0.0000000 |
|  639 | head.layers.10                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 256, 512, 8])  | qint8         | 1.0000000 |  0.0000000 |    0.4866624 |    56.9694252 |        56.9694252 |      -56.9694252 |         0.0000000 |       53.4973030 |         0.0000000 |         0.0192764 |          0.0000000 |
|  640 | head.layers.10                                 | torch.Tensor.permute                                                          | torch.Tensor.permute                                                    | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.0000000 |    0.4866624 |    56.9694252 |        56.9694252 |      -56.9694252 |         0.0000000 |       53.4973030 |         0.0000000 |         0.0192764 |          0.0000000 |
|  641 | head.layers.10                                 | torch.Tensor.contiguous                                                       | torch.Tensor.contiguous                                                 | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.0000000 |    0.4866624 |    56.9694252 |        56.9694252 |      -56.9694252 |         0.0000000 |       53.4973030 |         0.0000000 |         0.0192764 |          0.0000000 |
|  642 | head.layers.10                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.0000000 |    0.4866624 |    56.9694252 |        56.9694252 |      -56.9694252 |         0.0000000 |       53.4973030 |         0.0000000 |         0.0192764 |          0.0000000 |
|  643 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 48, 8, 1])   | qint8         | 1.0000000 |  0.0000000 |    0.0208333 |     0.8265363 |         0.8265363 |        0.0000002 |         0.0000000 |        0.8265363 |         0.0000000 |         0.0208333 |          0.0000000 |
|  644 | head.layers.10                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.0000000 |    0.4866624 |    56.9694252 |        56.9694252 |      -56.9694252 |         0.0000000 |       53.4973030 |         0.0000000 |         0.0192764 |          0.0000000 |
|  645 | head.layers.10.feat_mul                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.0000000 |    0.0073932 |     9.0969772 |         9.0969772 |       -8.3039818 |         0.0000000 |        9.0969772 |         0.0000000 |         0.0000708 |          0.0000000 |
|  646 | head.layers.10                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.0000000 |    0.0073932 |     9.0969772 |         9.0969772 |       -8.3039818 |         0.0000000 |        9.0969772 |         0.0000000 |         0.0000708 |          0.0000000 |
|  647 | head.layers.10.feat_sum                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.2889938 |     9.6043720 |         9.6043720 |       -9.4884453 |         0.0000000 |        9.6043720 |         0.0000000 |         0.0033989 |          0.0000000 |
|  648 | head.layers.10.output_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4406250 |    13.9364386 |        13.9364386 |      -10.0577822 |         0.0000000 |       13.9364386 |         0.0000000 |        -0.0126724 |          0.0000000 |
|  649 | head.layers.10.proj_drop                       | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4406250 |    13.9364386 |        13.9364386 |      -10.0577822 |         0.0000000 |       13.9364386 |         0.0000000 |        -0.0126724 |          0.0000000 |
|  650 | head.layers.10.residual_op                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.4694947 |    13.9364386 |        13.9364386 |      -10.0577822 |         0.0000000 |       13.9364386 |         0.0000000 |        -0.0056923 |          0.0000000 |
|  651 | head.layers.11.pre_norm                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.5380355 |     8.0663033 |         8.0663033 |       -8.0663033 |         0.0000000 |        7.8473573 |         0.0000000 |         0.0018022 |          0.0000000 |
|  652 | head.layers.11.layers.0.0                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0000000 |    2.3058262 |    15.8709230 |        15.8709230 |      -15.8709230 |         0.0000000 |       12.4762926 |         0.0000000 |        -1.5337054 |          0.0000000 |
|  653 | head.layers.11.layers.0.2                      | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0000000 |    0.3860604 |    12.4762926 |        12.4762926 |        0.0000000 |         0.0000000 |       12.4762926 |         0.0000000 |         0.3860604 |          0.0000000 |
|  654 | head.layers.11.layers.1                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    3.1546953 |    45.3102455 |        45.3102455 |      -38.4103546 |         0.0000000 |       45.3102455 |         0.0000000 |         0.0226187 |          0.0000000 |
|  655 | head.layers.11.layers.2                        | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    3.1546953 |    45.3102455 |        45.3102455 |      -38.4103546 |         0.0000000 |       45.3102455 |         0.0000000 |         0.0226187 |          0.0000000 |
|  656 | head.layers.11.identity_fc                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  657 | head.layers.11.short_add                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.linear.LinearAdd                          | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    4.4289474 |    44.2022667 |        44.2022667 |      -41.0777969 |         0.0000000 |       44.2022667 |         0.0000000 |         0.0341342 |          0.0000000 |
|  658 | head.layers.12                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.7325945 |     4.4020753 |         4.4020753 |       -4.4020753 |         0.0000000 |        3.8545034 |         0.0000000 |        -0.0018893 |          0.0000000 |
|  659 | head.layers.13.add1                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.9435897 |     7.9268794 |         7.9268794 |       -4.2850084 |         0.0000000 |        7.9268794 |         0.0000000 |         0.0363314 |          0.0000000 |
|  660 | head.layers.13.layers.0                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.5512187 |     9.5861740 |         9.5861740 |       -9.5311737 |         0.0000000 |        9.5861740 |         0.0000000 |        -0.5773760 |          0.0000000 |
|  661 | head.layers.13.layers.1                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4869213 |     9.5861740 |         9.5861740 |        0.0000000 |         0.0000000 |        9.5861740 |         0.0000000 |         0.4869213 |          0.0000000 |
|  662 | head.layers.13.layers.2                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.2943250 |    10.9225693 |        10.9225693 |      -10.9225693 |         0.0000000 |       10.1361361 |         0.0000000 |        -0.2881917 |          0.0000000 |
|  663 | head.layers.13.layers.3                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.5030667 |    10.1361361 |        10.1361361 |        0.0000000 |         0.0000000 |       10.1361361 |         0.0000000 |         0.5030667 |          0.0000000 |
|  664 | head.layers.13.layers.4                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.7408025 |     6.9642663 |         6.9642663 |       -0.8287929 |         0.0000000 |        6.9642663 |         0.0000000 |         0.0340462 |          0.0000000 |
|  665 | head.layers.13.layers.5                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.3249056 |     8.9257898 |         8.9257898 |       -7.3633561 |         0.0000000 |        8.9257898 |         0.0000000 |        -0.5553964 |          0.0000000 |
|  666 | head.layers.13.layers.6                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3847547 |     8.9257898 |         8.9257898 |        0.0000000 |         0.0000000 |        8.9257898 |         0.0000000 |         0.3847547 |          0.0000000 |
|  667 | head.layers.13.layers.7                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.2225150 |    16.3542671 |        16.3542671 |       -7.1829901 |         0.0000000 |       16.3542671 |         0.0000000 |        -0.4406968 |          0.0000000 |
|  668 | head.layers.13.layers.8                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3909092 |    16.3542671 |        16.3542671 |        0.0000000 |         0.0000000 |       16.3542671 |         0.0000000 |         0.3909092 |          0.0000000 |
|  669 | head.layers.13.layers.9                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6003203 |    10.5594673 |        10.5594673 |       -0.9034730 |         0.0000000 |       10.5594673 |         0.0000000 |         0.0280741 |          0.0000000 |
|  670 | head.layers.13.layers.10                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.0000000 |    1.0263001 |    11.6095848 |        11.6095848 |       -5.8411531 |         0.0000000 |       11.6095848 |         0.0000000 |        -0.1193234 |          0.0000000 |
|  671 | head.layers.13.layers.11.scale_quant_stub      | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([11])                  | qint16        | 1.0000000 |  0.9069562 |    0.1719654 |     0.4907171 |         0.4907171 |        0.0353014 |         0.0000000 |        0.9935754 |         1.0000000 |         0.2909352 |          0.2727273 |
|  672 | head.layers.13.layers.11.mul                   | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.0000000 |    0.5070809 |     7.5633597 |         7.5633597 |       -5.8036261 |         0.0000000 |        7.5633597 |         0.0000000 |        -0.2109119 |          0.0000000 |
|  673 | head.layers.13.add2                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.5495523 |    5.1210518 |   112.5461731 |       112.5461731 |      -60.9660416 |       -58.0000000 |       64.7206650 |        60.0000000 |         0.7443742 |          0.6780485 |
|  674 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.5495523 |    5.1210518 |   112.5461731 |                   |      -60.9660416 |       -58.0000000 |       64.7206650 |        60.0000000 |         0.7443742 |          0.6780485 |
|  675 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.6422864 |    8.7534990 |   112.5461731 |       112.5461731 |      -60.9660416 |       -58.0000000 |       64.7206650 |        60.0000000 |         8.3521595 |          1.2368289 |
|  676 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 | -0.0984443 |   12.7968445 |   160.4830627 |       160.4830627 |      -74.8401108 |         0.0000000 |       59.5616608 |        95.0000000 |        -0.7025925 |          3.4413030 |
|  677 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.2299563 |    6.9743271 |    95.0000000 |        95.0000000 |        0.0000000 |         0.0000000 |       59.5616608 |        95.0000000 |         5.1199250 |          3.4413030 |
|  678 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.1119867 |    2.3980627 |   127.7031631 |       127.7031631 |       -0.9089724 |        -1.0000000 |        3.9988158 |       127.0000000 |         0.0114727 |          1.5994991 |
|  679 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.8794156 |     6.3184443 |         6.3184443 |       -5.5470533 |         0.0000000 |        6.3184443 |         0.0000000 |        -0.2279205 |          0.0000000 |
|  680 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.3257475 |     6.3184443 |         6.3184443 |        0.0000000 |         0.0000000 |        6.3184443 |         0.0000000 |         0.3257475 |          0.0000000 |
|  681 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.6464646 |     6.3515434 |         6.3515434 |       -0.9792548 |         0.0000000 |        6.3515434 |         0.0000000 |         0.0791825 |          0.0000000 |
|  682 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.1395612 |     6.1850944 |         6.1850944 |       -6.1850944 |         0.0000000 |        5.7178812 |         0.0000000 |        -0.0525922 |          0.0000000 |
|  683 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.5434845 |     5.7178812 |         5.7178812 |        0.0000000 |         0.0000000 |        5.7178812 |         0.0000000 |         0.5434845 |          0.0000000 |
|  684 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.7979251 |     5.7150550 |         5.7150550 |       -0.8686314 |         0.0000000 |        5.7150550 |         0.0000000 |         0.0264403 |          0.0000000 |
|  685 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.2957784 |     8.4020557 |         8.4020557 |       -5.7060418 |         0.0000000 |        8.4020557 |         0.0000000 |        -0.2763848 |          0.0000000 |
|  686 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.5096968 |     8.4020557 |         8.4020557 |        0.0000000 |         0.0000000 |        8.4020557 |         0.0000000 |         0.5096969 |          0.0000000 |
|  687 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.6692601 |     7.4429398 |         7.4429398 |       -0.8781066 |         0.0000000 |        7.4429398 |         0.0000000 |         0.0311306 |          0.0000000 |
|  688 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.8643918 |    0.4955470 |     2.1547022 |         2.1547022 |       -0.5520454 |         0.0000000 |        2.4753942 |         2.0000000 |         0.9228369 |          0.9993490 |
|  689 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0901138 |    0.6547547 |     2.6925750 |         2.6925750 |       -2.6925750 |         0.0000000 |        1.4267265 |         1.0000000 |        -0.3541585 |          0.0858765 |
|  690 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.2765695 |    0.1710102 |     1.4267265 |         1.4267265 |        0.0000000 |         0.0000000 |        1.4267265 |         1.0000000 |         0.1295860 |          0.0858765 |
|  691 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.1654903 |   11.4971724 |   127.6018906 |       127.6018906 |       -0.7146325 |         0.0000000 |        3.9782619 |       127.0000000 |         0.0223180 |         10.9063110 |
|  692 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4256846 |     1.7909709 |         1.7909709 |       -1.7909709 |         0.0000000 |        1.6439540 |         0.0000000 |        -0.0243988 |          0.0000000 |
|  693 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2006429 |     1.6439540 |         1.6439540 |        0.0000000 |         0.0000000 |        1.6439540 |         0.0000000 |         0.2006429 |          0.0000000 |
|  694 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7856224 |     3.4960148 |         3.4960148 |       -0.9158402 |         0.0000000 |        3.4960148 |         0.0000000 |         0.0095773 |          0.0000000 |
|  695 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7010252 |     2.2884371 |         2.2884371 |       -2.0759532 |         0.0000000 |        2.2884371 |         0.0000000 |        -0.0730491 |          0.0000000 |
|  696 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.3139881 |     2.2884371 |         2.2884371 |        0.0000000 |         0.0000000 |        2.2884371 |         0.0000000 |         0.3139881 |          0.0000000 |
|  697 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7667339 |     3.5809600 |         3.5809600 |       -0.8443317 |         0.0000000 |        3.5809600 |         0.0000000 |         0.0118357 |          0.0000000 |
|  698 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7379431 |     3.0090775 |         3.0090775 |       -2.5833285 |         0.0000000 |        3.0090775 |         0.0000000 |         0.1300436 |          0.0000000 |
|  699 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4339933 |     3.0090775 |         3.0090775 |        0.0000000 |         0.0000000 |        3.0090775 |         0.0000000 |         0.4339934 |          0.0000000 |
|  700 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7155823 |     3.9318666 |         3.9318666 |       -1.2742829 |         0.0000000 |        3.9318666 |         0.0000000 |         0.0293063 |          0.0000000 |
|  701 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 2])          | qint16        | 1.0000000 | -0.7583329 |    0.9045631 |     2.3206410 |         2.3206410 |       -1.7166638 |         0.0000000 |        0.1689172 |         1.0000000 |        -0.5161880 |          0.3750000 |
|  702 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0203683 |    0.4766243 |     2.1341944 |         2.1341944 |       -1.4594258 |         0.0000000 |        1.5890337 |         2.0000000 |         0.1341516 |          0.0859375 |
|  703 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0433282 |    0.3423807 |     2.0000000 |         2.0000000 |        0.0000000 |         0.0000000 |        1.5890337 |         2.0000000 |         0.2683952 |          0.0859375 |
|  704 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 | -0.1338718 |    8.7006397 |   127.9649200 |       127.9649200 |       -1.2669742 |         0.0000000 |        3.0532725 |       127.0000000 |        -0.0006082 |          7.9375005 |
|  705 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6263215 |     3.2999501 |         3.2999501 |       -3.2999501 |         0.0000000 |        1.9538969 |         0.0000000 |        -0.1712025 |          0.0000000 |
|  706 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2275595 |     1.9538969 |         1.9538969 |        0.0000000 |         0.0000000 |        1.9538969 |         0.0000000 |         0.2275596 |          0.0000000 |
|  707 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7659550 |     4.0617452 |         4.0617452 |       -0.9146835 |         0.0000000 |        4.0617452 |         0.0000000 |         0.0091459 |          0.0000000 |
|  708 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.8456649 |     3.7701013 |         3.7701013 |       -3.7701013 |         0.0000000 |        2.5342805 |         0.0000000 |        -0.2433762 |          0.0000000 |
|  709 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.3011443 |     2.5342805 |         2.5342805 |        0.0000000 |         0.0000000 |        2.5342805 |         0.0000000 |         0.3011443 |          0.0000000 |
|  710 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.8173996 |     3.8247609 |         3.8247609 |       -0.9109468 |         0.0000000 |        3.8247609 |         0.0000000 |         0.0131810 |          0.0000000 |
|  711 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    1.0050889 |     5.1867137 |         5.1867137 |       -5.1867137 |         0.0000000 |        2.7216392 |         0.0000000 |        -0.4714604 |          0.0000000 |
|  712 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2668143 |     2.7216392 |         2.7216392 |        0.0000000 |         0.0000000 |        2.7216392 |         0.0000000 |         0.2668143 |          0.0000000 |
|  713 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6693417 |     4.9649129 |         4.9649129 |       -0.9946409 |         0.0000000 |        4.9649129 |         0.0000000 |         0.0874524 |          0.0000000 |
|  714 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.0000000 |    8.9251032 |    46.1030350 |        46.1030350 |      -46.1030350 |         0.0000000 |       15.7246752 |         0.0000000 |        -6.2014985 |          0.0000000 |
|  715 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0633153 |    5.9543920 |    28.7362156 |        28.7362156 |      -28.7362156 |         0.0000000 |       26.0704479 |         1.0000000 |        -0.2796780 |          0.2031250 |
|  716 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.2832761 |    2.8150101 |    25.0704479 |        25.0704479 |        0.0000000 |         0.0000000 |       26.0704479 |         1.0000000 |         2.8597040 |          0.2031250 |
|  717 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.1397241 |   26.3531113 |   127.8325424 |       127.8325424 |       -0.8906125 |         0.0000000 |        3.5586653 |       127.0000000 |         0.0181459 |         25.7968769 |
|  718 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.5578126 |     3.0734835 |         3.0734835 |       -2.9966898 |         0.0000000 |        3.0734835 |         0.0000000 |        -0.1597551 |          0.0000000 |
|  719 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.1990288 |     3.0734835 |         3.0734835 |        0.0000000 |         0.0000000 |        3.0734835 |         0.0000000 |         0.1990288 |          0.0000000 |
|  720 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7665138 |     4.2795768 |         4.2795768 |       -0.9166543 |         0.0000000 |        4.2795768 |         0.0000000 |         0.0362004 |          0.0000000 |
|  721 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.9086897 |     4.8020887 |         4.8020887 |       -4.8020887 |         0.0000000 |        4.1310468 |         0.0000000 |        -0.1553238 |          0.0000000 |
|  722 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3766830 |     4.1310468 |         4.1310468 |        0.0000000 |         0.0000000 |        4.1310468 |         0.0000000 |         0.3766830 |          0.0000000 |
|  723 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7597720 |     5.7302361 |         5.7302361 |       -0.8924642 |         0.0000000 |        5.7302361 |         0.0000000 |         0.0201877 |          0.0000000 |
|  724 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.8929014 |     4.8599339 |         4.8599339 |       -4.8599339 |         0.0000000 |        4.4784646 |         0.0000000 |        -0.1913516 |          0.0000000 |
|  725 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3507749 |     4.4784646 |         4.4784646 |        0.0000000 |         0.0000000 |        4.4784646 |         0.0000000 |         0.3507749 |          0.0000000 |
|  726 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.6245633 |     5.3909011 |         5.3909011 |       -0.7765247 |         0.0000000 |        5.3909011 |         0.0000000 |         0.0276593 |          0.0000000 |
|  727 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6638864 |     7.4429398 |         7.4429398 |       -1.2742829 |         0.0000000 |        7.4429398 |         0.0000000 |         0.0370749 |          0.0000000 |
|  728 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 256, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
|  729 | head.layers.14.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6982404 |     7.4429398 |     16259.1019058 |       -4.4020753 |         0.0000000 |        7.4429398 |         0.0000000 |         0.0175928 |          0.0000000 |
|  730 | head.layers.14.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
|  731 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6982404 |     7.4429398 |     16259.1019058 |       -4.4020753 |         0.0000000 |        7.4429398 |         0.0000000 |         0.0175928 |          0.0000000 |
|  732 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
|  733 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
|  734 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6982404 |     7.4429398 |     16259.1019058 |       -4.4020753 |         0.0000000 |        7.4429398 |         0.0000000 |         0.0175928 |          0.0000000 |
|  735 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
|  736 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
|  737 | head.layers.14.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.7754998 |     9.8951101 |         9.8951101 |       -9.7514448 |         0.0000000 |        9.8951101 |         0.0000000 |        -0.0385062 |          0.0000000 |
|  738 | head.layers.14.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.8109860 |     9.3473539 |         9.3473539 |       -8.8085613 |         0.0000000 |        9.3473539 |         0.0000000 |        -0.0452838 |          0.0000000 |
|  739 | head.layers.14.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.0859880 |     1.4748851 |         1.4748851 |       -1.4748851 |         0.0000000 |        1.3610518 |         0.0000000 |        -0.0058053 |          0.0000000 |
|  740 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.7754998 |     9.8951101 |         9.8951101 |       -9.7514448 |         0.0000000 |        9.8951101 |         0.0000000 |        -0.0385062 |          0.0000000 |
|  741 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.7754998 |     9.8951101 |         9.8951101 |       -9.7514448 |         0.0000000 |        9.8951101 |         0.0000000 |        -0.0385062 |          0.0000000 |
|  742 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  743 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  744 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.0859880 |     1.4748851 |         1.4748851 |       -1.4748851 |         0.0000000 |        1.3610518 |         0.0000000 |        -0.0058053 |          0.0000000 |
|  745 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 256, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.0859880 |     1.4748851 |         1.4748851 |       -1.4748851 |         0.0000000 |        1.3610518 |         0.0000000 |        -0.0058053 |          0.0000000 |
|  746 | head.layers.14.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.0000000 |    0.2219375 |     1.2368888 |         9.8951101 |       -1.2189306 |         0.0000000 |        1.2368888 |         0.0000000 |        -0.0048133 |          0.0000000 |
|  747 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  748 | head.layers.14.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |   14.6562710 |   115.8679810 |       115.8679810 |     -115.8679810 |         0.0000000 |       74.6919708 |         0.0000000 |        -3.2179203 |          0.0000000 |
|  749 | head.layers.14.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999850 |         0.9999850 |        0.0000000 |         0.0000000 |        0.9999850 |         0.0000000 |         0.0039062 |          0.0000000 |
|  750 | head.layers.14.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999850 |         0.9999850 |        0.0000000 |         0.0000000 |        0.9999850 |         0.0000000 |         0.0039062 |          0.0000000 |
|  751 | head.layers.14.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.0747903 |     1.1403947 |         1.1403947 |       -1.1403947 |         0.0000000 |        1.1030540 |         0.0000000 |        -0.0044009 |          0.0000000 |
|  752 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  753 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  754 | head.layers.14.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1312663 |     1.2122025 |         1.2122025 |       -1.0209106 |         0.0000000 |        1.2122025 |         0.0000000 |         0.0117613 |          0.0000000 |
|  755 | head.layers.14.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999850 |         0.9999850 |        0.0000000 |         0.0000000 |        0.9999850 |         0.0000000 |         0.0039062 |          0.0000000 |
|  756 | head.layers.14.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.7199981 |         0.7199981 |        0.0000000 |         0.0000000 |        0.7199981 |         0.0000000 |         0.0039062 |          0.0000000 |
|  757 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  758 | head.layers.14.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1312663 |     1.2122025 |         1.2122025 |       -1.0209106 |         0.0000000 |        1.2122025 |         0.0000000 |         0.0117613 |          0.0000000 |
|  759 | head.layers.14.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.0000000 |    0.6871529 |     7.5661221 |         7.5661221 |       -4.2436876 |         0.0000000 |        7.5661221 |         0.0000000 |         0.0293541 |          0.0000000 |
|  760 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.0000000 |    0.6886823 |    11.2961035 |      7402.9014157 |      -11.2961035 |         0.0000000 |       11.2789898 |         0.0000000 |         0.0222141 |          0.0000000 |
|  761 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1812066 |     4.5623002 |     29899.0344019 |       -4.5623002 |         0.0000000 |        3.7034891 |         0.0000000 |        -0.0018177 |          0.0000000 |
|  762 | head.layers.15.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6762844 |    11.2961035 |     24676.3380523 |      -11.2961035 |         0.0000000 |       11.2789898 |         0.0000000 |         0.0296445 |          0.0000000 |
|  763 | head.layers.15.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0002464 |  0.0000000 |    0.6762844 |    11.2961035 |     45842.8676202 |      -11.2961035 |         0.0000000 |       11.2789898 |         0.0000000 |         0.0296445 |          0.0000000 |
|  764 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6762844 |    11.2961035 |     24676.3380523 |      -11.2961035 |         0.0000000 |       11.2789898 |         0.0000000 |         0.0296445 |          0.0000000 |
|  765 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0002464 |  0.0000000 |    0.6762844 |    11.2961035 |     45842.8676202 |      -11.2961035 |         0.0000000 |       11.2789898 |         0.0000000 |         0.0296445 |          0.0000000 |
|  766 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1812066 |     4.5623002 |     29899.0344019 |       -4.5623002 |         0.0000000 |        3.7034891 |         0.0000000 |        -0.0018177 |          0.0000000 |
|  767 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6762844 |    11.2961035 |     24676.3380523 |      -11.2961035 |         0.0000000 |       11.2789898 |         0.0000000 |         0.0296445 |          0.0000000 |
|  768 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0002464 |  0.0000000 |    0.6762844 |    11.2961035 |     45842.8676202 |      -11.2961035 |         0.0000000 |       11.2789898 |         0.0000000 |         0.0296445 |          0.0000000 |
|  769 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1812066 |     4.5623002 |     29899.0344019 |       -4.5623002 |         0.0000000 |        3.7034891 |         0.0000000 |        -0.0018177 |          0.0000000 |
|  770 | head.layers.15.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.2237558 |     9.7100601 |         9.7100601 |       -9.7100601 |         0.0000000 |        7.8958721 |         0.0000000 |        -0.0252149 |          0.0000000 |
|  771 | head.layers.15.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.3881698 |    11.5058899 |        11.5058899 |      -11.5058899 |         0.0000000 |       10.1481504 |         0.0000000 |         0.0246373 |          0.0000000 |
|  772 | head.layers.15.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2137608 |     3.0474529 |         3.0474529 |       -3.0474529 |         0.0000000 |        2.8030882 |         0.0000000 |        -0.0037322 |          0.0000000 |
|  773 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.2237558 |     9.7100601 |         9.7100601 |       -9.7100601 |         0.0000000 |        7.8958721 |         0.0000000 |        -0.0252149 |          0.0000000 |
|  774 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.2237558 |     9.7100601 |         9.7100601 |       -9.7100601 |         0.0000000 |        7.8958721 |         0.0000000 |        -0.0252149 |          0.0000000 |
|  775 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.3881698 |    11.5058899 |        11.5058899 |      -11.5058899 |         0.0000000 |       10.1481504 |         0.0000000 |         0.0246373 |          0.0000000 |
|  776 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.3881698 |    11.5058899 |        11.5058899 |      -11.5058899 |         0.0000000 |       10.1481504 |         0.0000000 |         0.0246373 |          0.0000000 |
|  777 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.2137608 |     3.0474529 |         3.0474529 |       -3.0474529 |         0.0000000 |        2.8030882 |         0.0000000 |        -0.0037322 |          0.0000000 |
|  778 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.2137608 |     3.0474529 |         3.0474529 |       -3.0474529 |         0.0000000 |        2.8030882 |         0.0000000 |        -0.0037322 |          0.0000000 |
|  779 | head.layers.15.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.0000000 |    0.1529695 |     1.2137575 |         9.7100601 |       -1.2137575 |         0.0000000 |        0.9869840 |         0.0000000 |        -0.0031519 |          0.0000000 |
|  780 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  781 | head.layers.15.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    5.6853352 |    54.8108253 |        54.8108253 |      -49.9130287 |         0.0000000 |       54.8108253 |         0.0000000 |        -0.4814257 |          0.0000000 |
|  782 | head.layers.15.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9979329 |         0.9979329 |        0.0000000 |         0.0000000 |        0.9979329 |         0.0000000 |         0.0019531 |          0.0000000 |
|  783 | head.layers.15.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9979329 |         0.9979329 |        0.0000000 |         0.0000000 |        0.9979329 |         0.0000000 |         0.0019531 |          0.0000000 |
|  784 | head.layers.15.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1435333 |     2.6797822 |         2.6797822 |       -2.6797822 |         0.0000000 |        2.4541111 |         0.0000000 |        -0.0052626 |          0.0000000 |
|  785 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  786 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  787 | head.layers.15.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2402058 |     1.6519465 |         1.6519465 |       -1.6519465 |         0.0000000 |        1.5394824 |         0.0000000 |         0.0115001 |          0.0000000 |
|  788 | head.layers.15.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 512])     | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9979329 |         0.9979329 |        0.0000000 |         0.0000000 |        0.9979329 |         0.0000000 |         0.0019531 |          0.0000000 |
|  789 | head.layers.15.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.4773574 |         0.4773574 |        0.0000005 |         0.0000000 |        0.4773574 |         0.0000000 |         0.0019531 |          0.0000000 |
|  790 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  791 | head.layers.15.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2402058 |     1.6519465 |         1.6519465 |       -1.6519465 |         0.0000000 |        1.5394824 |         0.0000000 |         0.0115001 |          0.0000000 |
|  792 | head.layers.15.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.0000000 |    0.6997384 |    11.7201481 |        11.7201481 |      -10.2311363 |         0.0000000 |       11.7201481 |         0.0000000 |         0.0411446 |          0.0000000 |
|  793 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.0000000 |    1.3109220 |    28.9194050 |     18952.3320604 |      -28.9194050 |         0.0000000 |       25.1059170 |         0.0000000 |        -0.0140256 |          0.0000000 |
|  794 | head.layers.16                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.5307721 |     7.0202994 |         7.0202994 |       -7.0202994 |         0.0000000 |        6.3727536 |         0.0000000 |         0.0017113 |          0.0000000 |
|  795 | head.layers.17.kps_generator.offset            | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 24])         | qint16        | 1.0000000 |  0.0000000 |    1.1751673 |     5.8241587 |         5.8241587 |       -5.8241587 |         0.0000000 |        5.5672221 |         0.0000000 |        -0.1842943 |          0.0000000 |
|  796 | head.layers.17.kps_generator                   | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.0000000 |    1.1751673 |     5.8241587 |         5.8241587 |       -5.8241587 |         0.0000000 |        5.5672221 |         0.0000000 |        -0.1842943 |          0.0000000 |
|  797 | head.layers.17.kps_generator                   | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 3])       | qint16        | 1.0000000 |  0.6422864 |    8.7534990 |   112.5461731 |       112.5461731 |      -60.9660416 |       -58.0000000 |       64.7206650 |        60.0000000 |         8.3521595 |          1.2368289 |
|  798 | head.layers.17.kps_generator.keypoints_add     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.6483262 |    9.0636492 |   113.5463562 |       113.5463562 |      -64.6912460 |       -58.0000000 |       68.5240326 |        60.0000000 |         8.1678658 |          1.2368289 |
|  799 | head.layers.17.weight_add                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.8934198 |     7.8027935 |         7.8027935 |       -7.5129437 |         0.0000000 |        7.8027935 |         0.0000000 |         0.0387862 |          0.0000000 |
|  800 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 3, 4])         | qint16        | 1.0000000 |  0.9977293 |    0.2101564 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3522723 |          0.3472222 |
|  801 | head.layers.17                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 6, 12])           | qint16        | 1.0000000 |  0.9977293 |    0.2101564 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3522723 |          0.3472222 |
|  802 | head.layers.17.camera_encoder.0                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.7482497 |     6.1943283 |         6.1943283 |       -6.1943283 |         0.0000000 |        5.5422120 |         0.0000000 |        -0.1244024 |          0.0000000 |
|  803 | head.layers.17.camera_encoder.1                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.3119237 |     5.5422120 |         5.5422120 |        0.0000000 |         0.0000000 |        5.5422120 |         0.0000000 |         0.3119237 |          0.0000000 |
|  804 | head.layers.17.camera_encoder.2                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.7815985 |     4.6364288 |         4.6364288 |       -0.8844011 |         0.0000000 |        4.6364288 |         0.0000000 |         0.0187403 |          0.0000000 |
|  805 | head.layers.17.camera_encoder.3                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    2.1257794 |    33.1644135 |        33.1644135 |      -15.0362740 |         0.0000000 |       33.1644135 |         0.0000000 |        -0.2752628 |          0.0000000 |
|  806 | head.layers.17.camera_encoder.4                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.9252583 |    33.1644135 |        33.1644135 |        0.0000000 |         0.0000000 |       33.1644135 |         0.0000000 |         0.9252583 |          0.0000000 |
|  807 | head.layers.17.camera_encoder.5                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.4274279 |     8.0350962 |         8.0350962 |       -0.8831643 |         0.0000000 |        8.0350962 |         0.0000000 |         0.0272648 |          0.0000000 |
|  808 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.8934198 |     7.8027935 |         7.8027935 |       -7.5129437 |         0.0000000 |        7.8027935 |         0.0000000 |         0.0387862 |          0.0000000 |
|  809 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 6, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.4274279 |     8.0350962 |         8.0350962 |       -0.8831643 |         0.0000000 |        8.0350962 |         0.0000000 |         0.0272648 |          0.0000000 |
|  810 | head.layers.17.cam_add                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 6, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.9935988 |    14.2484722 |        14.2484722 |       -6.4161563 |         0.0000000 |       14.2484722 |         0.0000000 |         0.0660510 |          0.0000000 |
|  811 | head.layers.17.weights_fc                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 6, 64])      | qint8         | 1.0000000 |  0.0000000 |    2.0543892 |    11.0555649 |        11.0555649 |      -11.0555649 |         0.0000000 |        9.2100954 |         0.0000000 |        -0.6440306 |          0.0000000 |
|  812 | head.layers.17                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.0000000 |    2.0543892 |    11.0555649 |        11.0555649 |      -11.0555649 |         0.0000000 |        9.2100954 |         0.0000000 |        -0.6440306 |          0.0000000 |
|  813 | head.layers.17.weight_softmax                  | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.0000000 |    0.0208333 |     0.8307908 |         0.8307908 |        0.0000002 |         0.0000000 |        0.8307908 |         0.0000000 |         0.0208333 |          0.0000000 |
|  814 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  0.8176657 |    9.5423479 |   113.5463562 |       113.5463562 |      -64.6912460 |       -58.0000000 |       68.5240326 |        60.0000000 |        11.0575018 |          5.5649042 |
|  815 | head.layers.17                                 | torch.ones_like                                                               | torch.ones_like                                                         | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
|  816 | head.layers.17.point_quant_stub                | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
|  817 | head.layers.17.point_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 8, 4])       | qint16        | 1.0000000 |  0.6484762 |    6.7977362 |   113.5463562 |       113.5463562 |      -64.6912460 |       -58.0000000 |       68.5240326 |        60.0000000 |         6.3758993 |          1.1776217 |
|  818 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 1, 1, 4, 4])   | qint16        | 1.0000000 |  0.9977411 |    0.1576173 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3267042 |          0.3229167 |
|  819 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 512, 8, 1, 4]) | qint16        | 1.0000000 |  0.6484762 |    6.7977362 |   113.5463562 |       113.5463562 |      -64.6912460 |       -58.0000000 |       68.5240326 |        60.0000000 |         6.3758993 |          1.1776217 |
|  820 | head.layers.17.point_matmul                    | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 4, 4]) | qint16        | 1.0000000 |  0.0740440 |    5.1293850 |   780.2541504 |       780.2541504 |     -350.2541504 |      -180.0000000 |      258.5354309 |       440.0000000 |        -0.2877243 |          1.5117548 |
|  821 | head.layers.17.point_sum                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 6, 512, 8, 4])    | qint16        | 0.0244144 |  0.0629968 |   15.8591995 |   811.4815063 |     33237.7753318 |     -306.5794373 |      -180.0076294 |      329.2130737 |       518.0010376 |        -1.1508971 |          6.0472698 |
|  822 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.7160528 |   11.3906546 |   125.8335190 |      5154.0622928 |      -74.6946030 |       -63.9902344 |       73.6387939 |        61.9882507 |        -1.1004387 |          1.5886035 |
|  823 | head.layers.17                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.8130744 |    6.0869594 |    57.4215813 |      2351.9520808 |        0.0100000 |         0.0000000 |       73.6387939 |        61.9882507 |        11.1084843 |         12.4062328 |
|  824 | head.layers.17.reciprocal_op                   | horizon_plugin_pytorch.nn.reciprocal.Reciprocal                               | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.7990369 |   47.8769836 |    99.9816895 |     32761.5000992 |        0.0135798 |         0.0152590 |      100.0000000 |         1.2787061 |        48.3292084 |          0.6503971 |
|  825 | head.layers.17                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.7990369 |   47.8769836 |    99.9816895 |     32761.5000992 |        0.0135798 |         0.0152590 |      100.0000000 |         1.2787061 |        48.3292084 |          0.6503971 |
|  826 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0244144 | -0.0498990 |   26.0225735 |   811.4815063 |     33237.7753318 |     -306.5794373 |      -180.0076294 |      329.2130737 |       518.0010376 |        -2.2515748 |         10.7997427 |
|  827 | head.layers.17.point_mul                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.4459211 | 1382.7216797 | 32911.3085938 | 107842130.4596793 |   -30657.9433594 |       -10.0001526 |    32921.3085938 |         9.9998474 |       591.4005737 |          2.1499586 |
|  828 | head.layers.17                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.4633754 |    0.6116686 |     2.1998703 |      7208.4251644 |       -1.1000000 |        -1.0998703 |        1.1000000 |         1.0998703 |         0.0885745 |          0.2038586 |
|  829 | head.layers.17                                 | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([156, 512, 8, 2])      | qint16        | 0.0003052 |  0.4633754 |    0.6116686 |     2.1998703 |      7208.4251644 |       -1.1000000 |        -1.0998703 |        1.1000000 |         1.0998703 |         0.0885745 |          0.2038586 |
|  830 | head.layers.17                                 | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer            | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer      | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.0000000 |    0.4369018 |    51.8514404 |        51.8514404 |      -51.8514404 |         0.0000000 |       45.6271400 |         0.0000000 |         0.0098722 |          0.0000000 |
|  831 | head.layers.17.feat_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.0000000 |    0.4369018 |    51.8514404 |        51.8514404 |      -51.8514404 |         0.0000000 |       45.6271400 |         0.0000000 |         0.0098722 |          0.0000000 |
|  832 | head.layers.17                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 256, 512, 8])  | qint8         | 1.0000000 |  0.0000000 |    0.4369018 |    51.8514404 |        51.8514404 |      -51.8514404 |         0.0000000 |       45.6271400 |         0.0000000 |         0.0098722 |          0.0000000 |
|  833 | head.layers.17                                 | torch.Tensor.permute                                                          | torch.Tensor.permute                                                    | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.0000000 |    0.4369018 |    51.8514404 |        51.8514404 |      -51.8514404 |         0.0000000 |       45.6271400 |         0.0000000 |         0.0098722 |          0.0000000 |
|  834 | head.layers.17                                 | torch.Tensor.contiguous                                                       | torch.Tensor.contiguous                                                 | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.0000000 |    0.4369018 |    51.8514404 |        51.8514404 |      -51.8514404 |         0.0000000 |       45.6271400 |         0.0000000 |         0.0098722 |          0.0000000 |
|  835 | head.layers.17                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.0000000 |    0.4369018 |    51.8514404 |        51.8514404 |      -51.8514404 |         0.0000000 |       45.6271400 |         0.0000000 |         0.0098722 |          0.0000000 |
|  836 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 48, 8, 1])   | qint8         | 1.0000000 |  0.0000000 |    0.0208333 |     0.8307908 |         0.8307908 |        0.0000002 |         0.0000000 |        0.8307908 |         0.0000000 |         0.0208333 |          0.0000000 |
|  837 | head.layers.17                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.0000000 |    0.4369018 |    51.8514404 |        51.8514404 |      -51.8514404 |         0.0000000 |       45.6271400 |         0.0000000 |         0.0098722 |          0.0000000 |
|  838 | head.layers.17.feat_mul                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.0000000 |    0.0048432 |     7.4758368 |         7.4758368 |       -7.4758368 |         0.0000000 |        5.8407202 |         0.0000000 |        -0.0000019 |          0.0000000 |
|  839 | head.layers.17                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.0000000 |    0.0048432 |     7.4758368 |         7.4758368 |       -7.4758368 |         0.0000000 |        5.8407202 |         0.0000000 |        -0.0000019 |          0.0000000 |
|  840 | head.layers.17.feat_sum                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.1890101 |     8.1650257 |         8.1650257 |       -8.1650257 |         0.0000000 |        7.5853529 |         0.0000000 |        -0.0000904 |          0.0000000 |
|  841 | head.layers.17.output_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.2839333 |     8.7760124 |         8.7760124 |       -8.7760124 |         0.0000000 |        8.2063465 |         0.0000000 |         0.0127516 |          0.0000000 |
|  842 | head.layers.17.proj_drop                       | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.2839333 |     8.7760124 |         8.7760124 |       -8.7760124 |         0.0000000 |        8.2063465 |         0.0000000 |         0.0127516 |          0.0000000 |
|  843 | head.layers.17.residual_op                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.4073527 |     8.7760124 |         8.7760124 |       -8.7760124 |         0.0000000 |        8.2063465 |         0.0000000 |         0.0072314 |          0.0000000 |
|  844 | head.layers.18.pre_norm                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.5449211 |     8.5039587 |         8.5039587 |       -8.0949011 |         0.0000000 |        8.5039587 |         0.0000000 |        -0.0015801 |          0.0000000 |
|  845 | head.layers.18.layers.0.0                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0000000 |    2.2664034 |    15.3310871 |        15.3310871 |      -15.3310871 |         0.0000000 |       13.3514929 |         0.0000000 |        -1.5495920 |          0.0000000 |
|  846 | head.layers.18.layers.0.2                      | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0000000 |    0.3584056 |    13.3514929 |        13.3514929 |        0.0000000 |         0.0000000 |       13.3514929 |         0.0000000 |         0.3584056 |          0.0000000 |
|  847 | head.layers.18.layers.1                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    2.9118514 |    32.2975426 |        32.2975426 |      -30.0759449 |         0.0000000 |       32.2975426 |         0.0000000 |         0.0589710 |          0.0000000 |
|  848 | head.layers.18.layers.2                        | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    2.9118514 |    32.2975426 |        32.2975426 |      -30.0759449 |         0.0000000 |       32.2975426 |         0.0000000 |         0.0589710 |          0.0000000 |
|  849 | head.layers.18.identity_fc                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  850 | head.layers.18.short_add                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.linear.LinearAdd                          | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    4.0657234 |    38.0096092 |        38.0096092 |      -33.3633575 |         0.0000000 |       38.0096092 |         0.0000000 |         0.0569816 |          0.0000000 |
|  851 | head.layers.19                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.7483887 |     4.2000036 |         4.2000036 |       -4.2000036 |         0.0000000 |        4.1324372 |         0.0000000 |        -0.0037000 |          0.0000000 |
|  852 | head.layers.20.add1                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.9345696 |     8.0928955 |         8.0928955 |       -4.3241525 |         0.0000000 |        8.0928955 |         0.0000000 |         0.0333749 |          0.0000000 |
|  853 | head.layers.20.layers.0                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.3984871 |    10.8278551 |        10.8278551 |       -8.8428926 |         0.0000000 |       10.8278551 |         0.0000000 |        -0.4589201 |          0.0000000 |
|  854 | head.layers.20.layers.1                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4697835 |    10.8278551 |        10.8278551 |        0.0000000 |         0.0000000 |       10.8278551 |         0.0000000 |         0.4697835 |          0.0000000 |
|  855 | head.layers.20.layers.2                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.1264592 |    13.0131950 |        13.0131950 |      -11.9617977 |         0.0000000 |       13.0131950 |         0.0000000 |        -0.3426663 |          0.0000000 |
|  856 | head.layers.20.layers.3                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3918965 |    13.0131950 |        13.0131950 |        0.0000000 |         0.0000000 |       13.0131950 |         0.0000000 |         0.3918965 |          0.0000000 |
|  857 | head.layers.20.layers.4                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.7073647 |     6.9978495 |         6.9978495 |       -0.7974926 |         0.0000000 |        6.9978495 |         0.0000000 |         0.0453741 |          0.0000000 |
|  858 | head.layers.20.layers.5                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.3292676 |     7.9459128 |         7.9459128 |       -7.9459128 |         0.0000000 |        7.3138614 |         0.0000000 |        -0.4884540 |          0.0000000 |
|  859 | head.layers.20.layers.6                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4204068 |     7.3138614 |         7.3138614 |        0.0000000 |         0.0000000 |        7.3138614 |         0.0000000 |         0.4204068 |          0.0000000 |
|  860 | head.layers.20.layers.7                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.2850741 |    19.8699532 |        19.8699532 |       -6.7916288 |         0.0000000 |       19.8699532 |         0.0000000 |        -0.6493746 |          0.0000000 |
|  861 | head.layers.20.layers.8                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3178498 |    19.8699532 |        19.8699532 |        0.0000000 |         0.0000000 |       19.8699532 |         0.0000000 |         0.3178498 |          0.0000000 |
|  862 | head.layers.20.layers.9                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.5259117 |    11.2470684 |        11.2470684 |       -0.8317759 |         0.0000000 |       11.2470684 |         0.0000000 |         0.0339071 |          0.0000000 |
|  863 | head.layers.20.layers.10                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.0000000 |    0.7640033 |     8.5224724 |         8.5224724 |       -8.5224724 |         0.0000000 |        6.3673315 |         0.0000000 |        -0.1729554 |          0.0000000 |
|  864 | head.layers.20.layers.11.scale_quant_stub      | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([11])                  | qint16        | 1.0000000 |  0.8807191 |    0.1182117 |     0.3782869 |         0.3782869 |        0.0143771 |         0.0000000 |        1.0027264 |         1.0000000 |         0.2091208 |          0.0909091 |
|  865 | head.layers.20.layers.11.mul                   | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.0000000 |    0.1299120 |     3.2862599 |         3.2862599 |       -3.2862599 |         0.0000000 |        1.6482769 |         0.0000000 |        -0.0331804 |          0.0000000 |
|  866 | head.layers.20.add2                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.5511377 |    5.1586452 |   112.7409058 |       112.7409058 |      -61.3643532 |       -58.0000000 |       65.7300262 |        60.0000000 |         0.7111938 |          0.6780485 |
|  867 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.5511377 |    5.1586452 |   112.7409058 |                   |      -61.3643532 |       -58.0000000 |       65.7300262 |        60.0000000 |         0.7111938 |          0.6780485 |
|  868 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.6434982 |    8.8328009 |   112.7409058 |       112.7409058 |      -61.3643532 |       -58.0000000 |       65.7300262 |        60.0000000 |         8.2899294 |          1.2368289 |
|  869 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 | -0.0960667 |   12.7819786 |   160.1282349 |       160.1282349 |      -74.3983536 |         0.0000000 |       59.2099609 |        95.0000000 |        -0.6974280 |          3.4413030 |
|  870 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.2300082 |    6.9705424 |    95.0000000 |        95.0000000 |        0.0000000 |         0.0000000 |       59.2099609 |        95.0000000 |         5.1140084 |          3.4413030 |
|  871 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.1006217 |    2.4010687 |   127.7151108 |       127.7151108 |       -0.9092913 |        -1.0000000 |        4.0663471 |       127.0000000 |         0.0112715 |          1.5994991 |
|  872 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.8675719 |     6.3178120 |         6.3178120 |       -5.5490651 |         0.0000000 |        6.3178120 |         0.0000000 |        -0.2287304 |          0.0000000 |
|  873 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.3194208 |     6.3178120 |         6.3178120 |        0.0000000 |         0.0000000 |        6.3178120 |         0.0000000 |         0.3194208 |          0.0000000 |
|  874 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.6450923 |     6.3500824 |         6.3500824 |       -0.9787710 |         0.0000000 |        6.3500824 |         0.0000000 |         0.0790837 |          0.0000000 |
|  875 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.1443220 |     6.1304927 |         6.1304927 |       -6.1304927 |         0.0000000 |        5.7159214 |         0.0000000 |        -0.0556180 |          0.0000000 |
|  876 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.5443520 |     5.7159214 |         5.7159214 |        0.0000000 |         0.0000000 |        5.7159214 |         0.0000000 |         0.5443521 |          0.0000000 |
|  877 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.7973275 |     5.7303667 |         5.7303667 |       -0.8680604 |         0.0000000 |        5.7303667 |         0.0000000 |         0.0264729 |          0.0000000 |
|  878 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.3051027 |     8.3934813 |         8.3934813 |       -5.6268206 |         0.0000000 |        8.3934813 |         0.0000000 |        -0.2815331 |          0.0000000 |
|  879 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.5117849 |     8.3934813 |         8.3934813 |        0.0000000 |         0.0000000 |        8.3934813 |         0.0000000 |         0.5117849 |          0.0000000 |
|  880 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.6693519 |     7.4412642 |         7.4412642 |       -0.8784087 |         0.0000000 |        7.4412642 |         0.0000000 |         0.0309142 |          0.0000000 |
|  881 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.8612920 |    0.5051301 |     2.1477909 |         2.1477909 |       -0.6003526 |         0.0000000 |        2.6085229 |         2.0000000 |         0.9141366 |          0.9993490 |
|  882 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0910590 |    0.6533724 |     2.7894330 |         2.7894330 |       -2.7894330 |         0.0000000 |        1.4848489 |         1.0000000 |        -0.3523337 |          0.0858765 |
|  883 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.2769873 |    0.1711643 |     1.4848489 |         1.4848489 |        0.0000000 |         0.0000000 |        1.4848489 |         1.0000000 |         0.1298744 |          0.0858765 |
|  884 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.1658996 |   11.4983606 |   127.5277557 |       127.5277557 |       -0.7205223 |         0.0000000 |        3.9690325 |       127.0000000 |         0.0222566 |         10.9063110 |
|  885 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4255763 |     1.7687529 |         1.7687529 |       -1.7687529 |         0.0000000 |        1.6264855 |         0.0000000 |        -0.0234206 |          0.0000000 |
|  886 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2010778 |     1.6264855 |         1.6264855 |        0.0000000 |         0.0000000 |        1.6264855 |         0.0000000 |         0.2010778 |          0.0000000 |
|  887 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7851445 |     3.4264002 |         3.4264002 |       -0.9041424 |         0.0000000 |        3.4264002 |         0.0000000 |         0.0094727 |          0.0000000 |
|  888 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7005212 |     2.2668788 |         2.2668788 |       -2.1040955 |         0.0000000 |        2.2668788 |         0.0000000 |        -0.0719134 |          0.0000000 |
|  889 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.3143039 |     2.2668788 |         2.2668788 |        0.0000000 |         0.0000000 |        2.2668788 |         0.0000000 |         0.3143040 |          0.0000000 |
|  890 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7637061 |     3.5824089 |         3.5824089 |       -0.8194100 |         0.0000000 |        3.5824089 |         0.0000000 |         0.0119253 |          0.0000000 |
|  891 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7424819 |     3.0190744 |         3.0190744 |       -2.6853678 |         0.0000000 |        3.0190744 |         0.0000000 |         0.1328448 |          0.0000000 |
|  892 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4376634 |     3.0190744 |         3.0190744 |        0.0000000 |         0.0000000 |        3.0190744 |         0.0000000 |         0.4376633 |          0.0000000 |
|  893 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7023818 |     3.9347405 |         3.9347405 |       -1.2558914 |         0.0000000 |        3.9347405 |         0.0000000 |         0.0287289 |          0.0000000 |
|  894 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 2])          | qint16        | 1.0000000 | -0.7828858 |    0.9051371 |     2.2414498 |         2.2414498 |       -1.6228926 |         0.0000000 |        0.1482156 |         1.0000000 |        -0.5218878 |          0.3750000 |
|  895 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0165305 |    0.4777793 |     2.1071157 |         2.1071157 |       -1.4229622 |         0.0000000 |        1.5143749 |         2.0000000 |         0.1349649 |          0.0859375 |
|  896 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0385343 |    0.3438053 |     2.0000000 |         2.0000000 |        0.0000000 |         0.0000000 |        1.5143749 |         2.0000000 |         0.2689390 |          0.0859375 |
|  897 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 | -0.1429918 |    8.6986561 |   127.9660339 |       127.9660339 |       -1.2629845 |         0.0000000 |        3.0178025 |       127.0000000 |        -0.0003738 |          7.9375005 |
|  898 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6153246 |     3.2677333 |         3.2677333 |       -3.2677333 |         0.0000000 |        1.9005702 |         0.0000000 |        -0.1690043 |          0.0000000 |
|  899 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2231602 |     1.9005702 |         1.9005702 |        0.0000000 |         0.0000000 |        1.9005702 |         0.0000000 |         0.2231602 |          0.0000000 |
|  900 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7503112 |     4.0712299 |         4.0712299 |       -0.9146640 |         0.0000000 |        4.0712299 |         0.0000000 |         0.0090806 |          0.0000000 |
|  901 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.8303535 |     3.7476013 |         3.7476013 |       -3.7476013 |         0.0000000 |        2.5236619 |         0.0000000 |        -0.2536897 |          0.0000000 |
|  902 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2883320 |     2.5236619 |         2.5236619 |        0.0000000 |         0.0000000 |        2.5236619 |         0.0000000 |         0.2883320 |          0.0000000 |
|  903 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7906388 |     3.7466285 |         3.7466285 |       -0.9110932 |         0.0000000 |        3.7466285 |         0.0000000 |         0.0136148 |          0.0000000 |
|  904 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    1.0365450 |     5.1873231 |         5.1873231 |       -5.1873231 |         0.0000000 |        2.7132473 |         0.0000000 |        -0.5068019 |          0.0000000 |
|  905 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2648716 |     2.7132473 |         2.7132473 |        0.0000000 |         0.0000000 |        2.7132473 |         0.0000000 |         0.2648716 |          0.0000000 |
|  906 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6330777 |     4.9305153 |         4.9305153 |       -0.9855185 |         0.0000000 |        4.9305153 |         0.0000000 |         0.0795994 |          0.0000000 |
|  907 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.0000000 |    8.9736776 |    46.2724419 |        46.2724419 |      -46.2724419 |         0.0000000 |       15.6435757 |         0.0000000 |        -6.2484307 |          0.0000000 |
|  908 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0642948 |    5.9330621 |    28.4714832 |        28.4714832 |      -28.4714832 |         0.0000000 |       25.8244381 |         1.0000000 |        -0.2907393 |          0.2031250 |
|  909 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.2847882 |    2.7984877 |    24.8244381 |        24.8244381 |        0.0000000 |         0.0000000 |       25.8244381 |         1.0000000 |         2.8438358 |          0.2031250 |
|  910 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.1393594 |   26.3542252 |   127.8396225 |       127.8396225 |       -0.8982413 |         0.0000000 |        3.5534239 |       127.0000000 |         0.0183858 |         25.7968769 |
|  911 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.5562304 |     3.1954367 |         3.1954367 |       -2.8887441 |         0.0000000 |        3.1954367 |         0.0000000 |        -0.1591597 |          0.0000000 |
|  912 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.1985354 |     3.1954367 |         3.1954367 |        0.0000000 |         0.0000000 |        3.1954367 |         0.0000000 |         0.1985354 |          0.0000000 |
|  913 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7644939 |     4.2025633 |         4.2025633 |       -0.9052612 |         0.0000000 |        4.2025633 |         0.0000000 |         0.0363908 |          0.0000000 |
|  914 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.9070168 |     4.7533975 |         4.7533975 |       -4.7533975 |         0.0000000 |        4.1327453 |         0.0000000 |        -0.1576475 |          0.0000000 |
|  915 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3746846 |     4.1327453 |         4.1327453 |        0.0000000 |         0.0000000 |        4.1327453 |         0.0000000 |         0.3746846 |          0.0000000 |
|  916 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7581543 |     5.6340590 |         5.6340590 |       -0.8833795 |         0.0000000 |        5.6340590 |         0.0000000 |         0.0204479 |          0.0000000 |
|  917 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.8970052 |     4.5774674 |         4.5774674 |       -4.5774674 |         0.0000000 |        4.4371643 |         0.0000000 |        -0.1936268 |          0.0000000 |
|  918 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3516892 |     4.4371643 |         4.4371643 |        0.0000000 |         0.0000000 |        4.4371643 |         0.0000000 |         0.3516892 |          0.0000000 |
|  919 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.6276847 |     5.3349290 |         5.3349290 |       -0.8188648 |         0.0000000 |        5.3349290 |         0.0000000 |         0.0283618 |          0.0000000 |
|  920 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6585295 |     7.4412642 |         7.4412642 |       -1.2558914 |         0.0000000 |        7.4412642 |         0.0000000 |         0.0360886 |          0.0000000 |
|  921 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 256, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
|  922 | head.layers.21.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.7034591 |     7.4412642 |     16255.4415450 |       -4.2000036 |         0.0000000 |        7.4412642 |         0.0000000 |         0.0161943 |          0.0000000 |
|  923 | head.layers.21.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
|  924 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.7034591 |     7.4412642 |     16255.4415450 |       -4.2000036 |         0.0000000 |        7.4412642 |         0.0000000 |         0.0161943 |          0.0000000 |
|  925 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
|  926 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
|  927 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.7034591 |     7.4412642 |     16255.4415450 |       -4.2000036 |         0.0000000 |        7.4412642 |         0.0000000 |         0.0161943 |          0.0000000 |
|  928 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
|  929 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
|  930 | head.layers.21.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.9044840 |    12.4330578 |        12.4330578 |       -9.9143467 |         0.0000000 |       12.4330578 |         0.0000000 |         0.0534816 |          0.0000000 |
|  931 | head.layers.21.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.6057475 |     7.8649473 |         7.8649473 |       -7.8649473 |         0.0000000 |        7.5044727 |         0.0000000 |        -0.0093173 |          0.0000000 |
|  932 | head.layers.21.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.0889345 |     1.2035409 |         1.2035409 |       -1.1944959 |         0.0000000 |        1.2035409 |         0.0000000 |         0.0015683 |          0.0000000 |
|  933 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.9044840 |    12.4330578 |        12.4330578 |       -9.9143467 |         0.0000000 |       12.4330578 |         0.0000000 |         0.0534816 |          0.0000000 |
|  934 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.9044840 |    12.4330578 |        12.4330578 |       -9.9143467 |         0.0000000 |       12.4330578 |         0.0000000 |         0.0534816 |          0.0000000 |
|  935 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  936 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  937 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.0889345 |     1.2035409 |         1.2035409 |       -1.1944959 |         0.0000000 |        1.2035409 |         0.0000000 |         0.0015683 |          0.0000000 |
|  938 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 256, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.0889345 |     1.2035409 |         1.2035409 |       -1.1944959 |         0.0000000 |        1.2035409 |         0.0000000 |         0.0015683 |          0.0000000 |
|  939 | head.layers.21.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.0000000 |    0.2380605 |     1.5541322 |        12.4330578 |       -1.2392933 |         0.0000000 |        1.5541322 |         0.0000000 |         0.0066852 |          0.0000000 |
|  940 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  941 | head.layers.21.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |   12.1561575 |   108.4993439 |       108.4993439 |      -94.2106934 |         0.0000000 |      108.4993439 |         0.0000000 |        -2.4106576 |          0.0000000 |
|  942 | head.layers.21.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999995 |         0.9999995 |        0.0000000 |         0.0000000 |        0.9999995 |         0.0000000 |         0.0039062 |          0.0000000 |
|  943 | head.layers.21.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999995 |         0.9999995 |        0.0000000 |         0.0000000 |        0.9999995 |         0.0000000 |         0.0039062 |          0.0000000 |
|  944 | head.layers.21.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.0746770 |     1.0618507 |         1.0618507 |       -1.0462036 |         0.0000000 |        1.0618507 |         0.0000000 |         0.0027198 |          0.0000000 |
|  945 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  946 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  947 | head.layers.21.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1253052 |     1.0275482 |         1.0275482 |       -0.9551347 |         0.0000000 |        1.0275482 |         0.0000000 |         0.0035470 |          0.0000000 |
|  948 | head.layers.21.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999995 |         0.9999995 |        0.0000000 |         0.0000000 |        0.9999995 |         0.0000000 |         0.0039062 |          0.0000000 |
|  949 | head.layers.21.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.5309081 |         0.5309081 |        0.0000000 |         0.0000000 |        0.5309081 |         0.0000000 |         0.0039062 |          0.0000000 |
|  950 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  951 | head.layers.21.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1253052 |     1.0275482 |         1.0275482 |       -0.9551347 |         0.0000000 |        1.0275482 |         0.0000000 |         0.0035470 |          0.0000000 |
|  952 | head.layers.21.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.0000000 |    0.6939830 |     7.5777302 |         7.5777302 |       -4.4418135 |         0.0000000 |        7.5777302 |         0.0000000 |         0.0197413 |          0.0000000 |
|  953 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.0000000 |    0.6702791 |    11.0510941 |      7242.3344907 |      -11.0510941 |         0.0000000 |       10.6080065 |         0.0000000 |         0.0072260 |          0.0000000 |
|  954 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1816115 |     4.1605411 |     27266.1058272 |       -4.1605411 |         0.0000000 |        3.5609729 |         0.0000000 |         0.0008738 |          0.0000000 |
|  955 | head.layers.22.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6644043 |    11.0510941 |     24141.1149692 |      -11.0510941 |         0.0000000 |       10.6080065 |         0.0000000 |         0.0216573 |          0.0000000 |
|  956 | head.layers.22.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0002278 |  0.0000000 |    0.6644043 |    11.0510941 |     48522.5245431 |      -11.0510941 |         0.0000000 |       10.6080065 |         0.0000000 |         0.0216573 |          0.0000000 |
|  957 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6644043 |    11.0510941 |     24141.1149692 |      -11.0510941 |         0.0000000 |       10.6080065 |         0.0000000 |         0.0216573 |          0.0000000 |
|  958 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0002278 |  0.0000000 |    0.6644043 |    11.0510941 |     48522.5245431 |      -11.0510941 |         0.0000000 |       10.6080065 |         0.0000000 |         0.0216573 |          0.0000000 |
|  959 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1816115 |     4.1605411 |     27266.1058272 |       -4.1605411 |         0.0000000 |        3.5609729 |         0.0000000 |         0.0008738 |          0.0000000 |
|  960 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6644043 |    11.0510941 |     24141.1149692 |      -11.0510941 |         0.0000000 |       10.6080065 |         0.0000000 |         0.0216573 |          0.0000000 |
|  961 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0002278 |  0.0000000 |    0.6644043 |    11.0510941 |     48522.5245431 |      -11.0510941 |         0.0000000 |       10.6080065 |         0.0000000 |         0.0216573 |          0.0000000 |
|  962 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1816115 |     4.1605411 |     27266.1058272 |       -4.1605411 |         0.0000000 |        3.5609729 |         0.0000000 |         0.0008738 |          0.0000000 |
|  963 | head.layers.22.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.2300379 |    11.0944195 |        11.0944195 |      -11.0944195 |         0.0000000 |        9.0824242 |         0.0000000 |         0.0519139 |          0.0000000 |
|  964 | head.layers.22.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.3955388 |    14.8100872 |        14.8100872 |      -14.8100872 |         0.0000000 |       14.4945602 |         0.0000000 |         0.0122217 |          0.0000000 |
|  965 | head.layers.22.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2112837 |     3.8896461 |         3.8896461 |       -3.6132488 |         0.0000000 |        3.8896461 |         0.0000000 |         0.0110019 |          0.0000000 |
|  966 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.2300379 |    11.0944195 |        11.0944195 |      -11.0944195 |         0.0000000 |        9.0824242 |         0.0000000 |         0.0519139 |          0.0000000 |
|  967 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.2300379 |    11.0944195 |        11.0944195 |      -11.0944195 |         0.0000000 |        9.0824242 |         0.0000000 |         0.0519139 |          0.0000000 |
|  968 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.3955388 |    14.8100872 |        14.8100872 |      -14.8100872 |         0.0000000 |       14.4945602 |         0.0000000 |         0.0122217 |          0.0000000 |
|  969 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.3955388 |    14.8100872 |        14.8100872 |      -14.8100872 |         0.0000000 |       14.4945602 |         0.0000000 |         0.0122217 |          0.0000000 |
|  970 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.2112837 |     3.8896461 |         3.8896461 |       -3.6132488 |         0.0000000 |        3.8896461 |         0.0000000 |         0.0110019 |          0.0000000 |
|  971 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.2112837 |     3.8896461 |         3.8896461 |       -3.6132488 |         0.0000000 |        3.8896461 |         0.0000000 |         0.0110019 |          0.0000000 |
|  972 | head.layers.22.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.0000000 |    0.1537547 |     1.3868024 |        11.0944195 |       -1.3868024 |         0.0000000 |        1.1353030 |         0.0000000 |         0.0064892 |          0.0000000 |
|  973 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  974 | head.layers.22.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    6.5847111 |   125.3901672 |       125.3901672 |      -76.7403641 |         0.0000000 |      125.3901672 |         0.0000000 |         1.1438934 |          0.0000000 |
|  975 | head.layers.22.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9998353 |         0.9998353 |        0.0000000 |         0.0000000 |        0.9998353 |         0.0000000 |         0.0019531 |          0.0000000 |
|  976 | head.layers.22.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9998353 |         0.9998353 |        0.0000000 |         0.0000000 |        0.9998353 |         0.0000000 |         0.0019531 |          0.0000000 |
|  977 | head.layers.22.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1590513 |     3.5368543 |         3.5368543 |       -2.2391102 |         0.0000000 |        3.5368543 |         0.0000000 |         0.0127794 |          0.0000000 |
|  978 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  979 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  980 | head.layers.22.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2632511 |     1.6344448 |         1.6344448 |       -1.6344448 |         0.0000000 |        1.5758030 |         0.0000000 |         0.0173194 |          0.0000000 |
|  981 | head.layers.22.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 512])     | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9998353 |         0.9998353 |        0.0000000 |         0.0000000 |        0.9998353 |         0.0000000 |         0.0019531 |          0.0000000 |
|  982 | head.layers.22.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.2491230 |         0.2491230 |        0.0000001 |         0.0000000 |        0.2491230 |         0.0000000 |         0.0019531 |          0.0000000 |
|  983 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  984 | head.layers.22.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2632511 |     1.6344448 |         1.6344448 |       -1.6344448 |         0.0000000 |        1.5758030 |         0.0000000 |         0.0173194 |          0.0000000 |
|  985 | head.layers.22.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.0000000 |    0.6977998 |    10.7000008 |        10.7000008 |      -10.7000008 |         0.0000000 |        9.9161015 |         0.0000000 |         0.0389767 |          0.0000000 |
|  986 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.0000000 |    1.2410706 |    30.1961613 |     19789.0542930 |      -30.1961613 |         0.0000000 |       23.8093109 |         0.0000000 |        -0.0230869 |          0.0000000 |
|  987 | head.layers.23                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.5425187 |     6.9244270 |         6.9244270 |       -6.9244270 |         0.0000000 |        5.6307597 |         0.0000000 |        -0.0015870 |          0.0000000 |
|  988 | head.layers.24.kps_generator.offset            | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 24])         | qint16        | 1.0000000 |  0.0000000 |    1.1529770 |     7.6783643 |         7.6783643 |       -7.6783643 |         0.0000000 |        5.2292757 |         0.0000000 |        -0.3377497 |          0.0000000 |
|  989 | head.layers.24.kps_generator                   | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.0000000 |    1.1529770 |     7.6783643 |         7.6783643 |       -7.6783643 |         0.0000000 |        5.2292757 |         0.0000000 |        -0.3377497 |          0.0000000 |
|  990 | head.layers.24.kps_generator                   | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 3])       | qint16        | 1.0000000 |  0.6434982 |    8.8328009 |   112.7409058 |       112.7409058 |      -61.3643532 |       -58.0000000 |       65.7300262 |        60.0000000 |         8.2899294 |          1.2368289 |
|  991 | head.layers.24.kps_generator.keypoints_add     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.6485652 |    9.2211065 |   113.7320328 |       113.7320328 |      -65.0653076 |       -58.0000000 |       68.2372742 |        60.0000000 |         7.9521799 |          1.2368289 |
|  992 | head.layers.24.weight_add                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.8958105 |     8.1879025 |         8.1879025 |       -7.4442348 |         0.0000000 |        8.1879025 |         0.0000000 |         0.0345015 |          0.0000000 |
|  993 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 3, 4])         | qint16        | 1.0000000 |  0.9977293 |    0.2101564 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3522723 |          0.3472222 |
|  994 | head.layers.24                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 6, 12])           | qint16        | 1.0000000 |  0.9977293 |    0.2101564 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3522723 |          0.3472222 |
|  995 | head.layers.24.camera_encoder.0                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.7045760 |     6.2766805 |         6.2766805 |       -6.2766805 |         0.0000000 |        6.1695027 |         0.0000000 |        -0.0835410 |          0.0000000 |
|  996 | head.layers.24.camera_encoder.1                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.3105174 |     6.1695027 |         6.1695027 |        0.0000000 |         0.0000000 |        6.1695027 |         0.0000000 |         0.3105174 |          0.0000000 |
|  997 | head.layers.24.camera_encoder.2                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.7858711 |     4.2228565 |         4.2228565 |       -0.8781247 |         0.0000000 |        4.2228565 |         0.0000000 |         0.0260929 |          0.0000000 |
|  998 | head.layers.24.camera_encoder.3                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    2.1815076 |    33.1354103 |        33.1354103 |      -14.0334568 |         0.0000000 |       33.1354103 |         0.0000000 |        -0.2934401 |          0.0000000 |
|  999 | head.layers.24.camera_encoder.4                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.9440338 |    33.1354103 |        33.1354103 |        0.0000000 |         0.0000000 |       33.1354103 |         0.0000000 |         0.9440337 |          0.0000000 |
| 1000 | head.layers.24.camera_encoder.5                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.4514041 |     7.7228999 |         7.7228999 |       -1.1285971 |         0.0000000 |        7.7228999 |         0.0000000 |         0.0302501 |          0.0000000 |
| 1001 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.8958105 |     8.1879025 |         8.1879025 |       -7.4442348 |         0.0000000 |        8.1879025 |         0.0000000 |         0.0345015 |          0.0000000 |
| 1002 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 6, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.4514041 |     7.7228999 |         7.7228999 |       -1.1285971 |         0.0000000 |        7.7228999 |         0.0000000 |         0.0302501 |          0.0000000 |
| 1003 | head.layers.24.cam_add                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 6, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.9820822 |    14.8980255 |        14.8980255 |       -6.0021701 |         0.0000000 |       14.8980255 |         0.0000000 |         0.0647516 |          0.0000000 |
| 1004 | head.layers.24.weights_fc                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 6, 64])      | qint8         | 1.0000000 |  0.0000000 |    1.8995733 |    10.3969002 |        10.3969002 |      -10.3969002 |         0.0000000 |        7.8918357 |         0.0000000 |        -0.3848869 |          0.0000000 |
| 1005 | head.layers.24                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.0000000 |    1.8995733 |    10.3969002 |        10.3969002 |      -10.3969002 |         0.0000000 |        7.8918357 |         0.0000000 |        -0.3848869 |          0.0000000 |
| 1006 | head.layers.24.weight_softmax                  | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.0000000 |    0.0208333 |     0.7735062 |         0.7735062 |        0.0000000 |         0.0000000 |        0.7735062 |         0.0000000 |         0.0208333 |          0.0000000 |
| 1007 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  0.8175919 |    9.8033638 |   113.7320328 |       113.7320328 |      -65.0653076 |       -58.0000000 |       68.2372742 |        60.0000000 |        10.4273148 |          5.5649042 |
| 1008 | head.layers.24                                 | torch.ones_like                                                               | torch.ones_like                                                         | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
| 1009 | head.layers.24.point_quant_stub                | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
| 1010 | head.layers.24.point_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 8, 4])       | qint16        | 1.0000000 |  0.6487116 |    6.9158301 |   113.7320328 |       113.7320328 |      -65.0653076 |       -58.0000000 |       68.2372742 |        60.0000000 |         6.2141352 |          1.1776217 |
| 1011 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 1, 1, 4, 4])   | qint16        | 1.0000000 |  0.9977411 |    0.1576173 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3267042 |          0.3229167 |
| 1012 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 512, 8, 1, 4]) | qint16        | 1.0000000 |  0.6487116 |    6.9158301 |   113.7320328 |       113.7320328 |      -65.0653076 |       -58.0000000 |       68.2372742 |        60.0000000 |         6.2141352 |          1.1776217 |
| 1013 | head.layers.24.point_matmul                    | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 4, 4]) | qint16        | 1.0000000 |  0.0757400 |    5.1338153 |   783.4978027 |       783.4978027 |     -353.4978027 |      -180.0000000 |      244.6639252 |       440.0000000 |        -0.1840860 |          1.5117548 |
| 1014 | head.layers.24.point_sum                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 6, 512, 8, 4])    | qint16        | 0.0244144 |  0.0685865 |   15.9521198 |   817.6738281 |     33491.4089617 |     -310.4479065 |      -180.0076294 |      315.5769653 |       518.0010376 |        -0.7363439 |          6.0472698 |
| 1015 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.7165800 |   11.4876289 |   125.4466248 |      5138.2153471 |      -73.7036285 |       -63.9902344 |       73.8060379 |        61.9882507 |        -0.9072191 |          1.5886035 |
| 1016 | head.layers.24                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.8128527 |    6.1429977 |    57.8898315 |      2371.1313194 |        0.0100000 |         0.0000000 |       73.8060379 |        61.9882507 |        11.2296143 |         12.4062328 |
| 1017 | head.layers.24.reciprocal_op                   | horizon_plugin_pytorch.nn.reciprocal.Reciprocal                               | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.7906966 |   46.5968971 |    99.9816895 |     32761.5000992 |        0.0135490 |         0.0152590 |      100.0000000 |         1.2787061 |        47.0287170 |          0.6503971 |
| 1018 | head.layers.24                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.7906966 |   46.5968971 |    99.9816895 |     32761.5000992 |        0.0135490 |         0.0152590 |      100.0000000 |         1.2787061 |        47.0287170 |          0.6503971 |
| 1019 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0244144 | -0.0425870 |   26.1599293 |   817.6738281 |     33491.4089617 |     -310.4479065 |      -180.0076294 |      315.5769653 |       518.0010376 |        -1.5190781 |         10.7997427 |
| 1020 | head.layers.24.point_mul                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.4639363 | 1419.8791504 | 31547.6972656 | 103373917.0392053 |   -31044.7910156 |       -10.0001526 |    31557.6972656 |         9.9998474 |       627.1473389 |          2.1499586 |
| 1021 | head.layers.24                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.4628072 |    0.6102868 |     2.1998703 |      7208.4251644 |       -1.1000000 |        -1.0998703 |        1.1000000 |         1.0998703 |         0.1073882 |          0.2038586 |
| 1022 | head.layers.24                                 | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([156, 512, 8, 2])      | qint16        | 0.0003052 |  0.4628072 |    0.6102868 |     2.1998703 |      7208.4251644 |       -1.1000000 |        -1.0998703 |        1.1000000 |         1.0998703 |         0.1073882 |          0.2038586 |
| 1023 | head.layers.24                                 | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer            | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer      | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.0000000 |    0.4793736 |    54.5699921 |        54.5699921 |      -54.5699921 |         0.0000000 |       49.5862045 |         0.0000000 |         0.0163597 |          0.0000000 |
| 1024 | head.layers.24.feat_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.0000000 |    0.4793736 |    54.5699921 |        54.5699921 |      -54.5699921 |         0.0000000 |       49.5862045 |         0.0000000 |         0.0163597 |          0.0000000 |
| 1025 | head.layers.24                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 256, 512, 8])  | qint8         | 1.0000000 |  0.0000000 |    0.4793736 |    54.5699921 |        54.5699921 |      -54.5699921 |         0.0000000 |       49.5862045 |         0.0000000 |         0.0163597 |          0.0000000 |
| 1026 | head.layers.24                                 | torch.Tensor.permute                                                          | torch.Tensor.permute                                                    | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.0000000 |    0.4793736 |    54.5699921 |        54.5699921 |      -54.5699921 |         0.0000000 |       49.5862045 |         0.0000000 |         0.0163597 |          0.0000000 |
| 1027 | head.layers.24                                 | torch.Tensor.contiguous                                                       | torch.Tensor.contiguous                                                 | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.0000000 |    0.4793736 |    54.5699921 |        54.5699921 |      -54.5699921 |         0.0000000 |       49.5862045 |         0.0000000 |         0.0163597 |          0.0000000 |
| 1028 | head.layers.24                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.0000000 |    0.4793736 |    54.5699921 |        54.5699921 |      -54.5699921 |         0.0000000 |       49.5862045 |         0.0000000 |         0.0163597 |          0.0000000 |
| 1029 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 48, 8, 1])   | qint8         | 1.0000000 |  0.0000000 |    0.0208333 |     0.7735062 |         0.7735062 |        0.0000000 |         0.0000000 |        0.7735062 |         0.0000000 |         0.0208333 |          0.0000000 |
| 1030 | head.layers.24                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.0000000 |    0.4793736 |    54.5699921 |        54.5699921 |      -54.5699921 |         0.0000000 |       49.5862045 |         0.0000000 |         0.0163597 |          0.0000000 |
| 1031 | head.layers.24.feat_mul                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.0000000 |    0.0058296 |     6.6838832 |         6.6838832 |       -6.4750972 |         0.0000000 |        6.6838832 |         0.0000000 |         0.0001090 |          0.0000000 |
| 1032 | head.layers.24                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.0000000 |    0.0058296 |     6.6838832 |         6.6838832 |       -6.4750972 |         0.0000000 |        6.6838832 |         0.0000000 |         0.0001090 |          0.0000000 |
| 1033 | head.layers.24.feat_sum                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.2311858 |     8.9435978 |         8.9435978 |       -8.8248577 |         0.0000000 |        8.9435978 |         0.0000000 |         0.0052338 |          0.0000000 |
| 1034 | head.layers.24.output_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3510144 |     9.4825163 |         9.4825163 |       -9.1316538 |         0.0000000 |        9.4825163 |         0.0000000 |        -0.0058128 |          0.0000000 |
| 1035 | head.layers.24.proj_drop                       | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3510144 |     9.4825163 |         9.4825163 |       -9.1316538 |         0.0000000 |        9.4825163 |         0.0000000 |        -0.0058128 |          0.0000000 |
| 1036 | head.layers.24.residual_op                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.4467666 |     9.4825163 |         9.4825163 |       -9.1316538 |         0.0000000 |        9.4825163 |         0.0000000 |        -0.0036999 |          0.0000000 |
| 1037 | head.layers.25.pre_norm                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.5569131 |     8.1537991 |         8.1537991 |       -7.2048120 |         0.0000000 |        8.1537991 |         0.0000000 |         0.0009042 |          0.0000000 |
| 1038 | head.layers.25.layers.0.0                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0000000 |    2.1997843 |    15.2922630 |        15.2922630 |      -15.2922630 |         0.0000000 |       12.9503460 |         0.0000000 |        -1.3066665 |          0.0000000 |
| 1039 | head.layers.25.layers.0.2                      | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0000000 |    0.4465589 |    12.9503460 |        12.9503460 |        0.0000000 |         0.0000000 |       12.9503460 |         0.0000000 |         0.4465590 |          0.0000000 |
| 1040 | head.layers.25.layers.1                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    4.7284975 |    51.7517738 |        51.7517738 |      -51.7517738 |         0.0000000 |       51.1047478 |         0.0000000 |        -0.0818334 |          0.0000000 |
| 1041 | head.layers.25.layers.2                        | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    4.7284975 |    51.7517738 |        51.7517738 |      -51.7517738 |         0.0000000 |       51.1047478 |         0.0000000 |        -0.0818334 |          0.0000000 |
| 1042 | head.layers.25.identity_fc                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1043 | head.layers.25.short_add                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.linear.LinearAdd                          | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    5.8996563 |    54.8875656 |        54.8875656 |      -54.8875656 |         0.0000000 |       50.8244514 |         0.0000000 |        -0.0672674 |          0.0000000 |
| 1044 | head.layers.26                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.7648574 |     4.9372416 |         4.9372416 |       -4.9372416 |         0.0000000 |        3.5475037 |         0.0000000 |        -0.0022996 |          0.0000000 |
| 1045 | head.layers.27.add1                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.9779322 |     8.0151825 |         8.0151825 |       -4.0637159 |         0.0000000 |        8.0151825 |         0.0000000 |         0.0337890 |          0.0000000 |
| 1046 | head.layers.27.layers.0                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.4467862 |     8.0223417 |         8.0223417 |       -8.0223417 |         0.0000000 |        7.2135186 |         0.0000000 |        -0.6490504 |          0.0000000 |
| 1047 | head.layers.27.layers.1                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3988679 |     7.2135186 |         7.2135186 |        0.0000000 |         0.0000000 |        7.2135186 |         0.0000000 |         0.3988679 |          0.0000000 |
| 1048 | head.layers.27.layers.2                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.9695734 |     7.7701049 |         7.7701049 |       -6.9818234 |         0.0000000 |        7.7701049 |         0.0000000 |        -0.3061866 |          0.0000000 |
| 1049 | head.layers.27.layers.3                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3316934 |     7.7701049 |         7.7701049 |        0.0000000 |         0.0000000 |        7.7701049 |         0.0000000 |         0.3316934 |          0.0000000 |
| 1050 | head.layers.27.layers.4                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.7236729 |     7.3658223 |         7.3658223 |       -0.8007023 |         0.0000000 |        7.3658223 |         0.0000000 |         0.0454494 |          0.0000000 |
| 1051 | head.layers.27.layers.5                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.2480824 |     8.6791306 |         8.6791306 |       -7.7038989 |         0.0000000 |        8.6791306 |         0.0000000 |        -0.5311908 |          0.0000000 |
| 1052 | head.layers.27.layers.6                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3584458 |     8.6791306 |         8.6791306 |        0.0000000 |         0.0000000 |        8.6791306 |         0.0000000 |         0.3584458 |          0.0000000 |
| 1053 | head.layers.27.layers.7                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.1286650 |    17.0688763 |        17.0688763 |       -5.3329172 |         0.0000000 |       17.0688763 |         0.0000000 |        -0.6170568 |          0.0000000 |
| 1054 | head.layers.27.layers.8                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.2558041 |    17.0688763 |        17.0688763 |        0.0000000 |         0.0000000 |       17.0688763 |         0.0000000 |         0.2558041 |          0.0000000 |
| 1055 | head.layers.27.layers.9                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4880283 |    13.0472937 |        13.0472937 |       -0.7497945 |         0.0000000 |       13.0472937 |         0.0000000 |         0.0314139 |          0.0000000 |
| 1056 | head.layers.27.layers.10                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.0000000 |    0.8269141 |     6.4400492 |         6.4400492 |       -6.4400492 |         0.0000000 |        6.1952887 |         0.0000000 |        -0.3097776 |          0.0000000 |
| 1057 | head.layers.27.layers.11.scale_quant_stub      | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([11])                  | qint16        | 1.0000000 |  0.9369276 |    0.0702270 |     0.2814003 |         0.2814003 |        0.0068504 |         0.0000000 |        0.9011661 |         1.0000000 |         0.1431663 |          0.0909091 |
| 1058 | head.layers.27.layers.11.mul                   | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.0000000 |    0.0947815 |     1.7590336 |         1.7590336 |       -1.7590336 |         0.0000000 |        1.1679670 |         0.0000000 |        -0.0626198 |          0.0000000 |
| 1059 | head.layers.27.add2                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.5510878 |    5.2058086 |   112.6915741 |       112.6915741 |      -61.5078621 |       -58.0000000 |       66.8430176 |        60.0000000 |         0.6485740 |          0.6780485 |
| 1060 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.5510878 |    5.2058086 |   112.6915741 |                   |      -61.5078621 |       -58.0000000 |       66.8430176 |        60.0000000 |         0.6485740 |          0.6780485 |
| 1061 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.6436391 |    8.8640957 |   112.6915741 |       112.6915741 |      -61.5078621 |       -58.0000000 |       66.8430176 |        60.0000000 |         8.2159185 |          1.2368289 |
| 1062 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 | -0.0954282 |   12.7691364 |   159.9798279 |       159.9798279 |      -74.2747192 |         0.0000000 |       59.1088486 |        95.0000000 |        -0.6943873 |          3.4413030 |
| 1063 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.2303311 |    6.9656253 |    95.0000000 |        95.0000000 |        0.0000000 |         0.0000000 |       59.1088486 |        95.0000000 |         5.1091242 |          3.4413030 |
| 1064 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0939943 |    2.4025831 |   127.7390060 |       127.7390060 |       -0.9097427 |        -1.0000000 |        4.0337377 |       127.0000000 |         0.0111420 |          1.5994991 |
| 1065 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.8671088 |     6.3165860 |         6.3165860 |       -5.5455065 |         0.0000000 |        6.3165860 |         0.0000000 |        -0.2282844 |          0.0000000 |
| 1066 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.3194122 |     6.3165860 |         6.3165860 |        0.0000000 |         0.0000000 |        6.3165860 |         0.0000000 |         0.3194122 |          0.0000000 |
| 1067 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.6449796 |     6.3496575 |         6.3496575 |       -0.9782009 |         0.0000000 |        6.3496575 |         0.0000000 |         0.0789500 |          0.0000000 |
| 1068 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.1410371 |     6.1297278 |         6.1297278 |       -6.1297278 |         0.0000000 |        5.7151861 |         0.0000000 |        -0.0552755 |          0.0000000 |
| 1069 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.5428808 |     5.7151861 |         5.7151861 |        0.0000000 |         0.0000000 |        5.7151861 |         0.0000000 |         0.5428808 |          0.0000000 |
| 1070 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.7974778 |     5.7253022 |         5.7253022 |       -0.8712199 |         0.0000000 |        5.7253022 |         0.0000000 |         0.0264362 |          0.0000000 |
| 1071 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.3031534 |     8.3899937 |         8.3899937 |       -5.6512837 |         0.0000000 |        8.3899937 |         0.0000000 |        -0.2814313 |          0.0000000 |
| 1072 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.5108610 |     8.3899937 |         8.3899937 |        0.0000000 |         0.0000000 |        8.3899937 |         0.0000000 |         0.5108610 |          0.0000000 |
| 1073 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.6693556 |     7.4364495 |         7.4364495 |       -0.8784916 |         0.0000000 |        7.4364495 |         0.0000000 |         0.0308716 |          0.0000000 |
| 1074 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.8617021 |    0.5041748 |     2.1562436 |         2.1562436 |       -0.6055109 |         0.0000000 |        2.7822609 |         2.0000000 |         0.9175451 |          0.9993490 |
| 1075 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0917329 |    0.6568874 |     2.8829834 |         2.8829834 |       -2.8829834 |         0.0000000 |        1.5346272 |         1.0000000 |        -0.3541370 |          0.0858765 |
| 1076 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.2777917 |    0.1717359 |     1.5346272 |         1.5346272 |        0.0000000 |         0.0000000 |        1.5346272 |         1.0000000 |         0.1310144 |          0.0858765 |
| 1077 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.1665384 |   11.4978018 |   127.5271988 |       127.5271988 |       -0.7208093 |         0.0000000 |        3.9646430 |       127.0000000 |         0.0221620 |         10.9063110 |
| 1078 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4261665 |     1.7570109 |         1.7570109 |       -1.7570109 |         0.0000000 |        1.6242826 |         0.0000000 |        -0.0208846 |          0.0000000 |
| 1079 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2026409 |     1.6242826 |         1.6242826 |        0.0000000 |         0.0000000 |        1.6242826 |         0.0000000 |         0.2026409 |          0.0000000 |
| 1080 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7850217 |     3.3910110 |         3.3910110 |       -0.9027014 |         0.0000000 |        3.3910110 |         0.0000000 |         0.0095553 |          0.0000000 |
| 1081 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6992651 |     2.2632139 |         2.2632139 |       -2.1258695 |         0.0000000 |        2.2632139 |         0.0000000 |        -0.0731953 |          0.0000000 |
| 1082 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.3130350 |     2.2632139 |         2.2632139 |        0.0000000 |         0.0000000 |        2.2632139 |         0.0000000 |         0.3130350 |          0.0000000 |
| 1083 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7702303 |     3.5817254 |         3.5817254 |       -0.8094202 |         0.0000000 |        3.5817254 |         0.0000000 |         0.0125229 |          0.0000000 |
| 1084 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7444050 |     3.0098939 |         3.0098939 |       -2.6565943 |         0.0000000 |        3.0098939 |         0.0000000 |         0.1313720 |          0.0000000 |
| 1085 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4378885 |     3.0098939 |         3.0098939 |        0.0000000 |         0.0000000 |        3.0098939 |         0.0000000 |         0.4378885 |          0.0000000 |
| 1086 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7004070 |     3.9460886 |         3.9460886 |       -1.1828291 |         0.0000000 |        3.9460886 |         0.0000000 |         0.0295590 |          0.0000000 |
| 1087 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 2])          | qint16        | 1.0000000 | -0.7941302 |    0.9077629 |     2.2018354 |         2.2018354 |       -1.5826559 |         0.0000000 |        0.1379656 |         1.0000000 |        -0.5278481 |          0.3750000 |
| 1088 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0131479 |    0.4798370 |     2.0966923 |         2.0966923 |       -1.4054055 |         0.0000000 |        1.4878817 |         2.0000000 |         0.1358715 |          0.0859375 |
| 1089 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0364760 |    0.3454874 |     2.0000000 |         2.0000000 |        0.0000000 |         0.0000000 |        1.4878817 |         2.0000000 |         0.2702211 |          0.0859375 |
| 1090 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 | -0.1477628 |    8.6980686 |   127.9639359 |       127.9639359 |       -1.2715580 |         0.0000000 |        3.0190270 |       127.0000000 |        -0.0002374 |          7.9375005 |
| 1091 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6120300 |     3.2413557 |         3.2413557 |       -3.2413557 |         0.0000000 |        1.9301500 |         0.0000000 |        -0.1676708 |          0.0000000 |
| 1092 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2221796 |     1.9301500 |         1.9301500 |        0.0000000 |         0.0000000 |        1.9301500 |         0.0000000 |         0.2221796 |          0.0000000 |
| 1093 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7354434 |     4.0584388 |         4.0584388 |       -0.9144455 |         0.0000000 |        4.0584388 |         0.0000000 |         0.0090951 |          0.0000000 |
| 1094 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.8201384 |     3.7541749 |         3.7541749 |       -3.7541749 |         0.0000000 |        2.5204899 |         0.0000000 |        -0.2544440 |          0.0000000 |
| 1095 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2828472 |     2.5204899 |         2.5204899 |        0.0000000 |         0.0000000 |        2.5204899 |         0.0000000 |         0.2828472 |          0.0000000 |
| 1096 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7749714 |     3.6754758 |         3.6754758 |       -0.9107327 |         0.0000000 |        3.6754758 |         0.0000000 |         0.0137483 |          0.0000000 |
| 1097 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    1.0499101 |     5.1871934 |         5.1871934 |       -5.1871934 |         0.0000000 |        2.7200913 |         0.0000000 |        -0.5247822 |          0.0000000 |
| 1098 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2625640 |     2.7200913 |         2.7200913 |        0.0000000 |         0.0000000 |        2.7200913 |         0.0000000 |         0.2625640 |          0.0000000 |
| 1099 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6060563 |     4.9839063 |         4.9839063 |       -0.9762715 |         0.0000000 |        4.9839063 |         0.0000000 |         0.0732572 |          0.0000000 |
| 1100 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.0000000 |    9.1145210 |    46.0130844 |        46.0130844 |      -46.0130844 |         0.0000000 |       15.5269632 |         0.0000000 |        -6.4034595 |          0.0000000 |
| 1101 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0658202 |    5.9587932 |    28.5697193 |        28.5697193 |      -28.5697193 |         0.0000000 |       25.9108238 |         1.0000000 |        -0.3167043 |          0.2031250 |
| 1102 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.2867862 |    2.7977889 |    24.9108238 |        24.9108238 |        0.0000000 |         0.0000000 |       25.9108238 |         1.0000000 |         2.8442998 |          0.2031250 |
| 1103 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.1380313 |   26.3542652 |   127.8380280 |       127.8380280 |       -0.8965230 |         0.0000000 |        3.5419161 |       127.0000000 |         0.0188538 |         25.7968769 |
| 1104 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.5541211 |     3.1948166 |         3.1948166 |       -2.6972783 |         0.0000000 |        3.1948166 |         0.0000000 |        -0.1624998 |          0.0000000 |
| 1105 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.1958107 |     3.1948166 |         3.1948166 |        0.0000000 |         0.0000000 |        3.1948166 |         0.0000000 |         0.1958107 |          0.0000000 |
| 1106 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7610778 |     4.2016263 |         4.2016263 |       -0.9070169 |         0.0000000 |        4.2016263 |         0.0000000 |         0.0362604 |          0.0000000 |
| 1107 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.9070890 |     4.7525654 |         4.7525654 |       -4.7525654 |         0.0000000 |        4.1204758 |         0.0000000 |        -0.1611000 |          0.0000000 |
| 1108 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3729945 |     4.1204758 |         4.1204758 |        0.0000000 |         0.0000000 |        4.1204758 |         0.0000000 |         0.3729945 |          0.0000000 |
| 1109 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7558944 |     5.6525879 |         5.6525879 |       -0.8806370 |         0.0000000 |        5.6525879 |         0.0000000 |         0.0205899 |          0.0000000 |
| 1110 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.8943109 |     4.6223669 |         4.6223669 |       -4.6223669 |         0.0000000 |        4.4104238 |         0.0000000 |        -0.1936852 |          0.0000000 |
| 1111 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3503128 |     4.4104238 |         4.4104238 |        0.0000000 |         0.0000000 |        4.4104238 |         0.0000000 |         0.3503128 |          0.0000000 |
| 1112 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.6234072 |     5.2305489 |         5.2305489 |       -0.8146061 |         0.0000000 |        5.2305489 |         0.0000000 |         0.0283649 |          0.0000000 |
| 1113 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6538375 |     7.4364495 |         7.4364495 |       -1.1828291 |         0.0000000 |        7.4364495 |         0.0000000 |         0.0353790 |          0.0000000 |
| 1114 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 256, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
| 1115 | head.layers.28.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.7093476 |     7.4364495 |     16244.9239971 |       -4.9372416 |         0.0000000 |        7.4364495 |         0.0000000 |         0.0165397 |          0.0000000 |
| 1116 | head.layers.28.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
| 1117 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.7093476 |     7.4364495 |     16244.9239971 |       -4.9372416 |         0.0000000 |        7.4364495 |         0.0000000 |         0.0165397 |          0.0000000 |
| 1118 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
| 1119 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
| 1120 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.7093476 |     7.4364495 |     16244.9239971 |       -4.9372416 |         0.0000000 |        7.4364495 |         0.0000000 |         0.0165397 |          0.0000000 |
| 1121 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
| 1122 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
| 1123 | head.layers.28.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.9962087 |    12.5595312 |        12.5595312 |      -11.7564793 |         0.0000000 |       12.5595312 |         0.0000000 |         0.0606673 |          0.0000000 |
| 1124 | head.layers.28.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.7942346 |    11.5417175 |        11.5417175 |      -11.5417175 |         0.0000000 |        8.5433044 |         0.0000000 |         0.0190670 |          0.0000000 |
| 1125 | head.layers.28.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1025277 |     1.5901335 |         1.5901335 |       -1.5901335 |         0.0000000 |        1.3289862 |         0.0000000 |         0.0049140 |          0.0000000 |
| 1126 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.9962087 |    12.5595312 |        12.5595312 |      -11.7564793 |         0.0000000 |       12.5595312 |         0.0000000 |         0.0606673 |          0.0000000 |
| 1127 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.9962087 |    12.5595312 |        12.5595312 |      -11.7564793 |         0.0000000 |       12.5595312 |         0.0000000 |         0.0606673 |          0.0000000 |
| 1128 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1129 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1130 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1025277 |     1.5901335 |         1.5901335 |       -1.5901335 |         0.0000000 |        1.3289862 |         0.0000000 |         0.0049140 |          0.0000000 |
| 1131 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 256, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1025277 |     1.5901335 |         1.5901335 |       -1.5901335 |         0.0000000 |        1.3289862 |         0.0000000 |         0.0049140 |          0.0000000 |
| 1132 | head.layers.28.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.0000000 |    0.2495261 |     1.5699414 |        12.5595312 |       -1.4695599 |         0.0000000 |        1.5699414 |         0.0000000 |         0.0075834 |          0.0000000 |
| 1133 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1134 | head.layers.28.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |   18.7728062 |   129.3544769 |       129.3544769 |     -129.3544769 |         0.0000000 |      123.4668503 |         0.0000000 |        -6.5376987 |          0.0000000 |
| 1135 | head.layers.28.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999989 |         0.9999989 |        0.0000000 |         0.0000000 |        0.9999989 |         0.0000000 |         0.0039062 |          0.0000000 |
| 1136 | head.layers.28.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999989 |         0.9999989 |        0.0000000 |         0.0000000 |        0.9999989 |         0.0000000 |         0.0039062 |          0.0000000 |
| 1137 | head.layers.28.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.0877025 |     1.4828752 |         1.4828752 |       -1.4828752 |         0.0000000 |        1.2018247 |         0.0000000 |         0.0038095 |          0.0000000 |
| 1138 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1139 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1140 | head.layers.28.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1650102 |     1.4326441 |         1.4326441 |       -1.2866803 |         0.0000000 |        1.4326441 |         0.0000000 |         0.0083973 |          0.0000000 |
| 1141 | head.layers.28.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.9999989 |         0.9999989 |        0.0000000 |         0.0000000 |        0.9999989 |         0.0000000 |         0.0039062 |          0.0000000 |
| 1142 | head.layers.28.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.5652897 |         0.5652897 |        0.0000000 |         0.0000000 |        0.5652897 |         0.0000000 |         0.0039062 |          0.0000000 |
| 1143 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1144 | head.layers.28.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1650102 |     1.4326441 |         1.4326441 |       -1.2866803 |         0.0000000 |        1.4326441 |         0.0000000 |         0.0083973 |          0.0000000 |
| 1145 | head.layers.28.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.0000000 |    0.7140837 |     7.7959228 |         7.7959228 |       -4.7849336 |         0.0000000 |        7.7959228 |         0.0000000 |         0.0249370 |          0.0000000 |
| 1146 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.0000000 |    0.7105328 |    11.1785707 |      7325.8763410 |      -11.1785707 |         0.0000000 |       10.3571281 |         0.0000000 |         0.0443241 |          0.0000000 |
| 1147 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1752895 |     4.0376520 |     26460.7524910 |       -4.0376520 |         0.0000000 |        3.5462697 |         0.0000000 |        -0.0002182 |          0.0000000 |
| 1148 | head.layers.29.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6821851 |    11.1785707 |     24419.5878033 |      -11.1785707 |         0.0000000 |       10.3571281 |         0.0000000 |         0.0398516 |          0.0000000 |
| 1149 | head.layers.29.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0002277 |  0.0000000 |    0.6821851 |    11.1785707 |     49083.7819080 |      -11.1785707 |         0.0000000 |       10.3571281 |         0.0000000 |         0.0398516 |          0.0000000 |
| 1150 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6821851 |    11.1785707 |     24419.5878033 |      -11.1785707 |         0.0000000 |       10.3571281 |         0.0000000 |         0.0398516 |          0.0000000 |
| 1151 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0002277 |  0.0000000 |    0.6821851 |    11.1785707 |     49083.7819080 |      -11.1785707 |         0.0000000 |       10.3571281 |         0.0000000 |         0.0398516 |          0.0000000 |
| 1152 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1752895 |     4.0376520 |     26460.7524910 |       -4.0376520 |         0.0000000 |        3.5462697 |         0.0000000 |        -0.0002182 |          0.0000000 |
| 1153 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6821851 |    11.1785707 |     24419.5878033 |      -11.1785707 |         0.0000000 |       10.3571281 |         0.0000000 |         0.0398516 |          0.0000000 |
| 1154 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0002277 |  0.0000000 |    0.6821851 |    11.1785707 |     49083.7819080 |      -11.1785707 |         0.0000000 |       10.3571281 |         0.0000000 |         0.0398516 |          0.0000000 |
| 1155 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1752895 |     4.0376520 |     26460.7524910 |       -4.0376520 |         0.0000000 |        3.5462697 |         0.0000000 |        -0.0002182 |          0.0000000 |
| 1156 | head.layers.29.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.3005639 |     9.2844954 |         9.2844954 |       -8.1777077 |         0.0000000 |        9.2844954 |         0.0000000 |         0.0238440 |          0.0000000 |
| 1157 | head.layers.29.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.6607102 |    14.8122416 |        14.8122416 |      -14.8122416 |         0.0000000 |       13.5411654 |         0.0000000 |         0.0346209 |          0.0000000 |
| 1158 | head.layers.29.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2150934 |     4.2027702 |         4.2027702 |       -3.5675447 |         0.0000000 |        4.2027702 |         0.0000000 |         0.0028687 |          0.0000000 |
| 1159 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.3005639 |     9.2844954 |         9.2844954 |       -8.1777077 |         0.0000000 |        9.2844954 |         0.0000000 |         0.0238440 |          0.0000000 |
| 1160 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.3005639 |     9.2844954 |         9.2844954 |       -8.1777077 |         0.0000000 |        9.2844954 |         0.0000000 |         0.0238440 |          0.0000000 |
| 1161 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.6607102 |    14.8122416 |        14.8122416 |      -14.8122416 |         0.0000000 |       13.5411654 |         0.0000000 |         0.0346209 |          0.0000000 |
| 1162 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.6607102 |    14.8122416 |        14.8122416 |      -14.8122416 |         0.0000000 |       13.5411654 |         0.0000000 |         0.0346209 |          0.0000000 |
| 1163 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.2150934 |     4.2027702 |         4.2027702 |       -3.5675447 |         0.0000000 |        4.2027702 |         0.0000000 |         0.0028687 |          0.0000000 |
| 1164 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.2150934 |     4.2027702 |         4.2027702 |       -3.5675447 |         0.0000000 |        4.2027702 |         0.0000000 |         0.0028687 |          0.0000000 |
| 1165 | head.layers.29.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.0000000 |    0.1625705 |     1.1605619 |         9.2844954 |       -1.0222135 |         0.0000000 |        1.1605619 |         0.0000000 |         0.0029805 |          0.0000000 |
| 1166 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1167 | head.layers.29.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    7.8361440 |    89.0618286 |        89.0618286 |      -70.1223755 |         0.0000000 |       89.0618286 |         0.0000000 |         0.1451780 |          0.0000000 |
| 1168 | head.layers.29.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9999763 |         0.9999763 |        0.0000000 |         0.0000000 |        0.9999763 |         0.0000000 |         0.0019531 |          0.0000000 |
| 1169 | head.layers.29.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9999763 |         0.9999763 |        0.0000000 |         0.0000000 |        0.9999763 |         0.0000000 |         0.0019531 |          0.0000000 |
| 1170 | head.layers.29.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1800177 |     4.0685363 |         4.0685363 |       -3.4296505 |         0.0000000 |        4.0685363 |         0.0000000 |         0.0057664 |          0.0000000 |
| 1171 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1172 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1173 | head.layers.29.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.3281153 |     2.1809564 |         2.1809564 |       -2.1199830 |         0.0000000 |        2.1809564 |         0.0000000 |         0.0035751 |          0.0000000 |
| 1174 | head.layers.29.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 512])     | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9999763 |         0.9999763 |        0.0000000 |         0.0000000 |        0.9999763 |         0.0000000 |         0.0019531 |          0.0000000 |
| 1175 | head.layers.29.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.3783459 |         0.3783459 |        0.0000000 |         0.0000000 |        0.3783459 |         0.0000000 |         0.0019531 |          0.0000000 |
| 1176 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1177 | head.layers.29.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.3281153 |     2.1809564 |         2.1809564 |       -2.1199830 |         0.0000000 |        2.1809564 |         0.0000000 |         0.0035751 |          0.0000000 |
| 1178 | head.layers.29.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.0000000 |    0.7599120 |    10.4281330 |        10.4281330 |      -10.4281330 |         0.0000000 |        9.8588667 |         0.0000000 |         0.0434267 |          0.0000000 |
| 1179 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.0000000 |    1.3299161 |    29.8500614 |     19562.2377539 |      -29.8500614 |         0.0000000 |       23.8784981 |         0.0000000 |        -0.0228583 |          0.0000000 |
| 1180 | head.layers.30                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.5798484 |     6.8458343 |         6.8458343 |       -6.8458343 |         0.0000000 |        6.2399631 |         0.0000000 |         0.0018320 |          0.0000000 |
| 1181 | head.layers.31.kps_generator.offset            | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 24])         | qint16        | 1.0000000 |  0.0000000 |    1.1628911 |     7.0669446 |         7.0669446 |       -7.0669446 |         0.0000000 |        5.2122626 |         0.0000000 |        -0.6167719 |          0.0000000 |
| 1182 | head.layers.31.kps_generator                   | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.0000000 |    1.1628911 |     7.0669446 |         7.0669446 |       -7.0669446 |         0.0000000 |        5.2122626 |         0.0000000 |        -0.6167719 |          0.0000000 |
| 1183 | head.layers.31.kps_generator                   | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 3])       | qint16        | 1.0000000 |  0.6436391 |    8.8640957 |   112.6915741 |       112.6915741 |      -61.5078621 |       -58.0000000 |       66.8430176 |        60.0000000 |         8.2159185 |          1.2368289 |
| 1184 | head.layers.31.kps_generator.keypoints_add     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.6465908 |    9.2778158 |   115.0907593 |       115.0907593 |      -64.6389389 |       -58.0000000 |       69.1406021 |        60.0000000 |         7.5991464 |          1.2368289 |
| 1185 | head.layers.31.weight_add                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.9053135 |     8.3211555 |         8.3211555 |       -7.3142509 |         0.0000000 |        8.3211555 |         0.0000000 |         0.0372111 |          0.0000000 |
| 1186 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 3, 4])         | qint16        | 1.0000000 |  0.9977293 |    0.2101564 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3522723 |          0.3472222 |
| 1187 | head.layers.31                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 6, 12])           | qint16        | 1.0000000 |  0.9977293 |    0.2101564 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3522723 |          0.3472222 |
| 1188 | head.layers.31.camera_encoder.0                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.7028903 |     6.3953948 |         6.3953948 |       -6.3953948 |         0.0000000 |        6.0321255 |         0.0000000 |        -0.1030675 |          0.0000000 |
| 1189 | head.layers.31.camera_encoder.1                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.2999114 |     6.0321255 |         6.0321255 |        0.0000000 |         0.0000000 |        6.0321255 |         0.0000000 |         0.2999114 |          0.0000000 |
| 1190 | head.layers.31.camera_encoder.2                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.7861878 |     4.9130092 |         4.9130092 |       -0.8076455 |         0.0000000 |        4.9130092 |         0.0000000 |         0.0247915 |          0.0000000 |
| 1191 | head.layers.31.camera_encoder.3                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    2.2730424 |    23.3326073 |        23.3326073 |      -19.5911980 |         0.0000000 |       23.3326073 |         0.0000000 |        -0.4829063 |          0.0000000 |
| 1192 | head.layers.31.camera_encoder.4                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.8950680 |    23.3326073 |        23.3326073 |        0.0000000 |         0.0000000 |       23.3326073 |         0.0000000 |         0.8950680 |          0.0000000 |
| 1193 | head.layers.31.camera_encoder.5                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.1089496 |    0.4299536 |     7.1566195 |         7.1566195 |       -1.1679387 |        -1.0000000 |        7.1566195 |         0.0000000 |         0.0284684 |         -0.0078125 |
| 1194 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.9053135 |     8.3211555 |         8.3211555 |       -7.3142509 |         0.0000000 |        8.3211555 |         0.0000000 |         0.0372111 |          0.0000000 |
| 1195 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 6, 256])       | qint8         | 1.0000000 |  0.1089496 |    0.4299536 |     7.1566195 |         7.1566195 |       -1.1679387 |        -1.0000000 |        7.1566195 |         0.0000000 |         0.0284684 |         -0.0078125 |
| 1196 | head.layers.31.cam_add                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 6, 256])     | qint8         | 1.0000000 | -0.0789405 |    1.0023922 |     9.8828440 |         9.8828440 |       -6.2227793 |        -1.0000000 |        9.8828440 |         0.0000000 |         0.0656795 |         -0.0078125 |
| 1197 | head.layers.31.weights_fc                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 6, 64])      | qint8         | 1.0000000 |  0.0000000 |    1.8127121 |    12.7611513 |        12.7611513 |      -12.1943979 |         0.0000000 |       12.7611513 |         0.0000000 |         0.0242522 |          0.0000000 |
| 1198 | head.layers.31                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.0000000 |    1.8127121 |    12.7611513 |        12.7611513 |      -12.1943979 |         0.0000000 |       12.7611513 |         0.0000000 |         0.0242522 |          0.0000000 |
| 1199 | head.layers.31.weight_softmax                  | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.0000000 |    0.0208333 |     0.8952301 |         0.8952301 |        0.0000000 |         0.0000000 |        0.8952301 |         0.0000000 |         0.0208333 |          0.0000000 |
| 1200 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  0.8162024 |    9.9349728 |   115.0907593 |       115.0907593 |      -64.6389389 |       -58.0000000 |       69.1406021 |        60.0000000 |        10.1498260 |          5.5649042 |
| 1201 | head.layers.31                                 | torch.ones_like                                                               | torch.ones_like                                                         | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
| 1202 | head.layers.31.point_quant_stub                | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
| 1203 | head.layers.31.point_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 8, 4])       | qint16        | 1.0000000 |  0.6467402 |    6.9583621 |   115.0907593 |       115.0907593 |      -64.6389389 |       -58.0000000 |       69.1406021 |        60.0000000 |         5.9493599 |          1.1776217 |
| 1204 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 1, 1, 4, 4])   | qint16        | 1.0000000 |  0.9977411 |    0.1576173 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3267042 |          0.3229167 |
| 1205 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 512, 8, 1, 4]) | qint16        | 1.0000000 |  0.6467402 |    6.9583621 |   115.0907593 |       115.0907593 |      -64.6389389 |       -58.0000000 |       69.1406021 |        60.0000000 |         5.9493599 |          1.1776217 |
| 1206 | head.layers.31.point_matmul                    | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 4, 4]) | qint16        | 1.0000000 |  0.0760648 |    5.1753950 |   785.0596313 |       785.0596313 |     -355.0596313 |      -180.0000000 |      249.8751373 |       440.0000000 |        -0.1562724 |          1.5117548 |
| 1207 | head.layers.31.point_sum                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 6, 512, 8, 4])    | qint16        | 0.0244144 |  0.0698791 |   16.0374451 |   816.7304688 |     33452.7695512 |     -311.1663208 |      -180.0076294 |      321.0947266 |       518.0010376 |        -0.6250895 |          6.0472698 |
| 1208 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.7163468 |   11.4779158 |   126.9697418 |      5200.6012701 |      -76.6246948 |       -63.9902344 |       73.4561691 |        61.9882507 |        -0.8759958 |          1.5886035 |
| 1209 | head.layers.31                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.8125190 |    6.1371741 |    59.0305023 |      2417.8524815 |        0.0100000 |         0.0000000 |       73.4561691 |        61.9882507 |        11.2706022 |         12.4062328 |
| 1210 | head.layers.31.reciprocal_op                   | horizon_plugin_pytorch.nn.reciprocal.Reciprocal                               | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.7891481 |   46.3535614 |    99.9816895 |     32761.5000992 |        0.0136136 |         0.0152590 |      100.0000000 |         1.2787061 |        46.7831268 |          0.6503971 |
| 1211 | head.layers.31                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.7891481 |   46.3535614 |    99.9816895 |     32761.5000992 |        0.0136136 |         0.0152590 |      100.0000000 |         1.2787061 |        46.7831268 |          0.6503971 |
| 1212 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0244144 | -0.0418288 |   26.3354340 |   816.7304688 |     33452.7695512 |     -311.1663208 |      -180.0076294 |      321.0947266 |       518.0010376 |        -1.3121812 |         10.7997427 |
| 1213 | head.layers.31.point_mul                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.4615154 | 1412.1096191 | 32099.4726562 | 105181947.0508568 |   -31116.6328125 |       -10.0001526 |    32109.4726562 |         9.9998474 |       636.4870605 |          2.1499586 |
| 1214 | head.layers.31                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.4561746 |    0.6200458 |     2.1998703 |      7208.4251644 |       -1.1000000 |        -1.0998703 |        1.1000000 |         1.0998703 |         0.1065517 |          0.2038586 |
| 1215 | head.layers.31                                 | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([156, 512, 8, 2])      | qint16        | 0.0003052 |  0.4561746 |    0.6200458 |     2.1998703 |      7208.4251644 |       -1.1000000 |        -1.0998703 |        1.1000000 |         1.0998703 |         0.1065517 |          0.2038586 |
| 1216 | head.layers.31                                 | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer            | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer      | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.0000000 |    0.4567491 |    54.4943085 |        54.4943085 |      -54.4943085 |         0.0000000 |       48.7969971 |         0.0000000 |         0.0143460 |          0.0000000 |
| 1217 | head.layers.31.feat_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.0000000 |    0.4567491 |    54.4943085 |        54.4943085 |      -54.4943085 |         0.0000000 |       48.7969971 |         0.0000000 |         0.0143460 |          0.0000000 |
| 1218 | head.layers.31                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 256, 512, 8])  | qint8         | 1.0000000 |  0.0000000 |    0.4567491 |    54.4943085 |        54.4943085 |      -54.4943085 |         0.0000000 |       48.7969971 |         0.0000000 |         0.0143460 |          0.0000000 |
| 1219 | head.layers.31                                 | torch.Tensor.permute                                                          | torch.Tensor.permute                                                    | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.0000000 |    0.4567491 |    54.4943085 |        54.4943085 |      -54.4943085 |         0.0000000 |       48.7969971 |         0.0000000 |         0.0143460 |          0.0000000 |
| 1220 | head.layers.31                                 | torch.Tensor.contiguous                                                       | torch.Tensor.contiguous                                                 | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.0000000 |    0.4567491 |    54.4943085 |        54.4943085 |      -54.4943085 |         0.0000000 |       48.7969971 |         0.0000000 |         0.0143460 |          0.0000000 |
| 1221 | head.layers.31                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.0000000 |    0.4567491 |    54.4943085 |        54.4943085 |      -54.4943085 |         0.0000000 |       48.7969971 |         0.0000000 |         0.0143460 |          0.0000000 |
| 1222 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 48, 8, 1])   | qint8         | 1.0000000 |  0.0000000 |    0.0208333 |     0.8952301 |         0.8952301 |        0.0000000 |         0.0000000 |        0.8952301 |         0.0000000 |         0.0208333 |          0.0000000 |
| 1223 | head.layers.31                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.0000000 |    0.4567491 |    54.4943085 |        54.4943085 |      -54.4943085 |         0.0000000 |       48.7969971 |         0.0000000 |         0.0143460 |          0.0000000 |
| 1224 | head.layers.31.feat_mul                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.0000000 |    0.0054330 |    11.9796171 |        11.9796171 |      -11.9796171 |         0.0000000 |        8.5088749 |         0.0000000 |         0.0000004 |          0.0000000 |
| 1225 | head.layers.31                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.0000000 |    0.0054330 |    11.9796171 |        11.9796171 |      -11.9796171 |         0.0000000 |        8.5088749 |         0.0000000 |         0.0000004 |          0.0000000 |
| 1226 | head.layers.31.feat_sum                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.2057725 |    13.1687622 |        13.1687622 |      -13.1687622 |         0.0000000 |        9.5621004 |         0.0000000 |         0.0000176 |          0.0000000 |
| 1227 | head.layers.31.output_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3075138 |     9.1343060 |         9.1343060 |       -7.4600663 |         0.0000000 |        9.1343060 |         0.0000000 |        -0.0025296 |          0.0000000 |
| 1228 | head.layers.31.proj_drop                       | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3075138 |     9.1343060 |         9.1343060 |       -7.4600663 |         0.0000000 |        9.1343060 |         0.0000000 |        -0.0025296 |          0.0000000 |
| 1229 | head.layers.31.residual_op                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.4436811 |     9.1343060 |         9.1343060 |       -7.4600663 |         0.0000000 |        9.1343060 |         0.0000000 |        -0.0003488 |          0.0000000 |
| 1230 | head.layers.32.pre_norm                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.5701687 |     7.6093745 |         7.6093745 |       -7.6093745 |         0.0000000 |        7.2926073 |         0.0000000 |        -0.0031510 |          0.0000000 |
| 1231 | head.layers.32.layers.0.0                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0000000 |    2.4004579 |    15.1514359 |        15.1514359 |      -15.1514359 |         0.0000000 |       13.4808750 |         0.0000000 |        -1.7421576 |          0.0000000 |
| 1232 | head.layers.32.layers.0.2                      | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0000000 |    0.3291503 |    13.4808750 |        13.4808750 |        0.0000000 |         0.0000000 |       13.4808750 |         0.0000000 |         0.3291503 |          0.0000000 |
| 1233 | head.layers.32.layers.1                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    3.6995096 |    49.9368973 |        49.9368973 |      -49.9368973 |         0.0000000 |       42.9923096 |         0.0000000 |         0.0450117 |          0.0000000 |
| 1234 | head.layers.32.layers.2                        | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    3.6995096 |    49.9368973 |        49.9368973 |      -49.9368973 |         0.0000000 |       42.9923096 |         0.0000000 |         0.0450117 |          0.0000000 |
| 1235 | head.layers.32.identity_fc                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1236 | head.layers.32.short_add                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.linear.LinearAdd                          | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    5.1977587 |    52.6310120 |        52.6310120 |      -52.6310120 |         0.0000000 |       44.6253357 |         0.0000000 |         0.0301451 |          0.0000000 |
| 1237 | head.layers.33                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.7611118 |     4.4297833 |         4.4297833 |       -4.4297833 |         0.0000000 |        3.8293908 |         0.0000000 |        -0.0033574 |          0.0000000 |
| 1238 | head.layers.34.add1                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.9894786 |     8.3032808 |         8.3032808 |       -4.3244510 |         0.0000000 |        8.3032808 |         0.0000000 |         0.0320217 |          0.0000000 |
| 1239 | head.layers.34.layers.0                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.3341599 |     8.2653580 |         8.2653580 |       -8.2653580 |         0.0000000 |        8.0072107 |         0.0000000 |        -0.4605494 |          0.0000000 |
| 1240 | head.layers.34.layers.1                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4368052 |     8.0072107 |         8.0072107 |        0.0000000 |         0.0000000 |        8.0072107 |         0.0000000 |         0.4368053 |          0.0000000 |
| 1241 | head.layers.34.layers.2                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.0898459 |     8.5607128 |         8.5607128 |       -8.5607128 |         0.0000000 |        7.4769115 |         0.0000000 |        -0.5111202 |          0.0000000 |
| 1242 | head.layers.34.layers.3                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.2893628 |     7.4769115 |         7.4769115 |        0.0000000 |         0.0000000 |        7.4769115 |         0.0000000 |         0.2893628 |          0.0000000 |
| 1243 | head.layers.34.layers.4                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6931624 |     7.8336587 |         7.8336587 |       -0.7382725 |         0.0000000 |        7.8336587 |         0.0000000 |         0.0362735 |          0.0000000 |
| 1244 | head.layers.34.layers.5                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.2326719 |     7.5342116 |         7.5342116 |       -6.0854731 |         0.0000000 |        7.5342116 |         0.0000000 |        -0.5294379 |          0.0000000 |
| 1245 | head.layers.34.layers.6                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3516169 |     7.5342116 |         7.5342116 |        0.0000000 |         0.0000000 |        7.5342116 |         0.0000000 |         0.3516170 |          0.0000000 |
| 1246 | head.layers.34.layers.7                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.0444618 |    16.0767174 |        16.0767174 |       -4.3783431 |         0.0000000 |       16.0767174 |         0.0000000 |        -0.6186591 |          0.0000000 |
| 1247 | head.layers.34.layers.8                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.2129014 |    16.0767174 |        16.0767174 |        0.0000000 |         0.0000000 |       16.0767174 |         0.0000000 |         0.2129014 |          0.0000000 |
| 1248 | head.layers.34.layers.9                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4897228 |    12.3919544 |        12.3919544 |       -0.7553867 |         0.0000000 |       12.3919544 |         0.0000000 |         0.0231330 |          0.0000000 |
| 1249 | head.layers.34.layers.10                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.0000000 |    0.9519470 |     6.8882608 |         6.8882608 |       -6.2740660 |         0.0000000 |        6.8882608 |         0.0000000 |         0.1439068 |          0.0000000 |
| 1250 | head.layers.34.layers.11.scale_quant_stub      | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([11])                  | qint16        | 1.0000000 |  0.9903026 |    0.0404044 |     0.1620035 |         0.1620035 |        0.0028612 |         0.0000000 |        0.8379965 |         1.0000000 |         0.1018584 |          0.0909091 |
| 1251 | head.layers.34.layers.11.mul                   | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.0000000 |    0.0399316 |     0.9178028 |         0.9178028 |       -0.9178028 |         0.0000000 |        0.7073690 |         0.0000000 |         0.0044227 |          0.0000000 |
| 1252 | head.layers.34.add2                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.5512052 |    5.2062006 |   112.6563568 |       112.6563568 |      -61.4566994 |       -58.0000000 |       67.0949707 |        60.0000000 |         0.6529967 |          0.6780485 |
| 1253 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.5512052 |    5.2062006 |   112.6563568 |                   |      -61.4566994 |       -58.0000000 |       67.0949707 |        60.0000000 |         0.6529967 |          0.6780485 |
| 1254 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.6436465 |    8.8688383 |   112.6563568 |       112.6563568 |      -61.4566994 |       -58.0000000 |       67.0949707 |        60.0000000 |         8.2281027 |          1.2368289 |
| 1255 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 | -0.0951616 |   12.7692699 |   159.9961395 |       159.9961395 |      -74.2690430 |         0.0000000 |       59.0912247 |        95.0000000 |        -0.6955190 |          3.4413030 |
| 1256 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.2303774 |    6.9649429 |    95.0000000 |        95.0000000 |        0.0000000 |         0.0000000 |       59.0912247 |        95.0000000 |         5.1088080 |          3.4413030 |
| 1257 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0947067 |    2.4023366 |   127.7362595 |       127.7362595 |       -0.9090925 |        -1.0000000 |        4.0332942 |       127.0000000 |         0.0111501 |          1.5994991 |
| 1258 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.8678688 |     6.3148513 |         6.3148513 |       -5.5454645 |         0.0000000 |        6.3148513 |         0.0000000 |        -0.2284136 |          0.0000000 |
| 1259 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.3197276 |     6.3148513 |         6.3148513 |        0.0000000 |         0.0000000 |        6.3148513 |         0.0000000 |         0.3197276 |          0.0000000 |
| 1260 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.6451622 |     6.3496590 |         6.3496590 |       -0.9782494 |         0.0000000 |        6.3496590 |         0.0000000 |         0.0789669 |          0.0000000 |
| 1261 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.1403871 |     6.1305814 |         6.1305814 |       -6.1305814 |         0.0000000 |        5.7128587 |         0.0000000 |        -0.0547403 |          0.0000000 |
| 1262 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.5428234 |     5.7128587 |         5.7128587 |        0.0000000 |         0.0000000 |        5.7128587 |         0.0000000 |         0.5428234 |          0.0000000 |
| 1263 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.7974254 |     5.7227111 |         5.7227111 |       -0.8646023 |         0.0000000 |        5.7227111 |         0.0000000 |         0.0264316 |          0.0000000 |
| 1264 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    1.3016956 |     8.3950319 |         8.3950319 |       -5.6506000 |         0.0000000 |        8.3950319 |         0.0000000 |        -0.2801095 |          0.0000000 |
| 1265 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.5107930 |     8.3950319 |         8.3950319 |        0.0000000 |         0.0000000 |        8.3950319 |         0.0000000 |         0.5107931 |          0.0000000 |
| 1266 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.0000000 |    0.6696702 |     7.4387288 |         7.4387288 |       -0.8785334 |         0.0000000 |        7.4387288 |         0.0000000 |         0.0309238 |          0.0000000 |
| 1267 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.8588657 |    0.5110375 |     2.1498170 |         2.1498170 |       -0.5778089 |         0.0000000 |        2.7806492 |         2.0000000 |         0.9169568 |          0.9993490 |
| 1268 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0902293 |    0.6549372 |     2.8845367 |         2.8845367 |       -2.8845367 |         0.0000000 |        1.5347012 |         1.0000000 |        -0.3534084 |          0.0858765 |
| 1269 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.2731973 |    0.1716408 |     1.5347012 |         1.5347012 |        0.0000000 |         0.0000000 |        1.5347012 |         1.0000000 |         0.1298881 |          0.0858765 |
| 1270 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.1617615 |   11.5001116 |   127.5285873 |       127.5285873 |       -0.7190371 |         0.0000000 |        3.9635403 |       127.0000000 |         0.0220669 |         10.9063110 |
| 1271 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4273824 |     1.7684132 |         1.7684132 |       -1.7684132 |         0.0000000 |        1.6377945 |         0.0000000 |        -0.0208074 |          0.0000000 |
| 1272 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2032875 |     1.6377945 |         1.6377945 |        0.0000000 |         0.0000000 |        1.6377945 |         0.0000000 |         0.2032875 |          0.0000000 |
| 1273 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7853206 |     3.3801951 |         3.3801951 |       -0.9009516 |         0.0000000 |        3.3801951 |         0.0000000 |         0.0096592 |          0.0000000 |
| 1274 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7001832 |     2.2771502 |         2.2771502 |       -2.1276367 |         0.0000000 |        2.2771502 |         0.0000000 |        -0.0745813 |          0.0000000 |
| 1275 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.3128009 |     2.2771502 |         2.2771502 |        0.0000000 |         0.0000000 |        2.2771502 |         0.0000000 |         0.3128009 |          0.0000000 |
| 1276 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7698683 |     3.5794442 |         3.5794442 |       -0.7891839 |         0.0000000 |        3.5794442 |         0.0000000 |         0.0121393 |          0.0000000 |
| 1277 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7473679 |     3.0179973 |         3.0179973 |       -2.6115010 |         0.0000000 |        3.0179973 |         0.0000000 |         0.1349017 |          0.0000000 |
| 1278 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.4411348 |     3.0179973 |         3.0179973 |        0.0000000 |         0.0000000 |        3.0179973 |         0.0000000 |         0.4411348 |          0.0000000 |
| 1279 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6936823 |     3.9247329 |         3.9247329 |       -1.1644446 |         0.0000000 |        3.9247329 |         0.0000000 |         0.0285789 |          0.0000000 |
| 1280 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 2])          | qint16        | 1.0000000 | -0.7988814 |    0.9050788 |     2.1694841 |         2.1694841 |       -1.5551295 |         0.0000000 |        0.1263179 |         1.0000000 |        -0.5263550 |          0.3750000 |
| 1281 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0135491 |    0.4788305 |     2.0871515 |         2.0871515 |       -1.3925010 |         0.0000000 |        1.4723502 |         2.0000000 |         0.1355964 |          0.0859375 |
| 1282 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0364043 |    0.3448513 |     2.0000000 |         2.0000000 |        0.0000000 |         0.0000000 |        1.4723502 |         2.0000000 |         0.2695756 |          0.0859375 |
| 1283 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 | -0.1479661 |    8.6971607 |   127.9612656 |       127.9612656 |       -1.2736557 |         0.0000000 |        3.0083756 |       127.0000000 |        -0.0002323 |          7.9375005 |
| 1284 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.6099950 |     3.2273657 |         3.2273657 |       -3.2273657 |         0.0000000 |        1.9419924 |         0.0000000 |        -0.1677649 |          0.0000000 |
| 1285 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2211150 |     1.9419924 |         1.9419924 |        0.0000000 |         0.0000000 |        1.9419924 |         0.0000000 |         0.2211151 |          0.0000000 |
| 1286 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7329766 |     4.0271802 |         4.0271802 |       -0.9110166 |         0.0000000 |        4.0271802 |         0.0000000 |         0.0090427 |          0.0000000 |
| 1287 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.8160665 |     3.7577455 |         3.7577455 |       -3.7577455 |         0.0000000 |        2.5172043 |         0.0000000 |        -0.2576037 |          0.0000000 |
| 1288 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2792314 |     2.5172043 |         2.5172043 |        0.0000000 |         0.0000000 |        2.5172043 |         0.0000000 |         0.2792314 |          0.0000000 |
| 1289 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.7684427 |     3.6606457 |         3.6606457 |       -0.9103000 |         0.0000000 |        3.6606457 |         0.0000000 |         0.0136580 |          0.0000000 |
| 1290 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    1.0523279 |     5.1872058 |         5.1872058 |       -5.1872058 |         0.0000000 |        2.7222242 |         0.0000000 |        -0.5375715 |          0.0000000 |
| 1291 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.2573781 |     2.7222242 |         2.7222242 |        0.0000000 |         0.0000000 |        2.7222242 |         0.0000000 |         0.2573781 |          0.0000000 |
| 1292 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.0000000 |    0.5829991 |     4.9795518 |         4.9795518 |       -0.9658547 |         0.0000000 |        4.9795518 |         0.0000000 |         0.0706264 |          0.0000000 |
| 1293 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.0000000 |    9.1061430 |    46.0243034 |        46.0243034 |      -46.0243034 |         0.0000000 |       15.5323982 |         0.0000000 |        -6.3998346 |          0.0000000 |
| 1294 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0658237 |    5.9522271 |    28.6672459 |        28.6672459 |      -28.6672459 |         0.0000000 |       25.9971523 |         1.0000000 |        -0.3171374 |          0.2031250 |
| 1295 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.2867031 |    2.7942228 |    24.9971523 |        24.9971523 |        0.0000000 |         0.0000000 |       25.9971523 |         1.0000000 |         2.8408673 |          0.2031250 |
| 1296 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.1377140 |   26.3545494 |   127.8424759 |       127.8424759 |       -0.9013193 |         0.0000000 |        3.5405760 |       127.0000000 |         0.0188804 |         25.7968769 |
| 1297 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.5537654 |     3.1950607 |         3.1950607 |       -2.6438704 |         0.0000000 |        3.1950607 |         0.0000000 |        -0.1627086 |          0.0000000 |
| 1298 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.1955284 |     3.1950607 |         3.1950607 |        0.0000000 |         0.0000000 |        3.1950607 |         0.0000000 |         0.1955284 |          0.0000000 |
| 1299 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7595113 |     4.1975088 |         4.1975088 |       -0.9036657 |         0.0000000 |        4.1975088 |         0.0000000 |         0.0361290 |          0.0000000 |
| 1300 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.9063151 |     4.7396483 |         4.7396483 |       -4.7396483 |         0.0000000 |        4.1191835 |         0.0000000 |        -0.1607145 |          0.0000000 |
| 1301 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3728003 |     4.1191835 |         4.1191835 |        0.0000000 |         0.0000000 |        4.1191835 |         0.0000000 |         0.3728003 |          0.0000000 |
| 1302 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.7556399 |     5.6309576 |         5.6309576 |       -0.8806671 |         0.0000000 |        5.6309576 |         0.0000000 |         0.0204516 |          0.0000000 |
| 1303 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.8951659 |     4.5222526 |         4.5222526 |       -4.5222526 |         0.0000000 |        4.4083252 |         0.0000000 |        -0.1924435 |          0.0000000 |
| 1304 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.3513612 |     4.4083252 |         4.4083252 |        0.0000000 |         0.0000000 |        4.4083252 |         0.0000000 |         0.3513612 |          0.0000000 |
| 1305 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.0000000 |    0.6249177 |     5.3360229 |         5.3360229 |       -0.8123495 |         0.0000000 |        5.3360229 |         0.0000000 |         0.0283427 |          0.0000000 |
| 1306 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6506497 |     7.4387288 |         7.4387288 |       -1.1644446 |         0.0000000 |        7.4387288 |         0.0000000 |         0.0349482 |          0.0000000 |
| 1307 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 256, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
| 1308 | head.layers.35.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.7058808 |     7.4387288 |     16249.9030878 |       -4.4297833 |         0.0000000 |        7.4387288 |         0.0000000 |         0.0157954 |          0.0000000 |
| 1309 | head.layers.35.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
| 1310 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.7058808 |     7.4387288 |     16249.9030878 |       -4.4297833 |         0.0000000 |        7.4387288 |         0.0000000 |         0.0157954 |          0.0000000 |
| 1311 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
| 1312 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
| 1313 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.7058808 |     7.4387288 |     16249.9030878 |       -4.4297833 |         0.0000000 |        7.4387288 |         0.0000000 |         0.0157954 |          0.0000000 |
| 1314 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.0000000 |    0.6059915 |     5.2068014 |     29154.2204333 |       -4.7676134 |         0.0000000 |        5.2068014 |         0.0000000 |         0.0197714 |          0.0000000 |
| 1315 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.0770323 |     1.9849690 |     13008.4944749 |       -1.9849690 |         0.0000000 |        1.6396148 |         0.0000000 |        -0.0036596 |          0.0000000 |
| 1316 | head.layers.35.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    2.0666347 |    13.0418005 |        13.0418005 |      -13.0418005 |         0.0000000 |       12.4758062 |         0.0000000 |         0.0181749 |          0.0000000 |
| 1317 | head.layers.35.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    2.1408925 |    11.7748194 |        11.7748194 |      -11.7748194 |         0.0000000 |       10.2552834 |         0.0000000 |        -0.0476902 |          0.0000000 |
| 1318 | head.layers.35.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.1121602 |     1.4409871 |         1.4409871 |       -1.2173600 |         0.0000000 |        1.4409871 |         0.0000000 |        -0.0010623 |          0.0000000 |
| 1319 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    2.0666347 |    13.0418005 |        13.0418005 |      -13.0418005 |         0.0000000 |       12.4758062 |         0.0000000 |         0.0181749 |          0.0000000 |
| 1320 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    2.0666347 |    13.0418005 |        13.0418005 |      -13.0418005 |         0.0000000 |       12.4758062 |         0.0000000 |         0.0181749 |          0.0000000 |
| 1321 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1322 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1323 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1121602 |     1.4409871 |         1.4409871 |       -1.2173600 |         0.0000000 |        1.4409871 |         0.0000000 |        -0.0010623 |          0.0000000 |
| 1324 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 256, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1121602 |     1.4409871 |         1.4409871 |       -1.2173600 |         0.0000000 |        1.4409871 |         0.0000000 |        -0.0010623 |          0.0000000 |
| 1325 | head.layers.35.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.0000000 |    0.2583293 |     1.6302251 |        13.0418005 |       -1.6302251 |         0.0000000 |        1.5594758 |         0.0000000 |         0.0022719 |          0.0000000 |
| 1326 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1327 | head.layers.35.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |   20.7489510 |   145.3792572 |       145.3792572 |     -139.1942139 |         0.0000000 |      145.3792572 |         0.0000000 |        -6.0321755 |          0.0000000 |
| 1328 | head.layers.35.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     1.0000000 |         1.0000000 |        0.0000000 |         0.0000000 |        1.0000000 |         0.0000000 |         0.0039062 |          0.0000000 |
| 1329 | head.layers.35.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     1.0000000 |         1.0000000 |        0.0000000 |         0.0000000 |        1.0000000 |         0.0000000 |         0.0039062 |          0.0000000 |
| 1330 | head.layers.35.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1015444 |     1.1200196 |         1.1200196 |       -1.0710682 |         0.0000000 |        1.1200196 |         0.0000000 |        -0.0013200 |          0.0000000 |
| 1331 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1332 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1333 | head.layers.35.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2052507 |     1.6252639 |         1.6252639 |       -1.6252639 |         0.0000000 |        1.5088488 |         0.0000000 |         0.0085927 |          0.0000000 |
| 1334 | head.layers.35.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     1.0000000 |         1.0000000 |        0.0000000 |         0.0000000 |        1.0000000 |         0.0000000 |         0.0039062 |          0.0000000 |
| 1335 | head.layers.35.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.0039062 |     0.5680311 |         0.5680311 |        0.0000000 |         0.0000000 |        0.5680311 |         0.0000000 |         0.0039062 |          0.0000000 |
| 1336 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1337 | head.layers.35.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2052507 |     1.6252639 |         1.6252639 |       -1.6252639 |         0.0000000 |        1.5088488 |         0.0000000 |         0.0085927 |          0.0000000 |
| 1338 | head.layers.35.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.0000000 |    0.6978868 |     7.1550484 |         7.1550484 |       -4.4135156 |         0.0000000 |        7.1550484 |         0.0000000 |         0.0243881 |          0.0000000 |
| 1339 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.0000000 |    0.6823542 |    10.1970062 |      6682.6080315 |      -10.1000414 |         0.0000000 |       10.1970062 |         0.0000000 |         0.0330641 |          0.0000000 |
| 1340 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1780314 |     4.3064914 |     28222.5912324 |       -4.3064914 |         0.0000000 |        2.9677820 |         0.0000000 |        -0.0041252 |          0.0000000 |
| 1341 | head.layers.36.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6665019 |    10.1970062 |     22275.3601050 |      -10.1000414 |         0.0000000 |       10.1970062 |         0.0000000 |         0.0340062 |          0.0000000 |
| 1342 | head.layers.36.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0002277 |  0.0000000 |    0.6665019 |    10.1970062 |     44773.6107460 |      -10.1000414 |         0.0000000 |       10.1970062 |         0.0000000 |         0.0340062 |          0.0000000 |
| 1343 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6665019 |    10.1970062 |     22275.3601050 |      -10.1000414 |         0.0000000 |       10.1970062 |         0.0000000 |         0.0340062 |          0.0000000 |
| 1344 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0002277 |  0.0000000 |    0.6665019 |    10.1970062 |     44773.6107460 |      -10.1000414 |         0.0000000 |       10.1970062 |         0.0000000 |         0.0340062 |          0.0000000 |
| 1345 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1780314 |     4.3064914 |     28222.5912324 |       -4.3064914 |         0.0000000 |        2.9677820 |         0.0000000 |        -0.0041252 |          0.0000000 |
| 1346 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.0000000 |    0.6665019 |    10.1970062 |     22275.3601050 |      -10.1000414 |         0.0000000 |       10.1970062 |         0.0000000 |         0.0340062 |          0.0000000 |
| 1347 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0002277 |  0.0000000 |    0.6665019 |    10.1970062 |     44773.6107460 |      -10.1000414 |         0.0000000 |       10.1970062 |         0.0000000 |         0.0340062 |          0.0000000 |
| 1348 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.0000000 |    0.1780314 |     4.3064914 |     28222.5912324 |       -4.3064914 |         0.0000000 |        2.9677820 |         0.0000000 |        -0.0041252 |          0.0000000 |
| 1349 | head.layers.36.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.2846414 |     8.5604677 |         8.5604677 |       -8.5604677 |         0.0000000 |        8.4414663 |         0.0000000 |         0.0188070 |          0.0000000 |
| 1350 | head.layers.36.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    1.5079312 |    13.6135807 |        13.6135807 |      -12.4560270 |         0.0000000 |       13.6135807 |         0.0000000 |        -0.0212782 |          0.0000000 |
| 1351 | head.layers.36.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2032210 |     4.3225670 |         4.3225670 |       -4.3225670 |         0.0000000 |        3.9090145 |         0.0000000 |         0.0076648 |          0.0000000 |
| 1352 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.2846414 |     8.5604677 |         8.5604677 |       -8.5604677 |         0.0000000 |        8.4414663 |         0.0000000 |         0.0188070 |          0.0000000 |
| 1353 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.2846414 |     8.5604677 |         8.5604677 |       -8.5604677 |         0.0000000 |        8.4414663 |         0.0000000 |         0.0188070 |          0.0000000 |
| 1354 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.5079312 |    13.6135807 |        13.6135807 |      -12.4560270 |         0.0000000 |       13.6135807 |         0.0000000 |        -0.0212782 |          0.0000000 |
| 1355 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    1.5079312 |    13.6135807 |        13.6135807 |      -12.4560270 |         0.0000000 |       13.6135807 |         0.0000000 |        -0.0212782 |          0.0000000 |
| 1356 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.2032210 |     4.3225670 |         4.3225670 |       -4.3225670 |         0.0000000 |        3.9090145 |         0.0000000 |         0.0076648 |          0.0000000 |
| 1357 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.2032210 |     4.3225670 |         4.3225670 |       -4.3225670 |         0.0000000 |        3.9090145 |         0.0000000 |         0.0076648 |          0.0000000 |
| 1358 | head.layers.36.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.0000000 |    0.1605802 |     1.0700585 |         8.5604677 |       -1.0700585 |         0.0000000 |        1.0551833 |         0.0000000 |         0.0023509 |          0.0000000 |
| 1359 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1360 | head.layers.36.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    6.9962931 |    59.7286568 |        59.7286568 |      -57.0967598 |         0.0000000 |       59.7286568 |         0.0000000 |        -1.5484376 |          0.0000000 |
| 1361 | head.layers.36.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9999986 |         0.9999986 |        0.0000000 |         0.0000000 |        0.9999986 |         0.0000000 |         0.0019531 |          0.0000000 |
| 1362 | head.layers.36.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9999986 |         0.9999986 |        0.0000000 |         0.0000000 |        0.9999986 |         0.0000000 |         0.0019531 |          0.0000000 |
| 1363 | head.layers.36.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0000000 |    0.1669138 |     2.9516459 |         2.9516459 |       -2.9516459 |         0.0000000 |        2.1235895 |         0.0000000 |         0.0056324 |          0.0000000 |
| 1364 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1365 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1366 | head.layers.36.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2898174 |     1.8539349 |         1.8539349 |       -1.7729082 |         0.0000000 |        1.8539349 |         0.0000000 |         0.0046081 |          0.0000000 |
| 1367 | head.layers.36.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 512])     | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.9999986 |         0.9999986 |        0.0000000 |         0.0000000 |        0.9999986 |         0.0000000 |         0.0019531 |          0.0000000 |
| 1368 | head.layers.36.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.0019531 |     0.2665950 |         0.2665950 |        0.0000000 |         0.0000000 |        0.2665950 |         0.0000000 |         0.0019531 |          0.0000000 |
| 1369 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1370 | head.layers.36.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.2898174 |     1.8539349 |         1.8539349 |       -1.7729082 |         0.0000000 |        1.8539349 |         0.0000000 |         0.0046081 |          0.0000000 |
| 1371 | head.layers.36.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.0000000 |    0.7225960 |    10.3691978 |        10.3691978 |       -9.4872608 |         0.0000000 |       10.3691978 |         0.0000000 |         0.0386143 |          0.0000000 |
| 1372 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.0000000 |    1.2763884 |    23.8033142 |     15599.5019705 |      -23.8033142 |         0.0000000 |       18.8280678 |         0.0000000 |        -0.0025859 |          0.0000000 |
| 1373 | head.layers.37                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.5904828 |     6.4915037 |         6.4915037 |       -6.4915037 |         0.0000000 |        5.7244334 |         0.0000000 |        -0.0011720 |          0.0000000 |
| 1374 | head.layers.38.kps_generator.offset            | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 24])         | qint16        | 1.0000000 |  0.0000000 |    1.0225489 |     6.7195587 |         6.7195587 |       -6.7195587 |         0.0000000 |        5.5638151 |         0.0000000 |        -0.4007958 |          0.0000000 |
| 1375 | head.layers.38.kps_generator                   | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.0000000 |    1.0225489 |     6.7195587 |         6.7195587 |       -6.7195587 |         0.0000000 |        5.5638151 |         0.0000000 |        -0.4007958 |          0.0000000 |
| 1376 | head.layers.38.kps_generator                   | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 3])       | qint16        | 1.0000000 |  0.6436465 |    8.8688383 |   112.6563568 |       112.6563568 |      -61.4566994 |       -58.0000000 |       67.0949707 |        60.0000000 |         8.2281027 |          1.2368289 |
| 1377 | head.layers.38.kps_generator.keypoints_add     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.6487842 |    9.1919975 |   113.4861603 |       113.4861603 |      -65.9268951 |       -58.0000000 |       70.8208313 |        60.0000000 |         7.8273067 |          1.2368289 |
| 1378 | head.layers.38.weight_add                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.9022496 |     8.3263397 |         8.3263397 |       -6.9698806 |         0.0000000 |        8.3263397 |         0.0000000 |         0.0337762 |          0.0000000 |
| 1379 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 3, 4])         | qint16        | 1.0000000 |  0.9977293 |    0.2101564 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3522723 |          0.3472222 |
| 1380 | head.layers.38                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 6, 12])           | qint16        | 1.0000000 |  0.9977293 |    0.2101564 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3522723 |          0.3472222 |
| 1381 | head.layers.38.camera_encoder.0                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.7774102 |     6.1848178 |         6.1848178 |       -5.6705537 |         0.0000000 |        6.1848178 |         0.0000000 |        -0.0660872 |          0.0000000 |
| 1382 | head.layers.38.camera_encoder.1                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.3556615 |     6.1848178 |         6.1848178 |        0.0000000 |         0.0000000 |        6.1848178 |         0.0000000 |         0.3556615 |          0.0000000 |
| 1383 | head.layers.38.camera_encoder.2                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    0.7958609 |     5.2689471 |         5.2689471 |       -0.8415864 |         0.0000000 |        5.2689471 |         0.0000000 |         0.0144101 |          0.0000000 |
| 1384 | head.layers.38.camera_encoder.3                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    2.4288182 |    31.1488247 |        31.1488247 |      -11.8981352 |         0.0000000 |       31.1488247 |         0.0000000 |         0.1222667 |          0.0000000 |
| 1385 | head.layers.38.camera_encoder.4                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.0000000 |    1.2755424 |    31.1488247 |        31.1488247 |        0.0000000 |         0.0000000 |       31.1488247 |         0.0000000 |         1.2755425 |          0.0000000 |
| 1386 | head.layers.38.camera_encoder.5                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.1199721 |    0.4668152 |     7.8239322 |         7.8239322 |       -1.5707064 |        -1.0000000 |        7.8239322 |         1.0000000 |         0.0144880 |          0.0000000 |
| 1387 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 256])     | qint8         | 1.0000000 |  0.0000000 |    0.9022496 |     8.3263397 |         8.3263397 |       -6.9698806 |         0.0000000 |        8.3263397 |         0.0000000 |         0.0337762 |          0.0000000 |
| 1388 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 6, 256])       | qint8         | 1.0000000 |  0.1199721 |    0.4668152 |     7.8239322 |         7.8239322 |       -1.5707064 |        -1.0000000 |        7.8239322 |         1.0000000 |         0.0144880 |          0.0000000 |
| 1389 | head.layers.38.cam_add                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 6, 256])     | qint8         | 1.0000000 | -0.0814757 |    0.9720909 |    14.0536461 |        14.0536461 |       -5.9131470 |        -1.0000000 |       14.0536461 |         1.0000000 |         0.0482642 |          0.0000000 |
| 1390 | head.layers.38.weights_fc                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 6, 64])      | qint8         | 1.0000000 |  0.0000000 |    1.5824394 |    10.4307756 |        10.4307756 |       -7.5380688 |         0.0000000 |       10.4307756 |         0.0000000 |         0.3289796 |          0.0000000 |
| 1391 | head.layers.38                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.0000000 |    1.5824394 |    10.4307756 |        10.4307756 |       -7.5380688 |         0.0000000 |       10.4307756 |         0.0000000 |         0.3289796 |          0.0000000 |
| 1392 | head.layers.38.weight_softmax                  | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.0000000 |    0.0208333 |     0.9191344 |         0.9191344 |        0.0000001 |         0.0000000 |        0.9191344 |         0.0000000 |         0.0208333 |          0.0000000 |
| 1393 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  0.8166704 |    9.9582777 |   113.4861603 |       113.4861603 |      -65.9268951 |       -58.0000000 |       70.8208313 |        60.0000000 |        10.2139587 |          5.5649042 |
| 1394 | head.layers.38                                 | torch.ones_like                                                               | torch.ones_like                                                         | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
| 1395 | head.layers.38.point_quant_stub                | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
| 1396 | head.layers.38.point_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 8, 4])       | qint16        | 1.0000000 |  0.6489345 |    6.8939981 |   113.4861603 |       113.4861603 |      -65.9268951 |       -58.0000000 |       70.8208313 |        60.0000000 |         6.1204801 |          1.1776217 |
| 1397 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 1, 1, 4, 4])   | qint16        | 1.0000000 |  0.9977411 |    0.1576173 |     0.4806752 |         0.4806752 |      -10.4531412 |       -10.0000000 |       27.1799469 |        27.0000000 |         0.3267042 |          0.3229167 |
| 1398 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 512, 8, 1, 4]) | qint16        | 1.0000000 |  0.6489345 |    6.8939981 |   113.4861603 |       113.4861603 |      -65.9268951 |       -58.0000000 |       70.8208313 |        60.0000000 |         6.1204801 |          1.1776217 |
| 1399 | head.layers.38.point_matmul                    | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 4, 4]) | qint16        | 1.0000000 |  0.0815561 |    5.1173892 |   790.5250244 |       790.5250244 |     -360.5250549 |      -180.0000000 |      250.0918732 |       440.0000000 |        -0.2075575 |          1.5117548 |
| 1400 | head.layers.38.point_sum                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 6, 512, 8, 4])    | qint16        | 0.0244144 |  0.0743473 |   15.9037819 |   821.3240356 |     33640.9191803 |     -315.9518433 |      -180.0076294 |      323.8444214 |       518.0010376 |        -0.8302299 |          6.0472698 |
| 1401 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.7175272 |   11.4751263 |   125.5057449 |      5140.6368726 |      -74.0057602 |       -63.9902344 |       74.7569656 |        61.9882507 |        -0.9517055 |          1.5886035 |
| 1402 | head.layers.38                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.8129769 |    6.1193466 |    58.0934372 |      2379.4708797 |        0.0100000 |         0.0000000 |       74.7569656 |        61.9882507 |        11.2387934 |         12.4062328 |
| 1403 | head.layers.38.reciprocal_op                   | horizon_plugin_pytorch.nn.reciprocal.Reciprocal                               | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.7935449 |   46.8245201 |    99.9816895 |     32761.5000992 |        0.0133767 |         0.0152590 |      100.0000000 |         1.2787061 |        47.2641716 |          0.6503971 |
| 1404 | head.layers.38                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.7935449 |   46.8245201 |    99.9816895 |     32761.5000992 |        0.0133767 |         0.0152590 |      100.0000000 |         1.2787061 |        47.2641716 |          0.6503971 |
| 1405 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0244144 | -0.0373479 |   26.0695057 |   821.3240356 |     33640.9191803 |     -315.9518433 |      -180.0076294 |      323.8444214 |       518.0010376 |        -1.6846073 |         10.7997427 |
| 1406 | head.layers.38.point_mul                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.4588337 | 1402.8511963 | 32374.4414062 | 106082950.9026290 |   -31595.1835938 |       -10.0001526 |    32384.4414062 |         9.9998474 |       621.3420410 |          2.1499586 |
| 1407 | head.layers.38                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.4613630 |    0.6141123 |     2.1998703 |      7208.4251644 |       -1.1000000 |        -1.0998703 |        1.1000000 |         1.0998703 |         0.0992231 |          0.2038586 |
| 1408 | head.layers.38                                 | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([156, 512, 8, 2])      | qint16        | 0.0003052 |  0.4613630 |    0.6141123 |     2.1998703 |      7208.4251644 |       -1.1000000 |        -1.0998703 |        1.1000000 |         1.0998703 |         0.0992231 |          0.2038586 |
| 1409 | head.layers.38                                 | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer            | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer      | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.0000000 |    0.4684153 |    54.5818253 |        54.5818253 |      -54.5818253 |         0.0000000 |       49.7408180 |         0.0000000 |         0.0158534 |          0.0000000 |
| 1410 | head.layers.38.feat_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.0000000 |    0.4684153 |    54.5818253 |        54.5818253 |      -54.5818253 |         0.0000000 |       49.7408180 |         0.0000000 |         0.0158534 |          0.0000000 |
| 1411 | head.layers.38                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 256, 512, 8])  | qint8         | 1.0000000 |  0.0000000 |    0.4684153 |    54.5818253 |        54.5818253 |      -54.5818253 |         0.0000000 |       49.7408180 |         0.0000000 |         0.0158534 |          0.0000000 |
| 1412 | head.layers.38                                 | torch.Tensor.permute                                                          | torch.Tensor.permute                                                    | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.0000000 |    0.4684153 |    54.5818253 |        54.5818253 |      -54.5818253 |         0.0000000 |       49.7408180 |         0.0000000 |         0.0158534 |          0.0000000 |
| 1413 | head.layers.38                                 | torch.Tensor.contiguous                                                       | torch.Tensor.contiguous                                                 | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.0000000 |    0.4684153 |    54.5818253 |        54.5818253 |      -54.5818253 |         0.0000000 |       49.7408180 |         0.0000000 |         0.0158534 |          0.0000000 |
| 1414 | head.layers.38                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.0000000 |    0.4684153 |    54.5818253 |        54.5818253 |      -54.5818253 |         0.0000000 |       49.7408180 |         0.0000000 |         0.0158534 |          0.0000000 |
| 1415 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 48, 8, 1])   | qint8         | 1.0000000 |  0.0000000 |    0.0208333 |     0.9191344 |         0.9191344 |        0.0000001 |         0.0000000 |        0.9191344 |         0.0000000 |         0.0208333 |          0.0000000 |
| 1416 | head.layers.38                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.0000000 |    0.4684153 |    54.5818253 |        54.5818253 |      -54.5818253 |         0.0000000 |       49.7408180 |         0.0000000 |         0.0158534 |          0.0000000 |
| 1417 | head.layers.38.feat_mul                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.0000000 |    0.0053601 |     6.3004990 |         6.3004990 |       -6.3004990 |         0.0000000 |        5.0457811 |         0.0000000 |         0.0000271 |          0.0000000 |
| 1418 | head.layers.38                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.0000000 |    0.0053601 |     6.3004990 |         6.3004990 |       -6.3004990 |         0.0000000 |        5.0457811 |         0.0000000 |         0.0000271 |          0.0000000 |
| 1419 | head.layers.38.feat_sum                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.2109525 |    10.7476654 |        10.7476654 |      -10.7476654 |         0.0000000 |        7.9030981 |         0.0000000 |         0.0013010 |          0.0000000 |
| 1420 | head.layers.38.output_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3329905 |     7.6902881 |         7.6902881 |       -7.3575673 |         0.0000000 |        7.6902881 |         0.0000000 |        -0.0153247 |          0.0000000 |
| 1421 | head.layers.38.proj_drop                       | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.3329905 |     7.6902881 |         7.6902881 |       -7.3575673 |         0.0000000 |        7.6902881 |         0.0000000 |        -0.0153247 |          0.0000000 |
| 1422 | head.layers.38.residual_op                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.4617367 |     7.6902881 |         7.6902881 |       -7.3575673 |         0.0000000 |        7.6902881 |         0.0000000 |        -0.0082484 |          0.0000000 |
| 1423 | head.layers.39.pre_norm                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.0000000 |    0.5702813 |     8.2140551 |         8.2140551 |       -8.2140551 |         0.0000000 |        7.6695199 |         0.0000000 |        -0.0010345 |          0.0000000 |
| 1424 | head.layers.39.layers.0.0                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0000000 |    2.4649019 |    13.5714827 |        13.5714827 |      -12.0732155 |         0.0000000 |       13.5714827 |         0.0000000 |        -1.7987701 |          0.0000000 |
| 1425 | head.layers.39.layers.0.2                      | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0000000 |    0.3330658 |    13.5714827 |        13.5714827 |        0.0000000 |         0.0000000 |       13.5714827 |         0.0000000 |         0.3330658 |          0.0000000 |
| 1426 | head.layers.39.layers.1                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    3.7203572 |    43.8182297 |        43.8182297 |      -43.8182297 |         0.0000000 |       36.6201744 |         0.0000000 |         0.1520285 |          0.0000000 |
| 1427 | head.layers.39.layers.2                        | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    3.7203572 |    43.8182297 |        43.8182297 |      -43.8182297 |         0.0000000 |       36.6201744 |         0.0000000 |         0.1520285 |          0.0000000 |
| 1428 | head.layers.39.identity_fc                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1429 | head.layers.39.short_add                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.linear.LinearAdd                          | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    5.6667547 |    44.1741562 |        44.1741562 |      -44.1741562 |         0.0000000 |       40.5494537 |         0.0000000 |         0.1798285 |          0.0000000 |
| 1430 | head.layers.40                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.7639614 |     4.9356894 |         4.9356894 |       -4.9356894 |         0.0000000 |        3.4371281 |         0.0000000 |        -0.0064360 |          0.0000000 |
| 1431 | head.layers.41.add1                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.0075266 |     7.7023649 |         7.7023649 |       -4.3490496 |         0.0000000 |        7.7023649 |         0.0000000 |         0.0285122 |          0.0000000 |
| 1432 | head.layers.41.layers.0                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.3008482 |     8.4195204 |         8.4195204 |       -8.4195204 |         0.0000000 |        7.9042425 |         0.0000000 |        -0.4808555 |          0.0000000 |
| 1433 | head.layers.41.layers.1                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4099963 |     7.9042425 |         7.9042425 |        0.0000000 |         0.0000000 |        7.9042425 |         0.0000000 |         0.4099963 |          0.0000000 |
| 1434 | head.layers.41.layers.2                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.1615865 |     8.8047609 |         8.8047609 |       -8.8047609 |         0.0000000 |        7.7995653 |         0.0000000 |        -0.7386702 |          0.0000000 |
| 1435 | head.layers.41.layers.3                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.2114582 |     7.7995653 |         7.7995653 |        0.0000000 |         0.0000000 |        7.7995653 |         0.0000000 |         0.2114582 |          0.0000000 |
| 1436 | head.layers.41.layers.4                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6447034 |     9.0624790 |         9.0624790 |       -0.6063811 |         0.0000000 |        9.0624790 |         0.0000000 |         0.0317844 |          0.0000000 |
| 1437 | head.layers.41.layers.5                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.3446724 |     7.0157652 |         7.0157652 |       -7.0157652 |         0.0000000 |        6.3375840 |         0.0000000 |        -0.8438421 |          0.0000000 |
| 1438 | head.layers.41.layers.6                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.2504151 |     6.3375840 |         6.3375840 |        0.0000000 |         0.0000000 |        6.3375840 |         0.0000000 |         0.2504151 |          0.0000000 |
| 1439 | head.layers.41.layers.7                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.8586369 |    14.5972347 |        14.5972347 |       -4.0572543 |         0.0000000 |       14.5972347 |         0.0000000 |        -0.5396904 |          0.0000000 |
| 1440 | head.layers.41.layers.8                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.1594732 |    14.5972347 |        14.5972347 |        0.0000000 |         0.0000000 |       14.5972347 |         0.0000000 |         0.1594732 |          0.0000000 |
| 1441 | head.layers.41.layers.9                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4656981 |    13.3125353 |        13.3125353 |       -0.8111324 |         0.0000000 |       13.3125353 |         0.0000000 |         0.0209470 |          0.0000000 |
| 1442 | head.layers.41.layers.10                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 11])         | qint8         | 1.0000000 |  0.0000000 |    0.7906690 |     9.2959175 |         9.2959175 |       -8.4591856 |         0.0000000 |        9.2959175 |         0.0000000 |        -0.0131134 |          0.0000000 |
| 1443 | head.layers.41.layers.11.scale_quant_stub      | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([11])                  | qint16        | 1.0000000 |  0.9881581 |    0.0262975 |     0.1453433 |         0.1453433 |        0.0019809 |         0.0000000 |        0.9931466 |         1.0000000 |         0.1159605 |          0.0909091 |
| 1444 | head.layers.41.layers.11.mul                   | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.0000000 |    0.0490851 |     1.3510996 |         1.3510996 |       -1.2294862 |         0.0000000 |        1.3510996 |         0.0000000 |        -0.0167264 |          0.0000000 |
| 1445 | head.layers.41.add2                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.5510540 |    5.2238426 |   112.6496506 |       112.6496506 |      -61.4356117 |       -58.0000000 |       67.1256485 |        60.0000000 |         0.6362703 |          0.6780485 |
| 1446 | head.layers.41.cls_layers.0                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    2.0447636 |    10.6430969 |        10.6430969 |       -9.4580460 |         0.0000000 |       10.6430969 |         0.0000000 |        -0.4075266 |          0.0000000 |
| 1447 | head.layers.41.cls_layers.1                    | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.8186185 |    10.6430969 |        10.6430969 |        0.0000000 |         0.0000000 |       10.6430969 |         0.0000000 |         0.8186186 |          0.0000000 |
| 1448 | head.layers.41.cls_layers.2                    | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6569315 |     6.5887203 |         6.5887203 |       -0.7511225 |         0.0000000 |        6.5887203 |         0.0000000 |         0.0418928 |          0.0000000 |
| 1449 | head.layers.41.cls_layers.3                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.8174965 |    14.4964895 |        14.4964895 |       -9.3974276 |         0.0000000 |       14.4964895 |         0.0000000 |        -0.8643701 |          0.0000000 |
| 1450 | head.layers.41.cls_layers.4                    | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.4765632 |    14.4964895 |        14.4964895 |        0.0000000 |         0.0000000 |       14.4964895 |         0.0000000 |         0.4765633 |          0.0000000 |
| 1451 | head.layers.41.cls_layers.5                    | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.5404656 |     9.5154924 |         9.5154924 |       -0.7832292 |         0.0000000 |        9.5154924 |         0.0000000 |         0.0257115 |          0.0000000 |
| 1452 | head.layers.41.cls_layers.6                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 4])          | torch.float32 |           |  0.9733738 |    1.7015612 |     8.0015774 |                   |       -8.2513523 |        -4.5786386 |        3.4624825 |        -4.5390954 |        -5.9018373 |         -4.5660830 |
| 1453 | head.layers.41.quality_layers.0                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.9945945 |    13.3002062 |        13.3002062 |      -13.3002062 |         0.0000000 |       10.8907080 |         0.0000000 |        -0.6515269 |          0.0000000 |
| 1454 | head.layers.41.quality_layers.1                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.6715339 |    10.8907080 |        10.8907080 |        0.0000000 |         0.0000000 |       10.8907080 |         0.0000000 |         0.6715339 |          0.0000000 |
| 1455 | head.layers.41.quality_layers.2                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.7048533 |     7.7405000 |         7.7405000 |       -0.9491971 |         0.0000000 |        7.7405000 |         0.0000000 |         0.0151683 |          0.0000000 |
| 1456 | head.layers.41.quality_layers.3                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    1.7347682 |    28.7623215 |        28.7623215 |      -11.1005507 |         0.0000000 |       28.7623215 |         0.0000000 |        -1.1997699 |          0.0000000 |
| 1457 | head.layers.41.quality_layers.4                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.2674991 |    28.7623215 |        28.7623215 |        0.0000000 |         0.0000000 |       28.7623215 |         0.0000000 |         0.2674991 |          0.0000000 |
| 1458 | head.layers.41.quality_layers.5                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 | -0.1413874 |    0.2589698 |     9.5358448 |         9.5358448 |       -0.8803175 |        -1.0000000 |        9.5358448 |         0.0000000 |         0.0411248 |         -0.0078125 |
| 1459 | head.layers.41.quality_layers.6                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 2])          | torch.float32 |           | -0.9311447 |    3.7917023 |    11.1701527 |                   |       -3.8205884 |        -0.7609041 |       10.4092484 |        -0.0321188 |         3.0236740 |         -0.3965115 |
| 1460 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.5510540 |    5.2238426 |   112.6496506 |                   |      -61.4356117 |       -58.0000000 |       67.1256485 |        60.0000000 |         0.6362703 |          0.6780485 |
| 1461 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 4])          | torch.float32 |           |  0.9733738 |    1.7015612 |     8.0015774 |                   |       -8.2513523 |        -4.5786386 |        3.4624825 |        -4.5390954 |        -5.9018373 |         -4.5660830 |
| 1462 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 2])          | torch.float32 |           | -0.9311447 |    3.7917023 |    11.1701527 |                   |       -3.8205884 |        -0.7609041 |       10.4092484 |        -0.0321188 |         3.0236740 |         -0.3965115 |
| 1463 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 4])          | torch.float32 |           |  0.9733738 |    1.7015612 |     8.0015774 |                   |       -8.2513523 |        -4.5786386 |        3.4624825 |        -4.5390954 |        -5.9018373 |         -4.5660830 |
| 1464 | head.instance_bank.dequant                     | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.5510540 |    5.2238426 |   112.6496506 |                   |      -61.4356117 |       -58.0000000 |       67.1256485 |        60.0000000 |         0.6362703 |          0.6780485 |
| 1465 | head.instance_bank.dequant                     | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 256])        | torch.float32 |           |  0.0000000 |    0.7639614 |     4.9356894 |                   |       -4.9356894 |         0.0000000 |        3.4371281 |         0.0000000 |        -0.0064360 |          0.0000000 |
| 1466 | head                                           | torch.Tensor.detach                                                           | torch.Tensor.detach                                                     | torch.Size([26, 512, 256])        | torch.float32 |           |  0.0000000 |    0.7639614 |     4.9356894 |                   |       -4.9356894 |         0.0000000 |        3.4371281 |         0.0000000 |        -0.0064360 |          0.0000000 |
| 1467 | head                                           | torch.Tensor.detach                                                           | torch.Tensor.detach                                                     | torch.Size([26, 512, 11])         | torch.float32 |           |  0.5510540 |    5.2238426 |   112.6496506 |                   |      -61.4356117 |       -58.0000000 |       67.1256485 |        60.0000000 |         0.6362703 |          0.6780485 |
| 1468 | head                                           | torch.Tensor.detach                                                           | torch.Tensor.detach                                                     | torch.Size([26, 512, 4])          | torch.float32 |           |  0.9733738 |    1.7015612 |     8.0015774 |                   |       -8.2513523 |        -4.5786386 |        3.4624825 |        -4.5390954 |        -5.9018373 |         -4.5660830 |
| 1469 | head                                           | torch.Tensor.max                                                              | torch.Tensor.max                                                        | torch.Size([26, 512])             | torch.float32 |           |  0.9489148 |    1.5040385 |     8.0015774 |                   |       -7.3521481 |        -4.5390954 |        3.4624825 |        -4.5390954 |        -5.0222473 |         -4.5390954 |
| 1470 | head                                           | torch.Tensor.sigmoid                                                          | torch.Tensor.sigmoid                                                    | torch.Size([26, 512])             | torch.float32 |           |  0.3742087 |    0.0236863 |     0.9590311 |                   |        0.0006408 |         0.0105701 |        0.9696013 |         0.0105701 |         0.0247225 |          0.0105701 |
| 1471 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 128])             | torch.float32 |           |  0.2319079 |    0.0102133 |     0.5990746 |                   |        0.0010876 |         0.0105701 |        0.6096448 |         0.0105701 |         0.0092187 |          0.0105701 |
| 1472 | head                                           | torch.maximum                                                                 | torch.maximum                                                           | torch.Size([26, 128])             | torch.float32 |           |  0.6516134 |    0.0510647 |     0.5990746 |                   |        0.0156421 |         0.0105701 |        0.6096448 |         0.0105701 |         0.0616348 |          0.0105701 |
| 1473 | head                                           | torch.topk                                                                    | torch.topk                                                              | torch.Size([26, 128])             | torch.float32 |           |  0.6591042 |    0.0881939 |     0.9590311 |                   |        0.0280274 |         0.0105701 |        0.9696013 |         0.0105701 |         0.0987641 |          0.0105701 |
| 1474 | head                                           | torch.Tensor.add                                                              | torch.Tensor.add                                                        | torch.Size([26, 128])             | torch.int64   |           |  0.9997138 |  157.7061310 |   496.0000000 |                   |        0.0000000 |         0.0000000 |    13305.0000000 |     12927.0000000 |      6594.8847656 |       6463.5000000 |
| 1475 | head                                           | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([3328])                | torch.int64   |           |  0.9997138 |  157.7061310 |   496.0000000 |                   |        0.0000000 |         0.0000000 |    13305.0000000 |     12927.0000000 |      6594.8847656 |       6463.5000000 |
| 1476 | head                                           | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([13312, 256])          | torch.float32 |           |  0.0000000 |    0.7639614 |     4.9356894 |                   |       -4.9356894 |         0.0000000 |        3.4371281 |         0.0000000 |        -0.0064360 |          0.0000000 |
| 1477 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([3328, 256])           | torch.float32 |           |  0.0000000 |    0.7463908 |     4.9356894 |                   |       -4.9356894 |         0.0000000 |        3.4371281 |         0.0000000 |        -0.0056891 |          0.0000000 |
| 1478 | head                                           | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 128, 256])        | torch.float32 |           |  0.0000000 |    0.7463908 |     4.9356894 |                   |       -4.9356894 |         0.0000000 |        3.4371281 |         0.0000000 |        -0.0056891 |          0.0000000 |
| 1479 | head                                           | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([13312, 11])           | torch.float32 |           |  0.5510540 |    5.2238426 |   112.6496506 |                   |      -61.4356117 |       -58.0000000 |       67.1256485 |        60.0000000 |         0.6362703 |          0.6780485 |
| 1480 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([3328, 11])            | torch.float32 |           | -0.3152977 |   12.3969450 |   112.6496506 |                   |      -60.3900108 |       -51.0000000 |       66.7594757 |         0.0000000 |         0.2073521 |         -4.8986015 |
| 1481 | head                                           | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 128, 11])         | torch.float32 |           | -0.3152977 |   12.3969450 |   112.6496506 |                   |      -60.3900108 |       -51.0000000 |       66.7594757 |         0.0000000 |         0.2073521 |         -4.8986015 |
| 1482 | head                                           | torch.cat                                                                     | torch.cat                                                               | torch.Size([26, 512])             | torch.float32 |           |  0.6026762 |    0.0829554 |     0.9723920 |                   |        0.0259288 |         0.0105701 |        0.9829622 |         0.0105701 |         0.0935256 |          0.0105701 |
| 1483 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 384])             | torch.float32 |           |  0.6233615 |    0.0860009 |     0.9719778 |                   |        0.0259288 |         0.0105701 |        0.9825479 |         0.0105701 |         0.0965711 |          0.0105701 |
| 1484 | head                                           | torch.cat                                                                     | torch.cat                                                               | torch.Size([26, 512, 256])        | torch.float32 |           |  0.0000000 |    0.5728117 |     4.9356894 |                   |       -4.9356894 |         0.0000000 |        3.5256600 |         0.0000000 |        -0.0042508 |          0.0000000 |
| 1485 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 384, 256])        | torch.float32 |           |  0.0000000 |    0.5921156 |     4.9356894 |                   |       -4.9356894 |         0.0000000 |        3.5052328 |         0.0000000 |        -0.0044061 |          0.0000000 |
| 1486 | head                                           | torch.cat                                                                     | torch.cat                                                               | torch.Size([26, 512, 11])         | torch.float32 |           | -0.4127282 |   11.7189655 |   140.5460510 |                   |      -88.4849091 |       -51.2515602 |      112.4251709 |         0.0000000 |         1.1882014 |         -4.9083738 |
| 1487 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 384, 11])         | torch.float32 |           | -0.4118236 |   11.9249592 |   140.5460510 |                   |      -88.4849091 |       -51.2515602 |      112.4251709 |         0.0000000 |         1.1962492 |         -4.9024920 |
| 1488 |                                                | torch.Tensor.sigmoid                                                          | torch.Tensor.sigmoid                                                    | torch.Size([26, 512, 4])          | torch.float32 |           |  0.2830507 |    0.0123149 |     0.9590311 |                   |        0.0002608 |         0.0101645 |        0.9696013 |         0.0105701 |         0.0095667 |          0.0102928 |
| 1489 |                                                | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([26, 2048])            | torch.float32 |           |  0.2830507 |    0.0123149 |     0.9590311 |                   |        0.0002608 |         0.0101645 |        0.9696013 |         0.0105701 |         0.0095667 |          0.0102928 |
| 1490 |                                                | torch.Tensor.topk                                                             | torch.Tensor.topk                                                       | torch.Size([26, 300])             | torch.float32 |           |  0.5230796 |    0.0370488 |     0.9590311 |                   |        0.0060306 |         0.0105701 |        0.9696013 |         0.0105701 |         0.0462789 |          0.0105701 |
| 1491 |                                                | torch.Tensor.remainder                                                        | torch.Tensor.remainder                                                  | torch.Size([26, 300])             | torch.int64   |           |  0.0000000 |    0.8407692 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8407692 |          0.0000000 |
| 1492 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512])             | torch.float32 |           |  0.1079396 |    0.6633422 |     2.4385896 |                   |       -2.4707084 |        -0.0321188 |        2.1072216 |        -0.0321188 |        -0.0952053 |         -0.0321188 |
| 1493 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([26, 300])             | torch.int64   |           |  0.7017321 |  192.1361542 |   498.0000000 |                   |        0.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       265.8623047 |        149.5000000 |
| 1494 |                                                | torch.gather                                                                  | torch.gather                                                            | torch.Size([26, 300])             | torch.float32 |           |  0.4146920 |    0.4200929 |     2.4230037 |                   |       -2.4551225 |        -0.0321188 |        2.1072216 |        -0.0321188 |        -0.2487996 |         -0.0321188 |
| 1495 |                                                | torch.Tensor.sigmoid                                                          | torch.Tensor.sigmoid                                                    | torch.Size([26, 300])             | torch.float32 |           |  0.9651351 |    0.0976944 |     0.4129062 |                   |        0.0790648 |         0.4919710 |        0.8916031 |         0.4919710 |         0.4431795 |          0.4919710 |
| 1496 |                                                | torch.Tensor.mul_                                                             | torch.Tensor.mul_                                                       | torch.Size([26, 300])             | torch.float32 |           |  0.3925943 |    0.0192537 |     0.8508317 |                   |        0.0006062 |         0.0052002 |        0.8560320 |         0.0052002 |         0.0233135 |          0.0052002 |
| 1497 |                                                | torch.sort                                                                    | torch.sort                                                              | torch.Size([26, 300])             | torch.float32 |           |  0.3925943 |    0.0192537 |     0.8508317 |                   |        0.0006062 |         0.0052002 |        0.8560320 |         0.0052002 |         0.0233135 |          0.0052002 |
| 1498 |                                                | torch.gather                                                                  | torch.gather                                                            | torch.Size([26, 300])             | torch.int64   |           |  0.0000000 |    0.8407692 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8407692 |          0.0000000 |
| 1499 |                                                | torch.gather                                                                  | torch.gather                                                            | torch.Size([26, 300])             | torch.int64   |           |  0.7070124 |  760.0258789 |  1988.0000000 |                   |        0.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |      1064.2899170 |        598.0000000 |
| 1500 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7700000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7700000 |          0.0000000 |
| 1501 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.3142821 |    0.0139539 |     0.7443404 |                   |        0.0006455 |         0.0052002 |        0.7495406 |         0.0052002 |         0.0172254 |          0.0052002 |
| 1502 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6008523 |  773.4500122 |  1884.0000000 |                   |        0.0000000 |         0.0000000 |     2028.0000000 |      1196.0000000 |       885.9967041 |        598.0000000 |
| 1503 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6001885 |  193.4933319 |   471.0000000 |                   |        0.0000000 |         0.0000000 |      507.0000000 |       299.0000000 |       221.3066711 |        149.5000000 |
| 1504 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0260679 |   10.6833868 |   117.7887650 |                   |      -60.3647995 |       -58.0000000 |       61.8769302 |        60.0000000 |        -0.5927907 |         -0.5678788 |
| 1505 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0047010 |     0.0637286 |                   |       -0.0185777 |         0.0000000 |        0.0637286 |         0.0000000 |         0.0003481 |          0.0000000 |
| 1506 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7706615 |    1.6183158 |     2.2608452 |                   |       -1.2608452 |         0.0000000 |       -0.8565107 |         1.0000000 |        -1.0449827 |          0.5733333 |
| 1507 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1372261 |     3.1415904 |                   |       -3.1415904 |         0.0000000 |        3.1415868 |         0.0000000 |        -1.5713127 |          0.0000000 |
| 1508 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0331067 |   26.5641327 |   117.7887650 |                   |      -60.3647995 |       -58.0000000 |       61.8769302 |        60.0000000 |         6.9887891 |         -3.0366666 |
| 1509 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7883775 |    0.5535663 |     1.7475767 |                   |        0.4625443 |         0.0000000 |        2.6109724 |         2.0000000 |         1.0309188 |          0.7633334 |
| 1510 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8696632 |    1.4150385 |     6.2232246 |                   |        1.5881094 |         1.0000000 |       13.6122808 |         7.3890562 |         3.1939926 |          2.8758757 |
| 1511 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1372261 |     3.1415904 |                   |       -3.1415904 |         0.0000000 |        3.1415868 |         0.0000000 |        -1.5713127 |          0.0000000 |
| 1512 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   11.5137167 |    45.2345657 |                   |      -45.2345657 |         0.0000000 |       12.2668753 |         0.0000000 |        -9.8450623 |          0.0000000 |
| 1513 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0109147 |   12.1615887 |   117.7887650 |                   |      -60.3647995 |       -58.0000000 |       61.8769302 |        60.0000000 |        -0.0558155 |         -0.0482373 |
| 1514 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0109147 |   12.1615887 |   117.7887650 |                   |      -60.3647995 |       -58.0000000 |       61.8769302 |        60.0000000 |        -0.0558155 |         -0.0482373 |
| 1515 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.3142821 |    0.0139539 |     0.7443404 |                   |        0.0006455 |         0.0052002 |        0.7495406 |         0.0052002 |         0.0172254 |          0.0052002 |
| 1516 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7700000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7700000 |          0.0000000 |
| 1517 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8100000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8100000 |          0.0000000 |
| 1518 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7187200 |    0.0311430 |     0.4002290 |                   |        0.0049057 |         0.0052002 |        0.4054292 |         0.0052002 |         0.0363412 |          0.0052002 |
| 1519 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8275746 |  733.4166870 |  1944.0000000 |                   |      512.0000000 |         0.0000000 |     2031.0000000 |      1196.0000000 |      1286.5833740 |        598.0000000 |
| 1520 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8272923 |  183.2500000 |   486.0000000 |                   |      128.0000000 |         0.0000000 |      507.0000000 |       299.0000000 |       321.4433289 |        149.5000000 |
| 1521 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0446537 |    8.8889227 |   119.1058655 |                   |      -59.4715691 |       -58.0000000 |       66.4969101 |        60.0000000 |        -2.3737054 |          1.3715152 |
| 1522 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0006909 |     0.0093084 |                   |       -0.0053347 |         0.0000000 |        0.0093084 |         0.0000000 |         0.0000579 |          0.0000000 |
| 1523 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7561354 |    1.5717725 |     2.0023100 |                   |       -1.0023100 |         0.0000000 |       -0.5993444 |         1.0000000 |        -0.9984391 |          0.5733333 |
| 1524 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1408961 |     3.1415837 |                   |       -3.1415837 |         0.0000000 |        3.1415789 |         0.0000000 |         0.1884425 |          0.0000000 |
| 1525 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0565090 |   20.7446461 |   119.1058655 |                   |      -59.4715691 |       -58.0000000 |       66.4969101 |        60.0000000 |        -0.6146175 |          4.0744448 |
| 1526 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7639625 |    0.6236274 |     2.5862415 |                   |        0.4433556 |         0.0000000 |        2.5862415 |         2.0000000 |         0.9965397 |          0.7633334 |
| 1527 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8264950 |    1.6971831 |    12.2797661 |                   |        1.5579262 |         1.0000000 |       13.2797661 |         7.3890562 |         3.2010901 |          2.8758757 |
| 1528 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1408961 |     3.1415837 |                   |       -3.1415837 |         0.0000000 |        3.1415789 |         0.0000000 |         0.1884425 |          0.0000000 |
| 1529 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   10.7002869 |    45.8962822 |                   |      -45.8962822 |         0.0000000 |       13.9206505 |         0.0000000 |        -8.7527151 |          0.0000000 |
| 1530 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0258658 |   10.2567244 |   119.1058655 |                   |      -59.4715691 |       -58.0000000 |       66.4969101 |        60.0000000 |        -1.8310283 |          2.0850961 |
| 1531 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0258658 |   10.2567244 |   119.1058655 |                   |      -59.4715691 |       -58.0000000 |       66.4969101 |        60.0000000 |        -1.8310285 |          2.0850961 |
| 1532 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7187200 |    0.0311430 |     0.4002290 |                   |        0.0049057 |         0.0052002 |        0.4054292 |         0.0052002 |         0.0363412 |          0.0052002 |
| 1533 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8100000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8100000 |          0.0000000 |
| 1534 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.9666666 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.9666667 |          0.0000000 |
| 1535 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.2747777 |    0.0102738 |     0.6589702 |                   |        0.0006590 |         0.0052002 |        0.6641704 |         0.0052002 |         0.0137449 |          0.0052002 |
| 1536 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6926345 |  768.3333130 |  1988.0000000 |                   |        0.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |      1034.1400146 |        598.0000000 |
| 1537 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6920295 |  192.1199951 |   497.0000000 |                   |        0.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       258.2933350 |        149.5000000 |
| 1538 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0402363 |   13.0362310 |   119.3208466 |                   |      -60.0560379 |       -58.0000000 |       61.6467171 |        60.0000000 |        -1.8797288 |         -4.2527275 |
| 1539 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0052169 |     0.0703120 |                   |       -0.0155058 |         0.0000000 |        0.0703120 |         0.0000000 |         0.0030339 |          0.0000000 |
| 1540 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7625819 |    1.5983971 |     2.3144379 |                   |       -1.3144380 |         0.0000000 |       -0.8345767 |         1.0000000 |        -1.0250638 |          0.5733333 |
| 1541 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1365540 |     3.1415873 |                   |       -3.1415870 |         0.0000000 |        3.1415873 |         0.0000000 |        -1.1339911 |          0.0000000 |
| 1542 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0504205 |   35.5462761 |   119.3208466 |                   |      -60.0560379 |       -58.0000000 |       61.6467171 |        60.0000000 |         1.4922811 |        -16.5477791 |
| 1543 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7760520 |    0.5720940 |     2.6532729 |                   |        0.4362316 |         0.0000000 |        2.6532729 |         2.0000000 |         1.0286939 |          0.7633334 |
| 1544 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8427415 |    1.5023655 |    13.2004385 |                   |        1.5468670 |         1.0000000 |       14.2004385 |         7.3890562 |         3.2329116 |          2.8758757 |
| 1545 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1365540 |     3.1415873 |                   |       -3.1415870 |         0.0000000 |        3.1415873 |         0.0000000 |        -1.1339911 |          0.0000000 |
| 1546 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   11.1466055 |    40.4193001 |                   |      -40.4193001 |         0.0000000 |       15.2040329 |         0.0000000 |        -9.0726366 |          0.0000000 |
| 1547 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0298965 |   14.7722301 |   119.3208466 |                   |      -60.0560379 |       -58.0000000 |       61.6467171 |        60.0000000 |        -1.4176325 |         -4.1015701 |
| 1548 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0298965 |   14.7722301 |   119.3208466 |                   |      -60.0560379 |       -58.0000000 |       61.6467171 |        60.0000000 |        -1.4176325 |         -4.1015711 |
| 1549 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.2747777 |    0.0102738 |     0.6589702 |                   |        0.0006590 |         0.0052002 |        0.6641704 |         0.0052002 |         0.0137449 |          0.0052002 |
| 1550 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.9666666 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.9666666 |          0.0000000 |
| 1551 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8900000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8900000 |          0.0000000 |
| 1552 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.5034819 |    0.0352184 |     0.7595077 |                   |        0.0030732 |         0.0052002 |        0.7647079 |         0.0052002 |         0.0403952 |          0.0052002 |
| 1553 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8257926 |  734.4566650 |  1948.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |      1281.6234131 |        598.0000000 |
| 1554 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8254972 |  183.4966736 |   487.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       320.1833496 |        149.5000000 |
| 1555 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0461158 |    8.9256220 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -2.2652466 |          1.3715152 |
| 1556 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0010053 |     0.0787874 |                   |       -0.0279753 |         0.0000000 |        0.0787874 |         0.0000000 |         0.0002803 |          0.0000000 |
| 1557 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7565420 |    1.5722388 |     2.0020328 |                   |       -1.0020328 |         0.0000000 |       -0.8249742 |         1.0000000 |        -0.9989055 |          0.5733333 |
| 1558 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1405218 |     3.1415746 |                   |       -3.1415746 |         0.0000000 |        3.1415720 |         0.0000000 |         0.5232612 |          0.0000000 |
| 1559 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0570010 |   21.2927151 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -0.6124920 |          4.0744448 |
| 1560 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7625993 |    0.6259766 |     2.6107976 |                   |        0.4520407 |         0.0000000 |        2.6107976 |         2.0000000 |         1.0018735 |          0.7633334 |
| 1561 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8236331 |    1.7123199 |    12.6099024 |                   |        1.5715159 |         1.0000000 |       13.6099024 |         7.3890562 |         3.2226293 |          2.8758757 |
| 1562 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1405218 |     3.1415746 |                   |       -3.1415746 |         0.0000000 |        3.1415720 |         0.0000000 |         0.5232612 |          0.0000000 |
| 1563 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   10.2841749 |    43.9282455 |                   |      -43.9282455 |         0.0000000 |       11.7817955 |         0.0000000 |        -8.3624105 |          0.0000000 |
| 1564 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0274685 |   10.3008146 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -1.6733559 |          2.0850961 |
| 1565 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0274685 |   10.3008146 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -1.6733559 |          2.0850961 |
| 1566 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.5034819 |    0.0352184 |     0.7595077 |                   |        0.0030732 |         0.0052002 |        0.7647079 |         0.0052002 |         0.0403952 |          0.0052002 |
| 1567 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8900000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8900000 |          0.0000000 |
| 1568 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7066666 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7066667 |          0.0000000 |
| 1569 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.4731478 |    0.0150905 |     0.3183137 |                   |        0.0015417 |         0.0052002 |        0.3235140 |         0.0052002 |         0.0189348 |          0.0052002 |
| 1570 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.5502974 |  828.2199707 |  1844.0000000 |                   |        0.0000000 |         0.0000000 |     2028.0000000 |      1196.0000000 |       896.8266602 |        598.0000000 |
| 1571 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.5496368 |  207.1766663 |   461.0000000 |                   |        0.0000000 |         0.0000000 |      507.0000000 |       299.0000000 |       224.0299988 |        149.5000000 |
| 1572 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0261537 |   12.9837036 |   119.1261597 |                   |      -59.9743538 |       -58.0000000 |       66.1661148 |        60.0000000 |         0.1157178 |         -2.7400000 |
| 1573 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0095206 |     0.0642429 |                   |       -0.0402521 |         0.0000000 |        0.0642429 |         0.0000000 |        -0.0042980 |          0.0000000 |
| 1574 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7724742 |    1.6133493 |     2.2805219 |                   |       -1.2805220 |         0.0000000 |       -0.8972044 |         1.0000000 |        -1.0400161 |          0.5733333 |
| 1575 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1329558 |     3.1415901 |                   |       -3.1415901 |         0.0000000 |        3.1415870 |         0.0000000 |        -1.2531065 |          0.0000000 |
| 1576 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0329333 |   33.6670609 |   119.1261597 |                   |      -59.9743538 |       -58.0000000 |       66.1661148 |        60.0000000 |        11.1798487 |        -11.0011110 |
| 1577 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7900258 |    0.5418662 |     1.7380401 |                   |        0.4442392 |         0.0000000 |        2.6017427 |         2.0000000 |         1.0019368 |          0.7633334 |
| 1578 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8752584 |    1.3686107 |     6.0981665 |                   |        1.5593034 |         1.0000000 |       13.4872227 |         7.3890562 |         3.0763197 |          2.8758757 |
| 1579 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1329558 |     3.1415901 |                   |       -3.1415901 |         0.0000000 |        3.1415870 |         0.0000000 |        -1.2531065 |          0.0000000 |
| 1580 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   12.8570337 |    46.0273209 |                   |      -46.0273209 |         0.0000000 |       12.6311359 |         0.0000000 |       -11.4093819 |          0.0000000 |
| 1581 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0161899 |   14.6811066 |   119.1261597 |                   |      -59.9743538 |       -58.0000000 |       66.1661148 |        60.0000000 |         0.7287253 |         -2.4375706 |
| 1582 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0161899 |   14.6811066 |   119.1261597 |                   |      -59.9743538 |       -58.0000000 |       66.1661148 |        60.0000000 |         0.7287252 |         -2.4375706 |
| 1583 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.4731478 |    0.0150905 |     0.3183137 |                   |        0.0015417 |         0.0052002 |        0.3235140 |         0.0052002 |         0.0189348 |          0.0052002 |
| 1584 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7066666 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7066666 |          0.0000000 |
| 1585 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8900000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8900000 |          0.0000000 |
| 1586 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.5034819 |    0.0352184 |     0.7595077 |                   |        0.0030732 |         0.0052002 |        0.7647079 |         0.0052002 |         0.0403952 |          0.0052002 |
| 1587 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8257926 |  734.4566650 |  1948.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |      1281.6234131 |        598.0000000 |
| 1588 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8254972 |  183.4966736 |   487.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       320.1833496 |        149.5000000 |
| 1589 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0461158 |    8.9256220 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -2.2652466 |          1.3715152 |
| 1590 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0010053 |     0.0787874 |                   |       -0.0279753 |         0.0000000 |        0.0787874 |         0.0000000 |         0.0002803 |          0.0000000 |
| 1591 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7565420 |    1.5722388 |     2.0020328 |                   |       -1.0020328 |         0.0000000 |       -0.8249742 |         1.0000000 |        -0.9989055 |          0.5733333 |
| 1592 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1405218 |     3.1415746 |                   |       -3.1415746 |         0.0000000 |        3.1415720 |         0.0000000 |         0.5232612 |          0.0000000 |
| 1593 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0570010 |   21.2927151 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -0.6124920 |          4.0744448 |
| 1594 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7625993 |    0.6259766 |     2.6107976 |                   |        0.4520407 |         0.0000000 |        2.6107976 |         2.0000000 |         1.0018735 |          0.7633334 |
| 1595 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8236331 |    1.7123199 |    12.6099024 |                   |        1.5715159 |         1.0000000 |       13.6099024 |         7.3890562 |         3.2226293 |          2.8758757 |
| 1596 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1405218 |     3.1415746 |                   |       -3.1415746 |         0.0000000 |        3.1415720 |         0.0000000 |         0.5232612 |          0.0000000 |
| 1597 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   10.2841749 |    43.9282455 |                   |      -43.9282455 |         0.0000000 |       11.7817955 |         0.0000000 |        -8.3624105 |          0.0000000 |
| 1598 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0274685 |   10.3008146 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -1.6733559 |          2.0850961 |
| 1599 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0274685 |   10.3008146 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -1.6733559 |          2.0850961 |
| 1600 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.5034819 |    0.0352184 |     0.7595077 |                   |        0.0030732 |         0.0052002 |        0.7647079 |         0.0052002 |         0.0403952 |          0.0052002 |
| 1601 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8900000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8900000 |          0.0000000 |
| 1602 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8166667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8166667 |          0.0000000 |
| 1603 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.2835294 |    0.0109666 |     0.6942171 |                   |        0.0006973 |         0.0052002 |        0.6994173 |         0.0052002 |         0.0147005 |          0.0052002 |
| 1604 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6446701 |  778.0300293 |  1976.0000000 |                   |        0.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |       973.3633423 |        598.0000000 |
| 1605 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6440906 |  194.5633392 |   494.0000000 |                   |        0.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       243.1366730 |        149.5000000 |
| 1606 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0268621 |   10.1254902 |   119.6064224 |                   |      -60.0493546 |       -58.0000000 |       61.6582184 |        60.0000000 |        -1.7246748 |         -0.4903030 |
| 1607 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0064674 |     0.0686461 |                   |       -0.0148973 |         0.0000000 |        0.0686461 |         0.0000000 |         0.0045306 |          0.0000000 |
| 1608 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7668473 |    1.6006831 |     2.3037853 |                   |       -1.3037853 |         0.0000000 |       -0.8248336 |         1.0000000 |        -1.0273497 |          0.5733333 |
| 1609 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1353467 |     3.1415825 |                   |       -3.1415825 |         0.0000000 |        3.1415670 |         0.0000000 |        -0.7374787 |          0.0000000 |
| 1610 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0340271 |   25.3798370 |   119.6064224 |                   |      -60.0493546 |       -58.0000000 |       61.6582184 |        60.0000000 |         1.3384951 |         -2.7522223 |
| 1611 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7916043 |    0.5443582 |     2.6645815 |                   |        0.4368647 |         0.0000000 |        2.6645815 |         2.0000000 |         1.0532647 |          0.7633334 |
| 1612 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8562165 |    1.4279649 |    13.3619385 |                   |        1.5478467 |         1.0000000 |       14.3619385 |         7.3890562 |         3.3173113 |          2.8758757 |
| 1613 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1353467 |     3.1415825 |                   |       -3.1415825 |         0.0000000 |        3.1415670 |         0.0000000 |        -0.7374787 |          0.0000000 |
| 1614 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   10.6668873 |    39.7199631 |                   |      -39.7199631 |         0.0000000 |       15.4871912 |         0.0000000 |        -8.3746271 |          0.0000000 |
| 1615 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0093323 |   11.5559425 |   119.6064224 |                   |      -60.0493546 |       -58.0000000 |       61.6582184 |        60.0000000 |        -1.1893942 |          0.0370960 |
| 1616 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0093323 |   11.5559425 |   119.6064224 |                   |      -60.0493546 |       -58.0000000 |       61.6582184 |        60.0000000 |        -1.1893941 |          0.0370961 |
| 1617 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.2835294 |    0.0109666 |     0.6942171 |                   |        0.0006973 |         0.0052002 |        0.6994173 |         0.0052002 |         0.0147005 |          0.0052002 |
| 1618 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8166667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8166667 |          0.0000000 |
| 1619 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.9666666 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.9666667 |          0.0000000 |
| 1620 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.2692741 |    0.0103023 |     0.6783351 |                   |        0.0006308 |         0.0052002 |        0.6835353 |         0.0052002 |         0.0137759 |          0.0052002 |
| 1621 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6940166 |  766.6599731 |  1988.0000000 |                   |        0.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |      1034.1134033 |        598.0000000 |
| 1622 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6934148 |  191.6999969 |   497.0000000 |                   |        0.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       258.2866821 |        149.5000000 |
| 1623 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0604759 |   13.1607456 |   119.5782547 |                   |      -60.1239700 |       -58.0000000 |       61.6496506 |        60.0000000 |        -1.9110966 |         -4.2527275 |
| 1624 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0050180 |     0.0696483 |                   |       -0.0158355 |         0.0000000 |        0.0696483 |         0.0000000 |         0.0027413 |          0.0000000 |
| 1625 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7625480 |    1.5984228 |     2.3158336 |                   |       -1.3158336 |         0.0000000 |       -0.8258166 |         1.0000000 |        -1.0250895 |          0.5733333 |
| 1626 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1367457 |     3.1415920 |                   |       -3.1415811 |         0.0000000 |        3.1415920 |         0.0000000 |        -1.1127665 |          0.0000000 |
| 1627 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0754927 |   36.0088959 |   119.5782547 |                   |      -60.1239700 |       -58.0000000 |       61.6496506 |        60.0000000 |         1.4801593 |        -16.5477791 |
| 1628 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7761801 |    0.5719614 |     2.6509991 |                   |        0.4384293 |         0.0000000 |        2.6509991 |         2.0000000 |         1.0292416 |          0.7633334 |
| 1629 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8426405 |    1.5028527 |    13.1681862 |                   |        1.5502703 |         1.0000000 |       14.1681862 |         7.3890562 |         3.2354975 |          2.8758757 |
| 1630 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1367457 |     3.1415920 |                   |       -3.1415811 |         0.0000000 |        3.1415920 |         0.0000000 |        -1.1127665 |          0.0000000 |
| 1631 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   11.1407318 |    40.6757736 |                   |      -40.6757736 |         0.0000000 |       15.0773544 |         0.0000000 |        -9.1759720 |          0.0000000 |
| 1632 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0498390 |   14.9094181 |   119.5782547 |                   |      -60.1239700 |       -58.0000000 |       61.6496506 |        60.0000000 |        -1.4493712 |         -4.1015701 |
| 1633 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0498390 |   14.9094181 |   119.5782547 |                   |      -60.1239700 |       -58.0000000 |       61.6496506 |        60.0000000 |        -1.4493712 |         -4.1015711 |
| 1634 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.2692741 |    0.0103023 |     0.6783351 |                   |        0.0006308 |         0.0052002 |        0.6835353 |         0.0052002 |         0.0137759 |          0.0052002 |
| 1635 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.9666666 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.9666666 |          0.0000000 |
| 1636 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8900000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8900000 |          0.0000000 |
| 1637 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.5034819 |    0.0352184 |     0.7595077 |                   |        0.0030732 |         0.0052002 |        0.7647079 |         0.0052002 |         0.0403952 |          0.0052002 |
| 1638 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8257926 |  734.4566650 |  1948.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |      1281.6234131 |        598.0000000 |
| 1639 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8254972 |  183.4966736 |   487.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       320.1833496 |        149.5000000 |
| 1640 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0461158 |    8.9256220 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -2.2652466 |          1.3715152 |
| 1641 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0010053 |     0.0787874 |                   |       -0.0279753 |         0.0000000 |        0.0787874 |         0.0000000 |         0.0002803 |          0.0000000 |
| 1642 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7565420 |    1.5722388 |     2.0020328 |                   |       -1.0020328 |         0.0000000 |       -0.8249742 |         1.0000000 |        -0.9989055 |          0.5733333 |
| 1643 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1405218 |     3.1415746 |                   |       -3.1415746 |         0.0000000 |        3.1415720 |         0.0000000 |         0.5232612 |          0.0000000 |
| 1644 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0570010 |   21.2927151 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -0.6124920 |          4.0744448 |
| 1645 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7625993 |    0.6259766 |     2.6107976 |                   |        0.4520407 |         0.0000000 |        2.6107976 |         2.0000000 |         1.0018735 |          0.7633334 |
| 1646 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8236331 |    1.7123199 |    12.6099024 |                   |        1.5715159 |         1.0000000 |       13.6099024 |         7.3890562 |         3.2226293 |          2.8758757 |
| 1647 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1405218 |     3.1415746 |                   |       -3.1415746 |         0.0000000 |        3.1415720 |         0.0000000 |         0.5232612 |          0.0000000 |
| 1648 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   10.2841749 |    43.9282455 |                   |      -43.9282455 |         0.0000000 |       11.7817955 |         0.0000000 |        -8.3624105 |          0.0000000 |
| 1649 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0274685 |   10.3008146 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -1.6733559 |          2.0850961 |
| 1650 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0274685 |   10.3008146 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -1.6733559 |          2.0850961 |
| 1651 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.5034819 |    0.0352184 |     0.7595077 |                   |        0.0030732 |         0.0052002 |        0.7647079 |         0.0052002 |         0.0403952 |          0.0052002 |
| 1652 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8900000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8900000 |          0.0000000 |
| 1653 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7966667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7966667 |          0.0000000 |
| 1654 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.3212234 |    0.0141909 |     0.7426779 |                   |        0.0006839 |         0.0052002 |        0.7478781 |         0.0052002 |         0.0175901 |          0.0052002 |
| 1655 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6106890 |  757.2433472 |  1888.0000000 |                   |        0.0000000 |         0.0000000 |     2028.0000000 |      1196.0000000 |       882.3699951 |        598.0000000 |
| 1656 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6100029 |  189.4533386 |   472.0000000 |                   |        0.0000000 |         0.0000000 |      507.0000000 |       299.0000000 |       220.3933411 |        149.5000000 |
| 1657 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0408274 |   10.5557957 |   118.1275406 |                   |      -60.3547630 |       -58.0000000 |       61.8795013 |        60.0000000 |        -0.6580349 |         -0.5678788 |
| 1658 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0054417 |     0.0679284 |                   |       -0.0185368 |         0.0000000 |        0.0679284 |         0.0000000 |         0.0013517 |          0.0000000 |
| 1659 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7681537 |    1.6161752 |     2.2575884 |                   |       -1.2575883 |         0.0000000 |       -0.8602105 |         1.0000000 |        -1.0428418 |          0.5733333 |
| 1660 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1364956 |     3.1415913 |                   |       -3.1415899 |         0.0000000 |        3.1415913 |         0.0000000 |        -1.2999995 |          0.0000000 |
| 1661 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0511713 |   26.3896217 |   118.1275406 |                   |      -60.3547630 |       -58.0000000 |       61.8795013 |        60.0000000 |         6.3105111 |         -3.0366666 |
| 1662 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7847802 |    0.5590828 |     1.7487859 |                   |        0.4619113 |         0.0000000 |        2.6046956 |         2.0000000 |         1.0287809 |          0.7633334 |
| 1663 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8681195 |    1.4262787 |     6.1380510 |                   |        1.5871046 |         1.0000000 |       13.5271072 |         7.3890562 |         3.1784458 |          2.8758757 |
| 1664 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1364956 |     3.1415913 |                   |       -3.1415899 |         0.0000000 |        3.1415913 |         0.0000000 |        -1.2999995 |          0.0000000 |
| 1665 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   11.2153387 |    45.2201538 |                   |      -45.2201538 |         0.0000000 |       12.3199091 |         0.0000000 |        -9.4049244 |          0.0000000 |
| 1666 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0252446 |   12.0230207 |   118.1275406 |                   |      -60.3547630 |       -58.0000000 |       61.8795013 |        60.0000000 |        -0.1047897 |         -0.0482373 |
| 1667 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0252446 |   12.0230207 |   118.1275406 |                   |      -60.3547630 |       -58.0000000 |       61.8795013 |        60.0000000 |        -0.1047898 |         -0.0482373 |
| 1668 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.3212234 |    0.0141909 |     0.7426779 |                   |        0.0006839 |         0.0052002 |        0.7478781 |         0.0052002 |         0.0175901 |          0.0052002 |
| 1669 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7966667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7966667 |          0.0000000 |
| 1670 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7766666 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7766667 |          0.0000000 |
| 1671 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.3181528 |    0.0141428 |     0.7404379 |                   |        0.0006502 |         0.0052002 |        0.7456381 |         0.0052002 |         0.0174522 |          0.0052002 |
| 1672 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6129581 |  771.4299927 |  1880.0000000 |                   |        0.0000000 |         0.0000000 |     2028.0000000 |      1196.0000000 |       900.0300293 |        598.0000000 |
| 1673 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6122921 |  192.9866638 |   470.0000000 |                   |        0.0000000 |         0.0000000 |      507.0000000 |       299.0000000 |       224.8133392 |        149.5000000 |
| 1674 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0245358 |   10.7064037 |   118.1330566 |                   |      -60.3722343 |       -58.0000000 |       61.8794670 |        60.0000000 |        -0.6416630 |         -0.5678788 |
| 1675 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0040287 |     0.0653028 |                   |       -0.0186738 |         0.0000000 |        0.0653028 |         0.0000000 |        -0.0003982 |          0.0000000 |
| 1676 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7703615 |    1.6170998 |     2.2589395 |                   |       -1.2589395 |         0.0000000 |       -0.8600102 |         1.0000000 |        -1.0437665 |          0.5733333 |
| 1677 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1378667 |     3.1415918 |                   |       -3.1415918 |         0.0000000 |        3.1415863 |         0.0000000 |        -1.6543798 |          0.0000000 |
| 1678 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0313438 |   26.5463753 |   118.1330566 |                   |      -60.3722343 |       -58.0000000 |       61.8794670 |        60.0000000 |         6.9802151 |         -3.0366666 |
| 1679 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7873518 |    0.5561910 |     1.7477413 |                   |        0.4620042 |         0.0000000 |        2.6093724 |         2.0000000 |         1.0256125 |          0.7633334 |
| 1680 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8686769 |    1.4233054 |     6.2014618 |                   |        1.5872520 |         1.0000000 |       13.5905180 |         7.3890562 |         3.1784170 |          2.8758757 |
| 1681 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1378667 |     3.1415918 |                   |       -3.1415918 |         0.0000000 |        3.1415863 |         0.0000000 |        -1.6543798 |          0.0000000 |
| 1682 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   11.6138706 |    45.2323608 |                   |      -45.2323608 |         0.0000000 |       12.3443832 |         0.0000000 |       -10.0105362 |          0.0000000 |
| 1683 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0094740 |   12.1888514 |   118.1330566 |                   |      -60.3722343 |       -58.0000000 |       61.8794670 |        60.0000000 |        -0.1210095 |         -0.0482373 |
| 1684 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0094740 |   12.1888514 |   118.1330566 |                   |      -60.3722343 |       -58.0000000 |       61.8794670 |        60.0000000 |        -0.1210095 |         -0.0482373 |
| 1685 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.3181528 |    0.0141428 |     0.7404379 |                   |        0.0006502 |         0.0052002 |        0.7456381 |         0.0052002 |         0.0174522 |          0.0052002 |
| 1686 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7766666 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7766666 |          0.0000000 |
| 1687 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.9733334 |          0.0000000 |
| 1688 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.2680261 |    0.0103707 |     0.6784487 |                   |        0.0006062 |         0.0052002 |        0.6836489 |         0.0052002 |         0.0137603 |          0.0052002 |
| 1689 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6990886 |  771.0266724 |  1988.0000000 |                   |        0.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |      1050.5733643 |        598.0000000 |
| 1690 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6984978 |  192.7733307 |   497.0000000 |                   |        0.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       262.3999939 |        149.5000000 |
| 1691 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0172316 |   12.8917027 |   116.5374680 |                   |      -60.3900108 |       -58.0000000 |       61.6465416 |        60.0000000 |        -1.8974147 |         -4.2527275 |
| 1692 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0047264 |     0.0698710 |                   |       -0.0157694 |         0.0000000 |        0.0698710 |         0.0000000 |         0.0024341 |          0.0000000 |
| 1693 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7632113 |    1.5988427 |     2.3147001 |                   |       -1.3147001 |         0.0000000 |       -0.8297150 |         1.0000000 |        -1.0255097 |          0.5733333 |
| 1694 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1370215 |     3.1415839 |                   |       -3.1415775 |         0.0000000 |        3.1415839 |         0.0000000 |        -1.1334274 |          0.0000000 |
| 1695 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0220792 |   34.9634018 |   116.5374680 |                   |      -60.3900108 |       -58.0000000 |       61.6465416 |        60.0000000 |         1.6766676 |        -16.5477791 |
| 1696 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7755927 |    0.5735866 |     2.6489737 |                   |        0.4392934 |         0.0000000 |        2.6489737 |         2.0000000 |         1.0253844 |          0.7633334 |
| 1697 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8427400 |    1.5052629 |    13.1395197 |                   |        1.5516105 |         1.0000000 |       14.1395197 |         7.3890562 |         3.2223995 |          2.8758757 |
| 1698 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1370215 |     3.1415839 |                   |       -3.1415775 |         0.0000000 |        3.1415839 |         0.0000000 |        -1.1334274 |          0.0000000 |
| 1699 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   11.1980686 |    40.8516159 |                   |      -40.8516159 |         0.0000000 |       14.8867445 |         0.0000000 |        -9.3182144 |          0.0000000 |
| 1700 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0072795 |   14.6137209 |   116.5374680 |                   |      -60.3900108 |       -58.0000000 |       61.6465416 |        60.0000000 |        -1.4390868 |         -4.1015701 |
| 1701 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0072795 |   14.6137209 |   116.5374680 |                   |      -60.3900108 |       -58.0000000 |       61.6465416 |        60.0000000 |        -1.4390868 |         -4.1015711 |
| 1702 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.2680261 |    0.0103707 |     0.6784487 |                   |        0.0006062 |         0.0052002 |        0.6836489 |         0.0052002 |         0.0137603 |          0.0052002 |
| 1703 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.9733334 |          0.0000000 |
| 1704 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8900000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8900000 |          0.0000000 |
| 1705 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.5034819 |    0.0352184 |     0.7595077 |                   |        0.0030732 |         0.0052002 |        0.7647079 |         0.0052002 |         0.0403952 |          0.0052002 |
| 1706 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8257926 |  734.4566650 |  1948.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |      1281.6234131 |        598.0000000 |
| 1707 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8254972 |  183.4966736 |   487.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       320.1833496 |        149.5000000 |
| 1708 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0461158 |    8.9256220 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -2.2652466 |          1.3715152 |
| 1709 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0010053 |     0.0787874 |                   |       -0.0279753 |         0.0000000 |        0.0787874 |         0.0000000 |         0.0002803 |          0.0000000 |
| 1710 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7565420 |    1.5722388 |     2.0020328 |                   |       -1.0020328 |         0.0000000 |       -0.8249742 |         1.0000000 |        -0.9989055 |          0.5733333 |
| 1711 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1405218 |     3.1415746 |                   |       -3.1415746 |         0.0000000 |        3.1415720 |         0.0000000 |         0.5232612 |          0.0000000 |
| 1712 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0570010 |   21.2927151 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -0.6124920 |          4.0744448 |
| 1713 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7625993 |    0.6259766 |     2.6107976 |                   |        0.4520407 |         0.0000000 |        2.6107976 |         2.0000000 |         1.0018735 |          0.7633334 |
| 1714 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8236331 |    1.7123199 |    12.6099024 |                   |        1.5715159 |         1.0000000 |       13.6099024 |         7.3890562 |         3.2226293 |          2.8758757 |
| 1715 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1405218 |     3.1415746 |                   |       -3.1415746 |         0.0000000 |        3.1415720 |         0.0000000 |         0.5232612 |          0.0000000 |
| 1716 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   10.2841749 |    43.9282455 |                   |      -43.9282455 |         0.0000000 |       11.7817955 |         0.0000000 |        -8.3624105 |          0.0000000 |
| 1717 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0274685 |   10.3008146 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -1.6733559 |          2.0850961 |
| 1718 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0274685 |   10.3008146 |   119.1311722 |                   |      -59.5374260 |       -58.0000000 |       67.1256485 |        60.0000000 |        -1.6733559 |          2.0850961 |
| 1719 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.5034819 |    0.0352184 |     0.7595077 |                   |        0.0030732 |         0.0052002 |        0.7647079 |         0.0052002 |         0.0403952 |          0.0052002 |
| 1720 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8900000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8900000 |          0.0000000 |
| 1721 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8100000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8100000 |          0.0000000 |
| 1722 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7187200 |    0.0311430 |     0.4002290 |                   |        0.0049057 |         0.0052002 |        0.4054292 |         0.0052002 |         0.0363412 |          0.0052002 |
| 1723 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8275746 |  733.4166870 |  1944.0000000 |                   |      512.0000000 |         0.0000000 |     2031.0000000 |      1196.0000000 |      1286.5833740 |        598.0000000 |
| 1724 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8272923 |  183.2500000 |   486.0000000 |                   |      128.0000000 |         0.0000000 |      507.0000000 |       299.0000000 |       321.4433289 |        149.5000000 |
| 1725 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0446537 |    8.8889227 |   119.1058655 |                   |      -59.4715691 |       -58.0000000 |       66.4969101 |        60.0000000 |        -2.3737054 |          1.3715152 |
| 1726 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0006909 |     0.0093084 |                   |       -0.0053347 |         0.0000000 |        0.0093084 |         0.0000000 |         0.0000579 |          0.0000000 |
| 1727 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7561354 |    1.5717725 |     2.0023100 |                   |       -1.0023100 |         0.0000000 |       -0.5993444 |         1.0000000 |        -0.9984391 |          0.5733333 |
| 1728 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1408961 |     3.1415837 |                   |       -3.1415837 |         0.0000000 |        3.1415789 |         0.0000000 |         0.1884425 |          0.0000000 |
| 1729 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0565090 |   20.7446461 |   119.1058655 |                   |      -59.4715691 |       -58.0000000 |       66.4969101 |        60.0000000 |        -0.6146175 |          4.0744448 |
| 1730 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7639625 |    0.6236274 |     2.5862415 |                   |        0.4433556 |         0.0000000 |        2.5862415 |         2.0000000 |         0.9965397 |          0.7633334 |
| 1731 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8264950 |    1.6971831 |    12.2797661 |                   |        1.5579262 |         1.0000000 |       13.2797661 |         7.3890562 |         3.2010901 |          2.8758757 |
| 1732 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1408961 |     3.1415837 |                   |       -3.1415837 |         0.0000000 |        3.1415789 |         0.0000000 |         0.1884425 |          0.0000000 |
| 1733 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   10.7002869 |    45.8962822 |                   |      -45.8962822 |         0.0000000 |       13.9206505 |         0.0000000 |        -8.7527151 |          0.0000000 |
| 1734 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0258658 |   10.2567244 |   119.1058655 |                   |      -59.4715691 |       -58.0000000 |       66.4969101 |        60.0000000 |        -1.8310283 |          2.0850961 |
| 1735 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0258658 |   10.2567244 |   119.1058655 |                   |      -59.4715691 |       -58.0000000 |       66.4969101 |        60.0000000 |        -1.8310285 |          2.0850961 |
| 1736 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7187200 |    0.0311430 |     0.4002290 |                   |        0.0049057 |         0.0052002 |        0.4054292 |         0.0052002 |         0.0363412 |          0.0052002 |
| 1737 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8100000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8100000 |          0.0000000 |
| 1738 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.9033333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.9033334 |          0.0000000 |
| 1739 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.2583551 |    0.0109310 |     0.8415903 |                   |        0.0006184 |         0.0052002 |        0.8467905 |         0.0052002 |         0.0143976 |          0.0052002 |
| 1740 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6193128 |  760.7299805 |  1924.0000000 |                   |        0.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |       896.4500122 |        598.0000000 |
| 1741 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6186708 |  190.2533264 |   481.0000000 |                   |        0.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       223.8866730 |        149.5000000 |
| 1742 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.0142794 |   12.2138853 |   119.0915375 |                   |      -59.6982803 |       -58.0000000 |       63.6228218 |        60.0000000 |        -0.7193201 |         -2.7400000 |
| 1743 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0057304 |     0.0862067 |                   |       -0.0333804 |         0.0000000 |        0.0862067 |         0.0000000 |        -0.0016220 |          0.0000000 |
| 1744 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7663790 |    1.6086284 |     2.1754405 |                   |       -1.1754405 |         0.0000000 |       -0.9015458 |         1.0000000 |        -1.0352950 |          0.5733333 |
| 1745 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1362810 |     3.1415834 |                   |       -3.1415811 |         0.0000000 |        3.1415834 |         0.0000000 |        -0.9831053 |          0.0000000 |
| 1746 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0166379 |   31.6857166 |   119.0915375 |                   |      -59.6982803 |       -58.0000000 |       63.6228218 |        60.0000000 |         7.2304225 |        -11.0011110 |
| 1747 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7878176 |    0.5532727 |     2.6140122 |                   |        0.4587088 |         0.0000000 |        2.6676037 |         2.0000000 |         1.0488545 |          0.7633334 |
| 1748 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8644053 |    1.4197640 |    12.6537228 |                   |        1.5820298 |         1.0000000 |       14.4054089 |         7.3890562 |         3.2749057 |          2.8758757 |
| 1749 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1362810 |     3.1415834 |                   |       -3.1415811 |         0.0000000 |        3.1415834 |         0.0000000 |        -0.9831053 |          0.0000000 |
| 1750 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   12.0071344 |    43.6997070 |                   |      -43.6997070 |         0.0000000 |       12.0271435 |         0.0000000 |       -10.5711451 |          0.0000000 |
| 1751 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.0251879 |   13.8474131 |   119.0915375 |                   |      -59.6982803 |       -58.0000000 |       63.6228218 |        60.0000000 |        -0.1180556 |         -2.4375706 |
| 1752 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.0251879 |   13.8474131 |   119.0915375 |                   |      -59.6982803 |       -58.0000000 |       63.6228218 |        60.0000000 |        -0.1180556 |         -2.4375706 |
| 1753 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.2583551 |    0.0109310 |     0.8415903 |                   |        0.0006184 |         0.0052002 |        0.8467905 |         0.0052002 |         0.0143976 |          0.0052002 |
| 1754 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.9033333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.9033333 |          0.0000000 |
| 1755 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7766666 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7766667 |          0.0000000 |
| 1756 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.3159614 |    0.0141656 |     0.7412657 |                   |        0.0006425 |         0.0052002 |        0.7464659 |         0.0052002 |         0.0174610 |          0.0052002 |
| 1757 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.5978138 |  777.2833252 |  1884.0000000 |                   |        0.0000000 |         0.0000000 |     2028.0000000 |      1196.0000000 |       887.6033325 |        598.0000000 |
| 1758 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.5971363 |  194.4533386 |   471.0000000 |                   |        0.0000000 |         0.0000000 |      507.0000000 |       299.0000000 |       221.7066650 |        149.5000000 |
| 1759 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0181583 |   10.6566143 |   119.1386108 |                   |      -60.3715210 |       -58.0000000 |       61.8770752 |        60.0000000 |        -0.6149433 |         -0.5678788 |
| 1760 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0046199 |     0.0635831 |                   |       -0.0184119 |         0.0000000 |        0.0635831 |         0.0000000 |         0.0001636 |          0.0000000 |
| 1761 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7711105 |    1.6175402 |     2.2614880 |                   |       -1.2614881 |         0.0000000 |       -0.8562430 |         1.0000000 |        -1.0442067 |          0.5733333 |
| 1762 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1373041 |     3.1415923 |                   |       -3.1415923 |         0.0000000 |        3.1415679 |         0.0000000 |        -1.5083003 |          0.0000000 |
| 1763 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0234833 |   26.4655838 |   119.1386108 |                   |      -60.3715210 |       -58.0000000 |       61.8770752 |        60.0000000 |         6.9123650 |         -3.0366666 |
| 1764 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7895485 |    0.5518292 |     1.7486444 |                   |        0.4621809 |         0.0000000 |        2.6146863 |         2.0000000 |         1.0302788 |          0.7633334 |
| 1765 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8702985 |    1.4116876 |     6.2738724 |                   |        1.5875324 |         1.0000000 |       13.6629286 |         7.3890562 |         3.1937203 |          2.8758757 |
| 1766 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1373041 |     3.1415923 |                   |       -3.1415923 |         0.0000000 |        3.1415679 |         0.0000000 |        -1.5083003 |          0.0000000 |
| 1767 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   11.5161200 |    45.2251816 |                   |      -45.2251816 |         0.0000000 |       12.2074537 |         0.0000000 |        -9.8494205 |          0.0000000 |
| 1768 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0030866 |   12.1317472 |   119.1386108 |                   |      -60.3715210 |       -58.0000000 |       61.8770752 |        60.0000000 |        -0.0738309 |         -0.0482373 |
| 1769 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0030866 |   12.1317472 |   119.1386108 |                   |      -60.3715210 |       -58.0000000 |       61.8770752 |        60.0000000 |        -0.0738309 |         -0.0482373 |
| 1770 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.3159614 |    0.0141656 |     0.7412657 |                   |        0.0006425 |         0.0052002 |        0.7464659 |         0.0052002 |         0.0174610 |          0.0052002 |
| 1771 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7766666 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7766666 |          0.0000000 |
| 1772 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7966667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7966667 |          0.0000000 |
| 1773 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.3212234 |    0.0141909 |     0.7426779 |                   |        0.0006839 |         0.0052002 |        0.7478781 |         0.0052002 |         0.0175901 |          0.0052002 |
| 1774 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6106890 |  757.2433472 |  1888.0000000 |                   |        0.0000000 |         0.0000000 |     2028.0000000 |      1196.0000000 |       882.3699951 |        598.0000000 |
| 1775 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6100029 |  189.4533386 |   472.0000000 |                   |        0.0000000 |         0.0000000 |      507.0000000 |       299.0000000 |       220.3933411 |        149.5000000 |
| 1776 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0408274 |   10.5557957 |   118.1275406 |                   |      -60.3547630 |       -58.0000000 |       61.8795013 |        60.0000000 |        -0.6580349 |         -0.5678788 |
| 1777 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0054417 |     0.0679284 |                   |       -0.0185368 |         0.0000000 |        0.0679284 |         0.0000000 |         0.0013517 |          0.0000000 |
| 1778 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7681537 |    1.6161752 |     2.2575884 |                   |       -1.2575883 |         0.0000000 |       -0.8602105 |         1.0000000 |        -1.0428418 |          0.5733333 |
| 1779 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1364956 |     3.1415913 |                   |       -3.1415899 |         0.0000000 |        3.1415913 |         0.0000000 |        -1.2999995 |          0.0000000 |
| 1780 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0511713 |   26.3896217 |   118.1275406 |                   |      -60.3547630 |       -58.0000000 |       61.8795013 |        60.0000000 |         6.3105111 |         -3.0366666 |
| 1781 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7847802 |    0.5590828 |     1.7487859 |                   |        0.4619113 |         0.0000000 |        2.6046956 |         2.0000000 |         1.0287809 |          0.7633334 |
| 1782 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8681195 |    1.4262787 |     6.1380510 |                   |        1.5871046 |         1.0000000 |       13.5271072 |         7.3890562 |         3.1784458 |          2.8758757 |
| 1783 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1364956 |     3.1415913 |                   |       -3.1415899 |         0.0000000 |        3.1415913 |         0.0000000 |        -1.2999995 |          0.0000000 |
| 1784 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   11.2153387 |    45.2201538 |                   |      -45.2201538 |         0.0000000 |       12.3199091 |         0.0000000 |        -9.4049244 |          0.0000000 |
| 1785 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0252446 |   12.0230207 |   118.1275406 |                   |      -60.3547630 |       -58.0000000 |       61.8795013 |        60.0000000 |        -0.1047897 |         -0.0482373 |
| 1786 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0252446 |   12.0230207 |   118.1275406 |                   |      -60.3547630 |       -58.0000000 |       61.8795013 |        60.0000000 |        -0.1047898 |         -0.0482373 |
| 1787 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.3212234 |    0.0141909 |     0.7426779 |                   |        0.0006839 |         0.0052002 |        0.7478781 |         0.0052002 |         0.0175901 |          0.0052002 |
| 1788 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7966667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7966667 |          0.0000000 |
| 1789 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8166667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8166667 |          0.0000000 |
| 1790 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.2841445 |    0.0109805 |     0.6913367 |                   |        0.0006996 |         0.0052002 |        0.6965369 |         0.0052002 |         0.0147048 |          0.0052002 |
| 1791 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6441690 |  779.8166504 |  1976.0000000 |                   |        0.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |       972.8833618 |        598.0000000 |
| 1792 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6435880 |  195.0099945 |   494.0000000 |                   |        0.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       243.0166779 |        149.5000000 |
| 1793 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0378433 |   10.1488543 |   119.6073990 |                   |      -60.0682793 |       -58.0000000 |       61.6401939 |        60.0000000 |        -1.7170496 |         -0.4903030 |
| 1794 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0065538 |     0.0685079 |                   |       -0.0148621 |         0.0000000 |        0.0685079 |         0.0000000 |         0.0046090 |          0.0000000 |
| 1795 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7681618 |    1.6008803 |     2.3038731 |                   |       -1.3038731 |         0.0000000 |       -0.8238392 |         1.0000000 |        -1.0275468 |          0.5733333 |
| 1796 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1352673 |     3.1415923 |                   |       -3.1415837 |         0.0000000 |        3.1415923 |         0.0000000 |        -0.7375513 |          0.0000000 |
| 1797 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0472886 |   25.4988365 |   119.6073990 |                   |      -60.0682793 |       -58.0000000 |       61.6401939 |        60.0000000 |         1.3238236 |         -2.7522223 |
| 1798 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7933744 |    0.5405580 |     2.6629992 |                   |        0.4367511 |         0.0000000 |        2.6629992 |         2.0000000 |         1.0556489 |          0.7633334 |
| 1799 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8579335 |    1.4157959 |    13.3392305 |                   |        1.5476707 |         1.0000000 |       14.3392305 |         7.3890562 |         3.3249435 |          2.8758757 |
| 1800 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1352673 |     3.1415923 |                   |       -3.1415837 |         0.0000000 |        3.1415923 |         0.0000000 |        -0.7375513 |          0.0000000 |
| 1801 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   10.6372547 |    39.7290611 |                   |      -39.7290611 |         0.0000000 |       15.5083370 |         0.0000000 |        -8.3343420 |          0.0000000 |
| 1802 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0200223 |   11.5790939 |   119.6073990 |                   |      -60.0682793 |       -58.0000000 |       61.6401939 |        60.0000000 |        -1.1794274 |          0.0370960 |
| 1803 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0200223 |   11.5790939 |   119.6073990 |                   |      -60.0682793 |       -58.0000000 |       61.6401939 |        60.0000000 |        -1.1794274 |          0.0370961 |
| 1804 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.2841445 |    0.0109805 |     0.6913367 |                   |        0.0006996 |         0.0052002 |        0.6965369 |         0.0052002 |         0.0147048 |          0.0052002 |
| 1805 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8166667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8166667 |          0.0000000 |
| 1806 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8066667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8066667 |          0.0000000 |
| 1807 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.2842749 |    0.0109714 |     0.6900055 |                   |        0.0006891 |         0.0052002 |        0.6952057 |         0.0052002 |         0.0146654 |          0.0052002 |
| 1808 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6422186 |  781.3866577 |  1976.0000000 |                   |        0.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |       971.6066895 |        598.0000000 |
| 1809 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6416451 |  195.3999939 |   494.0000000 |                   |        0.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       242.7000122 |        149.5000000 |
| 1810 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0073582 |   10.1246452 |   119.5963440 |                   |      -60.0705643 |       -58.0000000 |       61.6532326 |        60.0000000 |        -1.6934311 |         -0.4903030 |
| 1811 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0065909 |     0.0687780 |                   |       -0.0148455 |         0.0000000 |        0.0687780 |         0.0000000 |         0.0046301 |          0.0000000 |
| 1812 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7683625 |    1.6013991 |     2.3045554 |                   |       -1.3045553 |         0.0000000 |       -0.8256112 |         1.0000000 |        -1.0280658 |          0.5733333 |
| 1813 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1352348 |     3.1415925 |                   |       -3.1415925 |         0.0000000 |        3.1415515 |         0.0000000 |        -0.7375743 |          0.0000000 |
| 1814 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0103451 |   25.4056473 |   119.5963440 |                   |      -60.0705643 |       -58.0000000 |       61.6532326 |        60.0000000 |         1.4118348 |         -2.7522223 |
| 1815 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7929376 |    0.5419698 |     2.6638124 |                   |        0.4363401 |         0.0000000 |        2.6638124 |         2.0000000 |         1.0550030 |          0.7633334 |
| 1816 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8573358 |    1.4218297 |    13.3508968 |                   |        1.5470350 |         1.0000000 |       14.3508968 |         7.3890562 |         3.3229380 |          2.8758757 |
| 1817 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1352348 |     3.1415925 |                   |       -3.1415925 |         0.0000000 |        3.1415515 |         0.0000000 |        -0.7375743 |          0.0000000 |
| 1818 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   10.6400843 |    39.6987457 |                   |      -39.6987457 |         0.0000000 |       15.4877710 |         0.0000000 |        -8.3349390 |          0.0000000 |
| 1819 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.0097719 |   11.5537920 |   119.5963440 |                   |      -60.0705643 |       -58.0000000 |       61.6532326 |        60.0000000 |        -1.1538075 |          0.0370960 |
| 1820 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.0097719 |   11.5537920 |   119.5963440 |                   |      -60.0705643 |       -58.0000000 |       61.6532326 |        60.0000000 |        -1.1538074 |          0.0370961 |
| 1821 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.2842749 |    0.0109714 |     0.6900055 |                   |        0.0006891 |         0.0052002 |        0.6952057 |         0.0052002 |         0.0146654 |          0.0052002 |
| 1822 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8066667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8066667 |          0.0000000 |
| 1823 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8100000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8100000 |          0.0000000 |
| 1824 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7187200 |    0.0311430 |     0.4002290 |                   |        0.0049057 |         0.0052002 |        0.4054292 |         0.0052002 |         0.0363412 |          0.0052002 |
| 1825 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8275746 |  733.4166870 |  1944.0000000 |                   |      512.0000000 |         0.0000000 |     2031.0000000 |      1196.0000000 |      1286.5833740 |        598.0000000 |
| 1826 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8272923 |  183.2500000 |   486.0000000 |                   |      128.0000000 |         0.0000000 |      507.0000000 |       299.0000000 |       321.4433289 |        149.5000000 |
| 1827 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0446537 |    8.8889227 |   119.1058655 |                   |      -59.4715691 |       -58.0000000 |       66.4969101 |        60.0000000 |        -2.3737054 |          1.3715152 |
| 1828 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0006909 |     0.0093084 |                   |       -0.0053347 |         0.0000000 |        0.0093084 |         0.0000000 |         0.0000579 |          0.0000000 |
| 1829 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7561354 |    1.5717725 |     2.0023100 |                   |       -1.0023100 |         0.0000000 |       -0.5993444 |         1.0000000 |        -0.9984391 |          0.5733333 |
| 1830 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1408961 |     3.1415837 |                   |       -3.1415837 |         0.0000000 |        3.1415789 |         0.0000000 |         0.1884425 |          0.0000000 |
| 1831 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0565090 |   20.7446461 |   119.1058655 |                   |      -59.4715691 |       -58.0000000 |       66.4969101 |        60.0000000 |        -0.6146175 |          4.0744448 |
| 1832 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7639625 |    0.6236274 |     2.5862415 |                   |        0.4433556 |         0.0000000 |        2.5862415 |         2.0000000 |         0.9965397 |          0.7633334 |
| 1833 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8264950 |    1.6971831 |    12.2797661 |                   |        1.5579262 |         1.0000000 |       13.2797661 |         7.3890562 |         3.2010901 |          2.8758757 |
| 1834 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1408961 |     3.1415837 |                   |       -3.1415837 |         0.0000000 |        3.1415789 |         0.0000000 |         0.1884425 |          0.0000000 |
| 1835 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   10.7002869 |    45.8962822 |                   |      -45.8962822 |         0.0000000 |       13.9206505 |         0.0000000 |        -8.7527151 |          0.0000000 |
| 1836 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0258658 |   10.2567244 |   119.1058655 |                   |      -59.4715691 |       -58.0000000 |       66.4969101 |        60.0000000 |        -1.8310283 |          2.0850961 |
| 1837 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0258658 |   10.2567244 |   119.1058655 |                   |      -59.4715691 |       -58.0000000 |       66.4969101 |        60.0000000 |        -1.8310285 |          2.0850961 |
| 1838 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7187200 |    0.0311430 |     0.4002290 |                   |        0.0049057 |         0.0052002 |        0.4054292 |         0.0052002 |         0.0363412 |          0.0052002 |
| 1839 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8100000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8100000 |          0.0000000 |
| 1840 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7533333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7533333 |          0.0000000 |
| 1841 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.3188899 |    0.0201893 |     0.7868649 |                   |        0.0006228 |         0.0052002 |        0.7920651 |         0.0052002 |         0.0241180 |          0.0052002 |
| 1842 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8058329 |  739.6466675 |  1872.0000000 |                   |        4.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |      1230.9000244 |        598.0000000 |
| 1843 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8055946 |  184.8233337 |   468.0000000 |                   |        1.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       307.5366821 |        149.5000000 |
| 1844 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0683677 |   10.8991346 |   120.1219254 |                   |      -60.1887398 |       -58.0000000 |       66.7582779 |        60.0000000 |        -1.1436102 |         -0.5290909 |
| 1845 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0014830 |     0.0583318 |                   |       -0.0307194 |         0.0000000 |        0.0583318 |         0.0000000 |         0.0002422 |          0.0000000 |
| 1846 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7556946 |    1.5726424 |     2.0588670 |                   |       -1.0588671 |         0.0000000 |       -0.7628196 |         1.0000000 |        -0.9993091 |          0.5733333 |
| 1847 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1400542 |     3.1415927 |                   |       -3.1415923 |         0.0000000 |        3.1415927 |         0.0000000 |        -0.3773223 |          0.0000000 |
| 1848 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0893003 |   25.8814049 |   120.1219254 |                   |      -60.1887398 |       -58.0000000 |       66.7582779 |        60.0000000 |         6.7212510 |         -2.8944445 |
| 1849 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7465675 |    0.6331629 |     1.9414177 |                   |        0.4476525 |         0.0000000 |        2.1373701 |         2.0000000 |         0.9191101 |          0.7633334 |
| 1850 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8456253 |    1.6167225 |     5.9686236 |                   |        1.5646348 |         1.0000000 |        8.4771147 |         7.3890562 |         2.8348227 |          2.8758757 |
| 1851 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1400542 |     3.1415927 |                   |       -3.1415923 |         0.0000000 |        3.1415927 |         0.0000000 |        -0.3773223 |          0.0000000 |
| 1852 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   12.9242229 |    44.0233917 |                   |      -44.0233917 |         0.0000000 |       12.1562881 |         0.0000000 |       -11.5005760 |          0.0000000 |
| 1853 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0546732 |   12.4407101 |   120.1219254 |                   |      -60.1887398 |       -58.0000000 |       66.7582779 |        60.0000000 |        -0.6210829 |         -0.0055706 |
| 1854 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0546732 |   12.4407101 |   120.1219254 |                   |      -60.1887398 |       -58.0000000 |       66.7582779 |        60.0000000 |        -0.6210830 |         -0.0055706 |
| 1855 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.3188899 |    0.0201893 |     0.7868649 |                   |        0.0006228 |         0.0052002 |        0.7920651 |         0.0052002 |         0.0241180 |          0.0052002 |
| 1856 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7533333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7533333 |          0.0000000 |
| 1857 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7066666 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7066667 |          0.0000000 |
| 1858 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.4731478 |    0.0150905 |     0.3183137 |                   |        0.0015417 |         0.0052002 |        0.3235140 |         0.0052002 |         0.0189348 |          0.0052002 |
| 1859 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.5502974 |  828.2199707 |  1844.0000000 |                   |        0.0000000 |         0.0000000 |     2028.0000000 |      1196.0000000 |       896.8266602 |        598.0000000 |
| 1860 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.5496368 |  207.1766663 |   461.0000000 |                   |        0.0000000 |         0.0000000 |      507.0000000 |       299.0000000 |       224.0299988 |        149.5000000 |
| 1861 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0261537 |   12.9837036 |   119.1261597 |                   |      -59.9743538 |       -58.0000000 |       66.1661148 |        60.0000000 |         0.1157178 |         -2.7400000 |
| 1862 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0095206 |     0.0642429 |                   |       -0.0402521 |         0.0000000 |        0.0642429 |         0.0000000 |        -0.0042980 |          0.0000000 |
| 1863 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7724742 |    1.6133493 |     2.2805219 |                   |       -1.2805220 |         0.0000000 |       -0.8972044 |         1.0000000 |        -1.0400161 |          0.5733333 |
| 1864 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1329558 |     3.1415901 |                   |       -3.1415901 |         0.0000000 |        3.1415870 |         0.0000000 |        -1.2531065 |          0.0000000 |
| 1865 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0329333 |   33.6670609 |   119.1261597 |                   |      -59.9743538 |       -58.0000000 |       66.1661148 |        60.0000000 |        11.1798487 |        -11.0011110 |
| 1866 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7900258 |    0.5418662 |     1.7380401 |                   |        0.4442392 |         0.0000000 |        2.6017427 |         2.0000000 |         1.0019368 |          0.7633334 |
| 1867 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8752584 |    1.3686107 |     6.0981665 |                   |        1.5593034 |         1.0000000 |       13.4872227 |         7.3890562 |         3.0763197 |          2.8758757 |
| 1868 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1329558 |     3.1415901 |                   |       -3.1415901 |         0.0000000 |        3.1415870 |         0.0000000 |        -1.2531065 |          0.0000000 |
| 1869 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   12.8570337 |    46.0273209 |                   |      -46.0273209 |         0.0000000 |       12.6311359 |         0.0000000 |       -11.4093819 |          0.0000000 |
| 1870 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0161899 |   14.6811066 |   119.1261597 |                   |      -59.9743538 |       -58.0000000 |       66.1661148 |        60.0000000 |         0.7287253 |         -2.4375706 |
| 1871 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0161899 |   14.6811066 |   119.1261597 |                   |      -59.9743538 |       -58.0000000 |       66.1661148 |        60.0000000 |         0.7287252 |         -2.4375706 |
| 1872 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.4731478 |    0.0150905 |     0.3183137 |                   |        0.0015417 |         0.0052002 |        0.3235140 |         0.0052002 |         0.0189348 |          0.0052002 |
| 1873 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7066666 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7066666 |          0.0000000 |
| 1874 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    1.0700001 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         1.0700001 |          0.0000000 |
| 1875 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.5082909 |    0.0253783 |     0.8508317 |                   |        0.0041300 |         0.0052002 |        0.8560320 |         0.0052002 |         0.0305661 |          0.0052002 |
| 1876 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8261675 |  729.8366699 |  1984.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |      1282.8433838 |        598.0000000 |
| 1877 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8258363 |  182.2966614 |   496.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       320.4433289 |        149.5000000 |
| 1878 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0945992 |    8.4654493 |   120.1969910 |                   |      -59.6016731 |       -58.0000000 |       62.7153206 |        60.0000000 |        -3.1246965 |          1.3715152 |
| 1879 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0008974 |     0.0245259 |                   |       -0.0055305 |         0.0000000 |        0.0245259 |         0.0000000 |         0.0002128 |          0.0000000 |
| 1880 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7567794 |    1.5728360 |     2.0029910 |                   |       -1.0029910 |         0.0000000 |       -0.9172935 |         1.0000000 |        -0.9995028 |          0.5733333 |
| 1881 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1406891 |     3.1415896 |                   |       -3.1415722 |         0.0000000 |        3.1415896 |         0.0000000 |         0.5233802 |          0.0000000 |
| 1882 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.1151643 |   20.5827618 |   120.1969910 |                   |      -59.6016731 |       -58.0000000 |       62.7153206 |        60.0000000 |        -5.2018447 |          4.0744448 |
| 1883 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7621130 |    0.6240299 |     2.5864871 |                   |        0.4525006 |         0.0000000 |        2.5864871 |         2.0000000 |         1.0083934 |          0.7633334 |
| 1884 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8207007 |    1.7108473 |    12.2830276 |                   |        1.5722388 |         1.0000000 |       13.2830276 |         7.3890562 |         3.2438955 |          2.8758757 |
| 1885 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1406891 |     3.1415896 |                   |       -3.1415722 |         0.0000000 |        3.1415896 |         0.0000000 |         0.5233802 |          0.0000000 |
| 1886 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |    9.3086109 |    43.2268257 |                   |      -43.2268257 |         0.0000000 |       12.0594921 |         0.0000000 |        -6.9306722 |          0.0000000 |
| 1887 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0732280 |    9.7947359 |   120.1969910 |                   |      -59.6016731 |       -58.0000000 |       62.7153206 |        60.0000000 |        -2.6142483 |          2.0850961 |
| 1888 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0732280 |    9.7947359 |   120.1969910 |                   |      -59.6016731 |       -58.0000000 |       62.7153206 |        60.0000000 |        -2.6142483 |          2.0850961 |
| 1889 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.5082909 |    0.0253783 |     0.8508317 |                   |        0.0041300 |         0.0052002 |        0.8560320 |         0.0052002 |         0.0305661 |          0.0052002 |
| 1890 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    1.0700001 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         1.0700001 |          0.0000000 |
| 1891 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.9133334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.9133334 |          0.0000000 |
| 1892 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.2576785 |    0.0107807 |     0.8409446 |                   |        0.0006202 |         0.0052002 |        0.8461449 |         0.0052002 |         0.0142559 |          0.0052002 |
| 1893 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6208025 |  758.3666382 |  1924.0000000 |                   |        0.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |       895.5933838 |        598.0000000 |
| 1894 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6201696 |  189.6566620 |   481.0000000 |                   |        0.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       223.6699982 |        149.5000000 |
| 1895 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.0021639 |   12.2700415 |   117.0015411 |                   |      -59.7236519 |       -58.0000000 |       63.6318588 |        60.0000000 |        -0.6912781 |         -2.7400000 |
| 1896 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0055357 |     0.0884010 |                   |       -0.0332674 |         0.0000000 |        0.0884010 |         0.0000000 |        -0.0018556 |          0.0000000 |
| 1897 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7658073 |    1.6094422 |     2.1817756 |                   |       -1.1817756 |         0.0000000 |       -0.8996699 |         1.0000000 |        -1.0361091 |          0.5733333 |
| 1898 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1364739 |     3.1415894 |                   |       -3.1415851 |         0.0000000 |        3.1415894 |         0.0000000 |        -0.9828914 |          0.0000000 |
| 1899 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0017296 |   31.8511486 |   117.0015411 |                   |      -59.7236519 |       -58.0000000 |       63.6318588 |        60.0000000 |         7.4461370 |        -11.0011110 |
| 1900 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7843000 |    0.5596606 |     2.6132312 |                   |        0.4584962 |         0.0000000 |        2.6683385 |         2.0000000 |         1.0441402 |          0.7633334 |
| 1901 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8617991 |    1.4362658 |    12.6430635 |                   |        1.5816936 |         1.0000000 |       14.4159975 |         7.3890562 |         3.2571695 |          2.8758757 |
| 1902 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1364739 |     3.1415894 |                   |       -3.1415851 |         0.0000000 |        3.1415894 |         0.0000000 |        -0.9828914 |          0.0000000 |
| 1903 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   12.0410156 |    43.6960030 |                   |      -43.6960030 |         0.0000000 |       12.0883141 |         0.0000000 |       -10.6789761 |          0.0000000 |
| 1904 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.0131111 |   13.9121761 |   117.0015411 |                   |      -59.7236519 |       -58.0000000 |       63.6318588 |        60.0000000 |        -0.0909899 |         -2.4375706 |
| 1905 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.0131111 |   13.9121761 |   117.0015411 |                   |      -59.7236519 |       -58.0000000 |       63.6318588 |        60.0000000 |        -0.0909899 |         -2.4375706 |
| 1906 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.2576785 |    0.0107807 |     0.8409446 |                   |        0.0006202 |         0.0052002 |        0.8461449 |         0.0052002 |         0.0142559 |          0.0052002 |
| 1907 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.9133334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.9133334 |          0.0000000 |
| 1908 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7533333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7533333 |          0.0000000 |
| 1909 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.3191804 |    0.0201882 |     0.7868917 |                   |        0.0006217 |         0.0052002 |        0.7920919 |         0.0052002 |         0.0241184 |          0.0052002 |
| 1910 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8058188 |  739.5933228 |  1872.0000000 |                   |        4.0000000 |         0.0000000 |     2040.0000000 |      1196.0000000 |      1230.8734131 |        598.0000000 |
| 1911 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8055807 |  184.8099976 |   468.0000000 |                   |        1.0000000 |         0.0000000 |      510.0000000 |       299.0000000 |       307.5299988 |        149.5000000 |
| 1912 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0342308 |   10.7285910 |   120.1816254 |                   |      -60.1816254 |       -58.0000000 |       66.7594757 |        60.0000000 |        -1.1433889 |         -0.5290909 |
| 1913 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0014837 |     0.0586022 |                   |       -0.0305446 |         0.0000000 |        0.0586022 |         0.0000000 |         0.0002460 |          0.0000000 |
| 1914 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7554221 |    1.5726308 |     2.0036588 |                   |       -1.0585845 |         0.0000000 |       -0.7613889 |         1.0000000 |        -0.9992975 |          0.5733333 |
| 1915 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1400523 |     3.1415927 |                   |       -3.1415904 |         0.0000000 |        3.1415927 |         0.0000000 |        -0.3354391 |          0.0000000 |
| 1916 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0452667 |   25.2532024 |   120.1816254 |                   |      -60.1816254 |       -58.0000000 |       66.7594757 |        60.0000000 |         6.7220221 |         -2.8944445 |
| 1917 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7446532 |    0.6358966 |     1.9409519 |                   |        0.4476413 |         0.0000000 |        2.1406169 |         2.0000000 |         0.9191504 |          0.7633334 |
| 1918 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8438243 |    1.6258551 |     5.9653788 |                   |        1.5646173 |         1.0000000 |        8.5046825 |         7.3890562 |         2.8350050 |          2.8758757 |
| 1919 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1400523 |     3.1415927 |                   |       -3.1415904 |         0.0000000 |        3.1415927 |         0.0000000 |        -0.3354391 |          0.0000000 |
| 1920 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   12.9243603 |    44.0286751 |                   |      -44.0286751 |         0.0000000 |       12.1543760 |         0.0000000 |       -11.5005808 |          0.0000000 |
| 1921 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0211599 |   12.2550316 |   120.1816254 |                   |      -60.1816254 |       -58.0000000 |       66.7594757 |        60.0000000 |        -0.6166100 |         -0.0055706 |
| 1922 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0211599 |   12.2550316 |   120.1816254 |                   |      -60.1816254 |       -58.0000000 |       66.7594757 |        60.0000000 |        -0.6166101 |         -0.0055706 |
| 1923 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.3191804 |    0.0201882 |     0.7868917 |                   |        0.0006217 |         0.0052002 |        0.7920919 |         0.0052002 |         0.0241184 |          0.0052002 |
| 1924 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.7533333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.7533333 |          0.0000000 |
| 1925 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8000000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8000000 |          0.0000000 |
| 1926 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.3201880 |    0.0141357 |     0.7429392 |                   |        0.0006771 |         0.0052002 |        0.7481394 |         0.0052002 |         0.0175510 |          0.0052002 |
| 1927 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6098674 |  756.0800171 |  1876.0000000 |                   |        0.0000000 |         0.0000000 |     2028.0000000 |      1196.0000000 |       879.9333496 |        598.0000000 |
| 1928 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.6091802 |  189.1633301 |   469.0000000 |                   |        0.0000000 |         0.0000000 |      507.0000000 |       299.0000000 |       219.7833405 |        149.5000000 |
| 1929 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           | -0.0277774 |   10.5149527 |   119.1160431 |                   |      -60.3541679 |       -58.0000000 |       61.8847580 |        60.0000000 |        -0.6984384 |         -0.5678788 |
| 1930 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    0.0051525 |     0.0658745 |                   |       -0.0188213 |         0.0000000 |        0.0658745 |         0.0000000 |         0.0012176 |          0.0000000 |
| 1931 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.7684062 |    1.6151855 |     2.2595618 |                   |       -1.2595618 |         0.0000000 |       -0.8551486 |         1.0000000 |        -1.0418522 |          0.5733333 |
| 1932 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0000000 |    3.1367707 |     3.1415849 |                   |       -3.1415837 |         0.0000000 |        3.1415849 |         0.0000000 |        -1.3417491 |          0.0000000 |
| 1933 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0352364 |   26.2788715 |   119.1160431 |                   |      -60.3541679 |       -58.0000000 |       61.8847580 |        60.0000000 |         6.1297798 |         -3.0366666 |
| 1934 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.7865273 |    0.5577512 |     1.7478665 |                   |        0.4621528 |         0.0000000 |        2.6137877 |         2.0000000 |         1.0310159 |          0.7633334 |
| 1935 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8677894 |    1.4295087 |     6.2616005 |                   |        1.5874878 |         1.0000000 |       13.6506567 |         7.3890562 |         3.1923568 |          2.8758757 |
| 1936 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0000000 |    3.1367707 |     3.1415849 |                   |       -3.1415837 |         0.0000000 |        3.1415849 |         0.0000000 |        -1.3417491 |          0.0000000 |
| 1937 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0000000 |   11.1780872 |    45.2277565 |                   |      -45.2277565 |         0.0000000 |       12.3808136 |         0.0000000 |        -9.3748579 |          0.0000000 |
| 1938 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           | -0.0122824 |   11.9796171 |   119.1160431 |                   |      -60.3541679 |       -58.0000000 |       61.8847580 |        60.0000000 |        -0.1499914 |         -0.0482373 |
| 1939 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           | -0.0122824 |   11.9796171 |   119.1160431 |                   |      -60.3541679 |       -58.0000000 |       61.8847580 |        60.0000000 |        -0.1499915 |         -0.0482373 |
| 1940 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.3201880 |    0.0141357 |     0.7429392 |                   |        0.0006771 |         0.0052002 |        0.7481394 |         0.0052002 |         0.0175510 |          0.0052002 |
| 1941 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0000000 |    0.8000000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         0.0000000 |         0.8000000 |          0.0000000 |
+------+------------------------------------------------+-------------------------------------------------------------------------------+-------------------------------------------------------------------------+-----------------------------------+---------------+-----------+------------+--------------+---------------+-------------------+------------------+-------------------+------------------+-------------------+-------------------+--------------------+
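The per-op comparison columns above can be reproduced with a few tensor reductions. The sketch below is a minimal, hypothetical implementation — the tool's exact definitions are not shown here, so it assumes `Cosine` is flattened cosine similarity, `L1` is the mean absolute difference, and `Atol` is the maximum absolute difference between the base and analyzed feature maps:

```python
import torch
import torch.nn.functional as F


def compare_featuremaps(base: torch.Tensor, analy: torch.Tensor) -> dict:
    """Compute the similarity columns from the table above for one op.

    Assumed metric definitions (not confirmed by the tool's source):
      Cosine - cosine similarity of the flattened tensors
      L1     - mean absolute element-wise difference
      Atol   - maximum absolute element-wise difference
    """
    b = base.flatten().float()
    a = analy.flatten().float()
    diff = (b - a).abs()
    return {
        "Cosine": F.cosine_similarity(b, a, dim=0).item(),
        "L1": diff.mean().item(),
        "Atol": diff.max().item(),
        "base_model_min": b.min().item(),
        "analy_model_min": a.min().item(),
        "base_model_max": b.max().item(),
        "analy_model_max": a.max().item(),
        "base_model_mean": b.mean().item(),
        "analy_model_mean": a.mean().item(),
    }
```

Under these definitions, identical tensors score `Cosine = 1.0` with `L1 = Atol = 0.0`, while orthogonal tensors score `Cosine = 0.0` — matching the pattern in the rows above, where ops whose analyzed output collapses to a constant (e.g. all zeros) show `Cosine` near 0 despite moderate `L1`.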