+------+------------------------------------------------+-------------------------------------------------------------------------------+-------------------------------------------------------------------------+-----------------------------------+---------------+-----------+------------+--------------+---------------+-------------------+------------------+-------------------+------------------+-------------------+-------------------+--------------------+
|      | mod_name                                       | base_op_type                                                                  | analy_op_type                                                           | shape                             | quant_dtype   |    qscale |     Cosine |           L1 |          Atol |   max_qscale_diff |   base_model_min |   analy_model_min |   base_model_max |   analy_model_max |   base_model_mean |   analy_model_mean |
|------+------------------------------------------------+-------------------------------------------------------------------------------+-------------------------------------------------------------------------+-----------------------------------+---------------+-----------+------------+--------------+---------------+-------------------+------------------+-------------------+------------------+-------------------+-------------------+--------------------|
|    0 | backbone.quant                                 | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([156, 3, 256, 704])    | qint8         | 1.0000000 |  1.0124134 |    0.0000000 |     0.0000000 |         0.0000000 |       -0.8750000 |        -0.8750000 |        0.8281250 |         0.8281250 |        -0.0538268 |         -0.0538268 |
|    1 | backbone.patch_embed.0.0                       | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.ConvReLU2d                         | torch.Size([156, 32, 128, 352])   | qint8         | 1.0000000 |  0.3864226 |    0.0876143 |     2.1843956 |         2.1843956 |       -2.1520221 |         0.0000000 |        2.4978926 |         1.2530288 |         0.0155168 |          0.0302157 |
|    2 | backbone.patch_embed.0.1                       | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 32, 128, 352])   | qint8         | 1.0000000 |  0.6723548 |    0.0239229 |     1.2728612 |         1.2728612 |       -1.2728612 |         0.0000000 |        1.2530285 |         1.2530288 |         0.0062928 |          0.0302157 |
|    3 | backbone.patch_embed.0.2                       | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 32, 128, 352])   | qint8         | 1.0000000 |  1.0476432 |    0.0000000 |     0.0000005 |         0.0000005 |        0.0000000 |         0.0000000 |        1.2530285 |         1.2530288 |         0.0302157 |          0.0302157 |
|    4 | backbone.patch_embed.1.0                       | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.ConvReLU2d                         | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.0605295 |    0.1075244 |     2.7915201 |         2.7915201 |       -2.7915201 |         0.0000000 |        1.2053459 |         0.5375794 |        -0.0352510 |          0.0256364 |
|    5 | backbone.patch_embed.1.1                       | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.9123566 |    0.0073788 |     0.6889656 |         0.6889656 |       -0.6889656 |         0.0000000 |        0.5375948 |         0.5375794 |         0.0182610 |          0.0256364 |
|    6 | backbone.patch_embed.1.2                       | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0304425 |    0.0000027 |     0.0001207 |         0.0001207 |        0.0000000 |         0.0000000 |        0.5375948 |         0.5375794 |         0.0256371 |          0.0256364 |
|    7 | backbone.stages.0.block.0.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.1926465 |    0.2412763 |    13.1849022 |        13.1849022 |       -0.3775516 |       -12.9328127 |        0.3664352 |        13.3727055 |         0.0111314 |          0.0177905 |
|    8 | backbone.stages.0.block.0.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0288363 |    0.0000934 |     0.0053996 |         0.0053996 |      -12.9339705 |       -12.9328127 |       13.3737764 |        13.3727055 |         0.0178028 |          0.0177905 |
|    9 | backbone.stages.0.block.0.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 |  1.0857277 |    0.0001062 |     0.0125751 |         0.0125751 |      -20.8860588 |       -20.8862438 |        3.1507902 |         3.1501548 |        -0.2301355 |         -0.2301374 |
|   10 | backbone.stages.0.block.0.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 |  1.1121863 |    0.0000333 |     0.0033934 |         0.0033934 |       -0.1699712 |        -0.1699712 |        3.1482251 |         3.1475844 |        -0.0271438 |         -0.0271442 |
|   11 | backbone.stages.0.block.0.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   12 | backbone.stages.0.block.0.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   13 | backbone.stages.0.block.0.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0359191 |    0.0000069 |     0.0004441 |         0.0004441 |       -0.4369636 |        -0.4368731 |        0.6138867 |         0.6138716 |         0.0204119 |          0.0204121 |
|   14 | backbone.stages.0.block.0.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0359191 |    0.0000069 |     0.0004441 |         0.0004441 |       -0.4369636 |        -0.4368731 |        0.6138867 |         0.6138716 |         0.0204119 |          0.0204121 |
|   15 | backbone.stages.0.block.1.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.3167961 |    0.2765068 |     7.0570149 |         7.0570149 |       -0.4436789 |        -7.2542415 |        0.2568534 |         5.2624955 |         0.0023029 |         -0.0143802 |
|   16 | backbone.stages.0.block.1.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0325835 |    0.0001121 |     0.0040832 |         0.0040832 |       -7.2563109 |        -7.2542415 |        5.2625542 |         5.2624955 |        -0.0143909 |         -0.0143802 |
|   17 | backbone.stages.0.block.1.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 |  1.0746342 |    0.0001214 |     0.0040312 |         0.0040312 |       -8.0417929 |        -8.0409079 |        2.6138728 |         2.6136053 |        -0.3816274 |         -0.3816504 |
|   18 | backbone.stages.0.block.1.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 |  1.0116278 |    0.0000330 |     0.0020299 |         0.0020299 |       -0.1699712 |        -0.1699712 |        2.6021726 |         2.6018972 |        -0.0643994 |         -0.0644037 |
|   19 | backbone.stages.0.block.1.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   20 | backbone.stages.0.block.1.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   21 | backbone.stages.0.block.1.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0256341 |    0.0000122 |     0.0004694 |         0.0004694 |       -0.4411992 |        -0.4410955 |        0.6950135 |         0.6950072 |         0.0064554 |          0.0064549 |
|   22 | backbone.stages.0.block.1.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0256341 |    0.0000122 |     0.0004694 |         0.0004694 |       -0.4411992 |        -0.4410955 |        0.6950135 |         0.6950072 |         0.0064554 |          0.0064549 |
|   23 | backbone.stages.0.block.2.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.3282977 |    0.3692603 |     6.5547700 |         6.5547700 |       -0.3975463 |        -5.9799194 |        0.2752692 |         6.7096963 |        -0.0080157 |         -0.0300341 |
|   24 | backbone.stages.0.block.2.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0275362 |    0.0001583 |     0.0041709 |         0.0041709 |       -5.9815922 |        -5.9799194 |        6.7098112 |         6.7096963 |        -0.0300451 |         -0.0300341 |
|   25 | backbone.stages.0.block.2.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 |  1.0656571 |    0.0001733 |     0.0047717 |         0.0047717 |       -8.1312017 |        -8.1290245 |        2.6425624 |         2.6419749 |        -0.5142649 |         -0.5142659 |
|   26 | backbone.stages.0.block.2.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 |  1.0154223 |    0.0000434 |     0.0026435 |         0.0026435 |       -0.1699712 |        -0.1699712 |        2.6316907 |         2.6310868 |        -0.0666563 |         -0.0666579 |
|   27 | backbone.stages.0.block.2.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   28 | backbone.stages.0.block.2.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   29 | backbone.stages.0.block.2.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0286822 |    0.0000218 |     0.0004637 |         0.0004637 |       -0.5992131 |        -0.5991402 |        0.8351721 |         0.8352416 |        -0.0000473 |         -0.0000499 |
|   30 | backbone.stages.0.block.2.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0286822 |    0.0000218 |     0.0004637 |         0.0004637 |       -0.5992131 |        -0.5991402 |        0.8351721 |         0.8352416 |        -0.0000473 |         -0.0000499 |
|   31 | backbone.stages.0.block.3.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  0.2595484 |    0.4467258 |     7.9808512 |         7.9808512 |       -0.2147864 |        -8.1011963 |        0.2818395 |         7.2633524 |         0.0032614 |          0.0269282 |
|   32 | backbone.stages.0.block.3.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0310236 |    0.0002550 |     0.0062895 |         0.0062895 |       -8.1003551 |        -8.1011963 |        7.2618566 |         7.2633524 |         0.0269206 |          0.0269282 |
|   33 | backbone.stages.0.block.3.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 |  1.0775789 |    0.0002488 |     0.0049796 |         0.0049796 |       -6.5912523 |        -6.5930209 |        6.0452452 |         6.0450821 |        -0.2835538 |         -0.2835542 |
|   34 | backbone.stages.0.block.3.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 128, 64, 176])   | qint8         | 1.0000000 |  1.1443083 |    0.0000907 |     0.0049906 |         0.0049906 |       -0.1699712 |        -0.1699712 |        6.0452452 |         6.0450821 |        -0.0000803 |         -0.0000875 |
|   35 | backbone.stages.0.block.3.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   36 | backbone.stages.0.block.3.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   37 | backbone.stages.0.block.3.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0248864 |    0.0000295 |     0.0006366 |         0.0006366 |       -0.9459292 |        -0.9458709 |        1.0556573 |         1.0556288 |        -0.0014878 |         -0.0014901 |
|   38 | backbone.stages.0.block.3.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0248864 |    0.0000295 |     0.0006366 |         0.0006366 |       -0.9459292 |        -0.9458709 |        1.0556573 |         1.0556288 |        -0.0014878 |         -0.0014901 |
|   39 | backbone.stage_norm.0                          | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.qat.batchnorm.BatchNorm2d                     | torch.Size([156, 64, 64, 176])    | qint8         | 1.0000000 |  1.0360641 |    0.0000322 |     0.0015808 |         0.0015808 |       -2.5823004 |        -2.5826988 |        1.9952118 |         1.9953612 |         0.0035341 |          0.0035302 |
|   40 | backbone.up                                    | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer            | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer      | torch.Size([156, 64, 128, 352])   | qint8         | 1.0000000 |  1.1347257 |    0.0000239 |     0.0010246 |         0.0010246 |       -2.1408832 |        -2.1409709 |        1.5269575 |         1.5267948 |         0.0035341 |          0.0035302 |
|   41 | backbone.downsample_block.0.proj.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.6373936 |    0.1520432 |     1.3957872 |         1.3957872 |       -1.5562987 |        -0.6029651 |        1.3459803 |         0.6555882 |        -0.0048629 |         -0.0025339 |
|   42 | backbone.downsample_block.0.proj.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  1.0107642 |    0.0000145 |     0.0003257 |         0.0003257 |       -0.6030294 |        -0.6029651 |        0.6555178 |         0.6555882 |        -0.0025338 |         -0.0025339 |
|   43 | backbone.stages.1.block.0.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.4050828 |    0.3639116 |     5.0475483 |         5.0475483 |       -0.2133264 |        -5.0921869 |        0.2730823 |         5.0216517 |         0.0049994 |          0.0039825 |
|   44 | backbone.stages.1.block.0.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  1.0113447 |    0.0001613 |     0.0022073 |         0.0022073 |       -5.0919390 |        -5.0921869 |        5.0226922 |         5.0216517 |         0.0039671 |          0.0039825 |
|   45 | backbone.stages.1.block.0.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  1.0248199 |    0.0001304 |     0.0020146 |         0.0020146 |       -6.1770630 |        -6.1763725 |        2.3089397 |         2.3085892 |        -0.5692577 |         -0.5692422 |
|   46 | backbone.stages.1.block.0.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  0.9791376 |    0.0000261 |     0.0012128 |         0.0012128 |       -0.1699712 |        -0.1699712 |        2.2847571 |         2.2843878 |        -0.1072593 |         -0.1072584 |
|   47 | backbone.stages.1.block.0.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   48 | backbone.stages.1.block.0.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   49 | backbone.stages.1.block.0.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  1.0110679 |    0.0000200 |     0.0003241 |         0.0003241 |       -0.5691814 |        -0.5691407 |        0.6999257 |         0.6999835 |         0.0022512 |          0.0022508 |
|   50 | backbone.stages.1.block.0.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  1.0110679 |    0.0000200 |     0.0003241 |         0.0003241 |       -0.5691814 |        -0.5691407 |        0.6999257 |         0.6999835 |         0.0022512 |          0.0022508 |
|   51 | backbone.stages.1.block.1.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.3183753 |    0.3838607 |     6.2699084 |         6.2699084 |       -0.3430986 |        -6.4086118 |        0.3073109 |         5.3152351 |         0.0082680 |         -0.0050086 |
|   52 | backbone.stages.1.block.1.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  1.0109113 |    0.0002037 |     0.0026667 |         0.0026667 |       -6.4078498 |        -6.4086118 |        5.3168836 |         5.3152351 |        -0.0050063 |         -0.0050086 |
|   53 | backbone.stages.1.block.1.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  1.0268047 |    0.0001755 |     0.0021019 |         0.0021019 |       -5.9514751 |        -5.9525023 |        3.5158956 |         3.5158780 |        -0.5520574 |         -0.5520766 |
|   54 | backbone.stages.1.block.1.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  0.9831942 |    0.0000378 |     0.0017073 |         0.0017073 |       -0.1699712 |        -0.1699712 |        3.5151253 |         3.5151076 |        -0.0995596 |         -0.0995643 |
|   55 | backbone.stages.1.block.1.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   56 | backbone.stages.1.block.1.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   57 | backbone.stages.1.block.1.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  1.0091199 |    0.0000285 |     0.0003370 |         0.0003370 |       -0.7021237 |        -0.7020593 |        0.7073499 |         0.7074090 |        -0.0077670 |         -0.0077676 |
|   58 | backbone.stages.1.block.1.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  1.0091199 |    0.0000285 |     0.0003370 |         0.0003370 |       -0.7021237 |        -0.7020593 |        0.7073499 |         0.7074090 |        -0.0077670 |         -0.0077676 |
|   59 | backbone.stages.1.block.2.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  0.3007355 |    0.3841820 |     5.8533559 |         5.8533559 |       -0.3190346 |        -5.7509928 |        0.3381314 |         5.9894042 |         0.0013930 |          0.0238110 |
|   60 | backbone.stages.1.block.2.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  1.0113056 |    0.0002446 |     0.0031796 |         0.0031796 |       -5.7507887 |        -5.7509928 |        5.9898725 |         5.9894042 |         0.0238204 |          0.0238110 |
|   61 | backbone.stages.1.block.2.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  1.0290383 |    0.0002046 |     0.0022521 |         0.0022521 |       -5.5523510 |        -5.5519319 |        3.4424696 |         3.4424763 |        -0.5244807 |         -0.5245348 |
|   62 | backbone.stages.1.block.2.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  0.9800074 |    0.0000456 |     0.0021147 |         0.0021147 |       -0.1699712 |        -0.1699712 |        3.4414773 |         3.4414840 |        -0.0996298 |         -0.0996417 |
|   63 | backbone.stages.1.block.2.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   64 | backbone.stages.1.block.2.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   65 | backbone.stages.1.block.2.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  1.0108314 |    0.0000371 |     0.0004190 |         0.0004190 |       -0.6561730 |        -0.6562089 |        0.7103927 |         0.7103375 |        -0.0082999 |         -0.0083004 |
|   66 | backbone.stages.1.block.2.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  1.0108314 |    0.0000371 |     0.0004190 |         0.0004190 |       -0.6561730 |        -0.6562089 |        0.7103927 |         0.7103375 |        -0.0082999 |         -0.0083004 |
|   67 | backbone.stage_norm.1                          | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.qat.batchnorm.BatchNorm2d                     | torch.Size([156, 128, 32, 88])    | qint8         | 1.0000000 |  1.0134332 |    0.0000987 |     0.0022774 |         0.0022774 |       -3.5974443 |        -3.5971408 |        4.3562884 |         4.3561430 |        -0.0014911 |         -0.0014977 |
|   68 | backbone.downsample_block.1.proj.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.5915256 |    0.2644354 |     1.7922169 |         1.7922169 |       -2.4550855 |        -0.8507398 |        1.8460264 |         0.7266555 |        -0.0159732 |          0.0013466 |
|   69 | backbone.downsample_block.1.proj.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0031950 |    0.0000187 |     0.0002929 |         0.0002929 |       -0.8507186 |        -0.8507398 |        0.7267684 |         0.7266555 |         0.0013476 |          0.0013466 |
|   70 | backbone.stages.2.block.0.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.3778436 |    0.3116243 |     3.2196984 |         3.2196984 |       -0.2546912 |        -3.1839159 |        0.2373314 |         3.2911947 |         0.0008716 |          0.0160158 |
|   71 | backbone.stages.2.block.0.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0025841 |    0.0001769 |     0.0019903 |         0.0019903 |       -3.1841176 |        -3.1839159 |        3.2912729 |         3.2911947 |         0.0160174 |          0.0160158 |
|   72 | backbone.stages.2.block.0.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  1.0043161 |    0.0002485 |     0.0025220 |         0.0025220 |       -5.4999471 |        -5.5004029 |        2.7738755 |         2.7744651 |        -0.9080083 |         -0.9080172 |
|   73 | backbone.stages.2.block.0.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.9989675 |    0.0000406 |     0.0020299 |         0.0020299 |       -0.1699712 |        -0.1699712 |        2.7661929 |         2.7667947 |        -0.1020394 |         -0.1020430 |
|   74 | backbone.stages.2.block.0.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   75 | backbone.stages.2.block.0.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   76 | backbone.stages.2.block.0.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0025548 |    0.0000225 |     0.0003159 |         0.0003159 |       -0.8559944 |        -0.8560268 |        0.7161996 |         0.7160835 |        -0.0003921 |         -0.0003928 |
|   77 | backbone.stages.2.block.0.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0025548 |    0.0000225 |     0.0003159 |         0.0003159 |       -0.8559944 |        -0.8560268 |        0.7161996 |         0.7160835 |        -0.0003921 |         -0.0003928 |
|   78 | backbone.stages.2.block.1.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.3294102 |    0.3317087 |     3.2626331 |         3.2626331 |       -0.1983189 |        -3.3526723 |        0.2568125 |         3.1691210 |         0.0010272 |          0.0039166 |
|   79 | backbone.stages.2.block.1.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0023066 |    0.0002013 |     0.0028608 |         0.0028608 |       -3.3520138 |        -3.3526723 |        3.1696455 |         3.1691210 |         0.0039176 |          0.0039166 |
|   80 | backbone.stages.2.block.1.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  1.0046339 |    0.0002757 |     0.0023226 |         0.0023226 |       -5.2858362 |        -5.2854939 |        2.8027706 |         2.8023357 |        -0.8775827 |         -0.8775456 |
|   81 | backbone.stages.2.block.1.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.9988775 |    0.0000469 |     0.0025258 |         0.0025258 |       -0.1699712 |        -0.1699712 |        2.7956703 |         2.7952271 |        -0.0971118 |         -0.0971129 |
|   82 | backbone.stages.2.block.1.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   83 | backbone.stages.2.block.1.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   84 | backbone.stages.2.block.1.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0026103 |    0.0000273 |     0.0003349 |         0.0003349 |       -0.8488334 |        -0.8488759 |        0.7494801 |         0.7493663 |         0.0030796 |          0.0030792 |
|   85 | backbone.stages.2.block.1.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0026103 |    0.0000273 |     0.0003349 |         0.0003349 |       -0.8488334 |        -0.8488759 |        0.7494801 |         0.7493663 |         0.0030796 |          0.0030792 |
|   86 | backbone.stages.2.block.2.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.3940895 |    0.3206048 |     5.4583716 |         5.4583716 |       -0.3051731 |        -5.6162820 |        0.5303699 |         4.4604621 |        -0.0025711 |         -0.0022041 |
|   87 | backbone.stages.2.block.2.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0022513 |    0.0002126 |     0.0028816 |         0.0028816 |       -5.6162529 |        -5.6162820 |        4.4603553 |         4.4604621 |        -0.0022093 |         -0.0022041 |
|   88 | backbone.stages.2.block.2.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  1.0048437 |    0.0002908 |     0.0039163 |         0.0039163 |       -5.5281630 |        -5.5286613 |        4.0132437 |         4.0133896 |        -0.8191840 |         -0.8191550 |
|   89 | backbone.stages.2.block.2.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.9982911 |    0.0000514 |     0.0023806 |         0.0023806 |       -0.1699712 |        -0.1699712 |        4.0131235 |         4.0132694 |        -0.0985882 |         -0.0985880 |
|   90 | backbone.stages.2.block.2.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   91 | backbone.stages.2.block.2.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   92 | backbone.stages.2.block.2.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0024505 |    0.0000329 |     0.0005231 |         0.0005231 |       -0.9164357 |        -0.9165106 |        0.8131083 |         0.8130023 |         0.0058483 |          0.0058482 |
|   93 | backbone.stages.2.block.2.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0024505 |    0.0000329 |     0.0005231 |         0.0005231 |       -0.9164357 |        -0.9165106 |        0.8131083 |         0.8130023 |         0.0058483 |          0.0058482 |
|   94 | backbone.stages.2.block.3.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.3053118 |    0.2918527 |     5.5474954 |         5.5474954 |       -0.2027919 |        -4.4349957 |        0.3141897 |         5.8616853 |         0.0033629 |         -0.0068125 |
|   95 | backbone.stages.2.block.3.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0027130 |    0.0001829 |     0.0021203 |         0.0021203 |       -4.4353352 |        -4.4349957 |        5.8622289 |         5.8616853 |        -0.0068025 |         -0.0068125 |
|   96 | backbone.stages.2.block.3.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  1.0046120 |    0.0002561 |     0.0031106 |         0.0031106 |       -5.7909021 |        -5.7904782 |        4.8683496 |         4.8701882 |        -0.6754150 |         -0.6753691 |
|   97 | backbone.stages.2.block.3.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.9958320 |    0.0000525 |     0.0022489 |         0.0022489 |       -0.1699712 |        -0.1699712 |        4.8683467 |         4.8701854 |        -0.0918116 |         -0.0918069 |
|   98 | backbone.stages.2.block.3.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|   99 | backbone.stages.2.block.3.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  100 | backbone.stages.2.block.3.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0022345 |    0.0000402 |     0.0006551 |         0.0006551 |       -0.9928050 |        -0.9926869 |        1.0436171 |         1.0434982 |         0.0052718 |          0.0052711 |
|  101 | backbone.stages.2.block.3.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0022345 |    0.0000402 |     0.0006551 |         0.0006551 |       -0.9928050 |        -0.9926869 |        1.0436171 |         1.0434982 |         0.0052718 |          0.0052711 |
|  102 | backbone.stages.2.block.4.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.4022524 |    0.3148500 |     5.5055456 |         5.5055456 |       -0.3147332 |        -5.5732408 |        0.5123155 |         4.5618296 |        -0.0004261 |         -0.0049782 |
|  103 | backbone.stages.2.block.4.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0023828 |    0.0002094 |     0.0025775 |         0.0025775 |       -5.5733371 |        -5.5732408 |        4.5619535 |         4.5618296 |        -0.0049562 |         -0.0049782 |
|  104 | backbone.stages.2.block.4.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  1.0045196 |    0.0002920 |     0.0035143 |         0.0035143 |       -6.9711266 |        -6.9746408 |        4.0525346 |         4.0529160 |        -0.8401532 |         -0.8401932 |
|  105 | backbone.stages.2.block.4.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.9981650 |    0.0000495 |     0.0024369 |         0.0024369 |       -0.1699712 |        -0.1699712 |        4.0524321 |         4.0528135 |        -0.1044414 |         -0.1044464 |
|  106 | backbone.stages.2.block.4.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  107 | backbone.stages.2.block.4.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  108 | backbone.stages.2.block.4.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0023955 |    0.0000476 |     0.0006557 |         0.0006557 |       -0.9974767 |        -0.9973463 |        1.0586658 |         1.0585338 |         0.0031123 |          0.0031124 |
|  109 | backbone.stages.2.block.4.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0023955 |    0.0000476 |     0.0006557 |         0.0006557 |       -0.9974767 |        -0.9973463 |        1.0586658 |         1.0585338 |         0.0031123 |          0.0031124 |
|  110 | backbone.stages.2.block.5.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.4507696 |    0.3259152 |     4.3892903 |         4.3892903 |       -0.5258338 |        -3.3268080 |        0.4029842 |         4.7748919 |         0.0007316 |         -0.0200644 |
|  111 | backbone.stages.2.block.5.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0022299 |    0.0002220 |     0.0022898 |         0.0022898 |       -3.3267536 |        -3.3268080 |        4.7753744 |         4.7748919 |        -0.0200559 |         -0.0200644 |
|  112 | backbone.stages.2.block.5.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  1.0040905 |    0.0003155 |     0.0032914 |         0.0032914 |       -5.9676948 |        -5.9687171 |        5.4045010 |         5.4073887 |        -0.9017808 |         -0.9017584 |
|  113 | backbone.stages.2.block.5.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.9976186 |    0.0000542 |     0.0037010 |         0.0037010 |       -0.1699712 |        -0.1699712 |        5.4045010 |         5.4073887 |        -0.0957836 |         -0.0957877 |
|  114 | backbone.stages.2.block.5.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  115 | backbone.stages.2.block.5.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  116 | backbone.stages.2.block.5.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0019156 |    0.0000585 |     0.0011715 |         0.0011715 |       -1.1964277 |        -1.1961278 |        1.4526852 |         1.4529016 |        -0.0008746 |         -0.0008745 |
|  117 | backbone.stages.2.block.5.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0019156 |    0.0000585 |     0.0011715 |         0.0011715 |       -1.1964277 |        -1.1961278 |        1.4526852 |         1.4529016 |        -0.0008746 |         -0.0008745 |
|  118 | backbone.stages.2.block.6.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.3137752 |    0.2477028 |     5.9539824 |         5.9539824 |       -0.2950250 |        -6.1718187 |        0.2271557 |         3.8230994 |        -0.0004493 |         -0.0045938 |
|  119 | backbone.stages.2.block.6.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0032895 |    0.0002062 |     0.0035909 |         0.0035909 |       -6.1731520 |        -6.1718187 |        3.8232098 |         3.8230994 |        -0.0045911 |         -0.0045938 |
|  120 | backbone.stages.2.block.6.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  1.0046577 |    0.0002731 |     0.0046475 |         0.0046475 |       -6.5064116 |        -6.5063443 |        5.0562983 |         5.0559850 |        -0.5679991 |         -0.5679813 |
|  121 | backbone.stages.2.block.6.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.9965242 |    0.0000558 |     0.0030111 |         0.0030111 |       -0.1699712 |        -0.1699712 |        5.0562968 |         5.0559835 |        -0.1063086 |         -0.1063082 |
|  122 | backbone.stages.2.block.6.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  123 | backbone.stages.2.block.6.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  124 | backbone.stages.2.block.6.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0022614 |    0.0000655 |     0.0011243 |         0.0011243 |       -1.3680738 |        -1.3681209 |        1.4138840 |         1.4141333 |         0.0020931 |          0.0020927 |
|  125 | backbone.stages.2.block.6.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0022614 |    0.0000655 |     0.0011243 |         0.0011243 |       -1.3680738 |        -1.3681209 |        1.4138840 |         1.4141333 |         0.0020931 |          0.0020927 |
|  126 | backbone.stages.2.block.7.dwconv.0.0           | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  0.3813956 |    0.2736412 |     4.8042850 |         4.8042850 |       -0.3779390 |        -4.7836943 |        0.3100367 |         5.0673900 |         0.0029123 |         -0.0055596 |
|  127 | backbone.stages.2.block.7.dwconv.0.1           | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0029268 |    0.0002004 |     0.0033377 |         0.0033377 |       -4.7842627 |        -4.7836943 |        5.0683379 |         5.0673900 |        -0.0055623 |         -0.0055596 |
|  128 | backbone.stages.2.block.7.pwconv1.0            | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  1.0042418 |    0.0002752 |     0.0042624 |         0.0042624 |       -6.1931481 |        -6.1935277 |        5.5073328 |         5.5072865 |        -0.6427965 |         -0.6427635 |
|  129 | backbone.stages.2.block.7.pwconv1.1            | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 384, 16, 44])    | qint8         | 1.0000000 |  0.9935640 |    0.0000567 |     0.0042691 |         0.0042691 |       -0.1699712 |        -0.1699712 |        5.5073328 |         5.5072865 |        -0.0964514 |         -0.0964494 |
|  130 | backbone.stages.2.block.7.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  131 | backbone.stages.2.block.7.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  132 | backbone.stages.2.block.7.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0023603 |    0.0000765 |     0.0017198 |         0.0017198 |       -1.6952732 |        -1.6949270 |        2.4453115 |         2.4468732 |        -0.0006310 |         -0.0006311 |
|  133 | backbone.stages.2.block.7.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0023603 |    0.0000765 |     0.0017198 |         0.0017198 |       -1.6952732 |        -1.6949270 |        2.4453115 |         2.4468732 |        -0.0006310 |         -0.0006311 |
|  134 | backbone.stage_norm.2                          | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.qat.batchnorm.BatchNorm2d                     | torch.Size([156, 192, 16, 44])    | qint8         | 1.0000000 |  1.0024242 |    0.0004745 |     0.0088816 |         0.0088816 |      -10.8844538 |       -10.8821468 |       12.6322517 |        12.6411304 |        -0.0004168 |         -0.0004151 |
|  135 | backbone.downsample_block.2.proj.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.6321901 |    0.5027852 |     5.8141994 |         5.8141994 |       -5.9304366 |        -0.4110209 |        3.9084415 |         0.3841555 |        -0.0039513 |         -0.0013925 |
|  136 | backbone.downsample_block.2.proj.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009274 |    0.0000161 |     0.0002578 |         0.0002578 |       -0.4109803 |        -0.4110209 |        0.3841947 |         0.3841555 |        -0.0013926 |         -0.0013925 |
|  137 | backbone.stages.3.block.0.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.3123398 |    0.3002844 |     3.1278138 |         3.1278138 |       -0.1956664 |        -2.8635662 |        0.2580961 |         3.1873159 |         0.0030513 |         -0.0119020 |
|  138 | backbone.stages.3.block.0.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009011 |    0.0001889 |     0.0020487 |         0.0020487 |       -2.8632631 |        -2.8635662 |        3.1872249 |         3.1873159 |        -0.0119089 |         -0.0119020 |
|  139 | backbone.stages.3.block.0.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  1.0025958 |    0.0003770 |     0.0032802 |         0.0032802 |       -8.7639990 |        -8.7641191 |        6.9884548 |         6.9898314 |        -1.2333148 |         -1.2332515 |
|  140 | backbone.stages.3.block.0.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  1.0021583 |    0.0000565 |     0.0030009 |         0.0030009 |       -0.1699712 |        -0.1699712 |        6.9884548 |         6.9898314 |        -0.0762439 |         -0.0762460 |
|  141 | backbone.stages.3.block.0.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  142 | backbone.stages.3.block.0.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  143 | backbone.stages.3.block.0.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009364 |    0.0000239 |     0.0004352 |         0.0004352 |       -0.7251210 |        -0.7253252 |        0.5499279 |         0.5497915 |        -0.0011040 |         -0.0011038 |
|  144 | backbone.stages.3.block.0.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009364 |    0.0000239 |     0.0004352 |         0.0004352 |       -0.7251210 |        -0.7253252 |        0.5499279 |         0.5497915 |        -0.0011040 |         -0.0011038 |
|  145 | backbone.stages.3.block.1.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.4100330 |    0.2943217 |     7.2611780 |         7.2611780 |       -0.2210868 |        -3.7136652 |        0.3098783 |         7.4619026 |        -0.0004401 |          0.0037379 |
|  146 | backbone.stages.3.block.1.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009201 |    0.0002046 |     0.0034924 |         0.0034924 |       -3.7144234 |        -3.7136652 |        7.4598269 |         7.4619026 |         0.0037269 |          0.0037379 |
|  147 | backbone.stages.3.block.1.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  1.0021055 |    0.0003918 |     0.0034853 |         0.0034853 |       -7.2994556 |        -7.3006039 |        3.6756856 |         3.6754479 |        -1.2145092 |         -1.2144992 |
|  148 | backbone.stages.3.block.1.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  1.0027318 |    0.0000569 |     0.0032189 |         0.0032189 |       -0.1699712 |        -0.1699712 |        3.6752496 |         3.6750116 |        -0.0843397 |         -0.0843401 |
|  149 | backbone.stages.3.block.1.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  150 | backbone.stages.3.block.1.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  151 | backbone.stages.3.block.1.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009358 |    0.0000319 |     0.0013533 |         0.0013533 |       -1.3546726 |        -1.3546054 |        1.2370435 |         1.2369763 |        -0.0020380 |         -0.0020381 |
|  152 | backbone.stages.3.block.1.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009358 |    0.0000319 |     0.0013533 |         0.0013533 |       -1.3546726 |        -1.3546054 |        1.2370435 |         1.2369763 |        -0.0020380 |         -0.0020381 |
|  153 | backbone.stages.3.block.2.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.4459080 |    0.2779576 |     5.1078029 |         5.1078029 |       -0.3177767 |        -3.3156269 |        0.3516036 |         5.3822513 |         0.0011897 |          0.0040885 |
|  154 | backbone.stages.3.block.2.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009366 |    0.0002004 |     0.0031695 |         0.0031695 |       -3.3159151 |        -3.3156269 |        5.3822432 |         5.3822513 |         0.0040860 |          0.0040885 |
|  155 | backbone.stages.3.block.2.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  1.0018228 |    0.0003852 |     0.0035114 |         0.0035114 |       -5.9032245 |        -5.9033632 |        3.7398543 |         3.7396939 |        -1.1974307 |         -1.1974274 |
|  156 | backbone.stages.3.block.2.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  1.0027889 |    0.0000541 |     0.0026282 |         0.0026282 |       -0.1699712 |        -0.1699712 |        3.7395101 |         3.7393494 |        -0.0923793 |         -0.0923797 |
|  157 | backbone.stages.3.block.2.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  158 | backbone.stages.3.block.2.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  159 | backbone.stages.3.block.2.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009463 |    0.0000531 |     0.0013531 |         0.0013531 |       -1.3549135 |        -1.3548461 |        1.2987275 |         1.2993798 |        -0.0038254 |         -0.0038254 |
|  160 | backbone.stages.3.block.2.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009463 |    0.0000531 |     0.0013531 |         0.0013531 |       -1.3549135 |        -1.3548461 |        1.2987275 |         1.2993798 |        -0.0038254 |         -0.0038254 |
|  161 | backbone.stages.3.block.3.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.5356310 |    0.2858911 |     4.7030110 |         4.7030110 |       -0.2966807 |        -3.2520728 |        0.3493218 |         4.9106193 |        -0.0010552 |          0.0077941 |
|  162 | backbone.stages.3.block.3.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0008682 |    0.0002270 |     0.0033355 |         0.0033355 |       -3.2538586 |        -3.2520728 |        4.9117498 |         4.9106193 |         0.0077976 |          0.0077941 |
|  163 | backbone.stages.3.block.3.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  1.0018191 |    0.0004218 |     0.0038319 |         0.0038319 |       -7.7209845 |        -7.7220478 |        4.6662040 |         4.6668262 |        -1.2883953 |         -1.2884034 |
|  164 | backbone.stages.3.block.3.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  1.0039762 |    0.0000561 |     0.0036719 |         0.0036719 |       -0.1699712 |        -0.1699712 |        4.6661968 |         4.6668191 |        -0.0890902 |         -0.0890896 |
|  165 | backbone.stages.3.block.3.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  166 | backbone.stages.3.block.3.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  167 | backbone.stages.3.block.3.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009350 |    0.0000689 |     0.0013526 |         0.0013526 |       -1.3478730 |        -1.3478060 |        1.3116543 |         1.3122071 |        -0.0067529 |         -0.0067525 |
|  168 | backbone.stages.3.block.3.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009350 |    0.0000689 |     0.0013526 |         0.0013526 |       -1.3478730 |        -1.3478060 |        1.3116543 |         1.3122071 |        -0.0067529 |         -0.0067525 |
|  169 | backbone.stages.3.block.4.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.5188736 |    0.2620449 |     5.4676499 |         5.4676499 |       -0.3982256 |        -5.7088733 |        0.5189627 |         2.8900802 |        -0.0003135 |          0.0020322 |
|  170 | backbone.stages.3.block.4.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0009290 |    0.0002266 |     0.0037417 |         0.0037417 |       -5.7102022 |        -5.7088733 |        2.8912404 |         2.8900802 |         0.0020305 |          0.0020322 |
|  171 | backbone.stages.3.block.4.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  1.0021617 |    0.0004298 |     0.0043581 |         0.0043581 |       -7.5386710 |        -7.5381713 |        5.0125170 |         5.0118909 |        -1.1109229 |         -1.1109859 |
|  172 | backbone.stages.3.block.4.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  1.0019249 |    0.0000645 |     0.0041788 |         0.0041788 |       -0.1699712 |        -0.1699712 |        5.0125155 |         5.0118895 |        -0.0943884 |         -0.0943875 |
|  173 | backbone.stages.3.block.4.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  174 | backbone.stages.3.block.4.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  175 | backbone.stages.3.block.4.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0010242 |    0.0000983 |     0.0038962 |         0.0038962 |       -2.2164662 |        -2.2162750 |        3.5879104 |         3.5899456 |        -0.0034283 |         -0.0034275 |
|  176 | backbone.stages.3.block.4.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0010242 |    0.0000983 |     0.0038962 |         0.0038962 |       -2.2164662 |        -2.2162750 |        3.5879104 |         3.5899456 |        -0.0034283 |         -0.0034275 |
|  177 | backbone.stages.3.block.5.dwconv.0             | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  0.5395910 |    0.2783161 |     4.7125220 |         4.7125220 |       -1.5810757 |        -5.1208930 |        0.6048745 |         4.8689160 |        -0.0023979 |          0.0051377 |
|  178 | backbone.stages.3.block.5.dwconv.1             | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0008585 |    0.0002461 |     0.0059412 |         0.0059412 |       -5.1232429 |        -5.1208930 |        4.8676281 |         4.8689160 |         0.0051278 |          0.0051377 |
|  179 | backbone.stages.3.block.5.pwconv1              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  1.0015613 |    0.0004813 |     0.0054984 |         0.0054984 |      -10.9399977 |       -10.9402027 |        8.1142807 |         8.1148291 |        -1.6239035 |         -1.6239076 |
|  180 | backbone.stages.3.block.5.act                  | torch.nn.modules.activation.GELU                                              | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([156, 1152, 8, 22])    | qint8         | 1.0000000 |  1.0053645 |    0.0000526 |     0.0048299 |         0.0048299 |       -0.1699712 |        -0.1699712 |        8.1142807 |         8.1148291 |        -0.0732656 |         -0.0732646 |
|  181 | backbone.stages.3.block.5.pwconv2              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  182 | backbone.stages.3.block.5.layer_scale          | horizon_plugin_pytorch.nn.channel_scale.ChannelScale2d                        | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  183 | backbone.stages.3.block.5.add                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0011554 |    0.0001443 |     0.0091362 |         0.0091362 |       -7.2463531 |        -7.2554893 |        7.7552214 |         7.7569051 |        -0.0041816 |         -0.0041796 |
|  184 | backbone.stages.3.block.5.extra_act            | torch.nn.modules.linear.Identity                                              | torch.nn.modules.linear.Identity                                        | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0011554 |    0.0001443 |     0.0091362 |         0.0091362 |       -7.2463531 |        -7.2554893 |        7.7552214 |         7.7569051 |        -0.0041816 |         -0.0041796 |
|  185 | backbone.stage_norm.3                          | torch.nn.modules.batchnorm.BatchNorm2d                                        | horizon_plugin_pytorch.nn.qat.batchnorm.BatchNorm2d                     | torch.Size([156, 384, 8, 22])     | qint8         | 1.0000000 |  1.0016233 |    0.0002379 |     0.0073109 |         0.0073109 |       -9.3281794 |        -9.3267565 |        8.9329262 |         8.9331207 |        -0.0010476 |         -0.0010494 |
|  186 | neck.conv_extract.0.0                          | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  187 | neck.conv_extract.1.0                          | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  188 | neck.conv_extract.2.0                          | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  189 | neck.conv_extract.3.0                          | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 8, 22])     | qint8         | 1.0000000 |  1.0005507 |    0.0003588 |     0.0090740 |         0.0090740 |       -8.8795261 |        -8.8795738 |        7.2160797 |         7.2159119 |        -0.0110467 |         -0.0110462 |
|  190 | neck.upscale.2                                 | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer            | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer      | torch.Size([156, 256, 16, 44])    | qint8         | 1.0000000 |  1.0047088 |    0.0002676 |     0.0061510 |         0.0061510 |       -6.5675225 |        -6.5677643 |        5.9787807 |         5.9772501 |        -0.0110467 |         -0.0110462 |
|  191 | neck.conv_add.0                                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 256, 16, 44])    | qint8         | 1.0000000 |  1.0038247 |    0.0004496 |     0.0083084 |         0.0083084 |      -11.8888321 |       -11.8869724 |       11.5772886 |        11.5692606 |        -0.0138818 |         -0.0138851 |
|  192 | neck.upscale.1                                 | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer            | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer      | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  1.0316271 |    0.0003660 |     0.0083084 |         0.0083084 |      -10.7379951 |       -10.7429590 |       10.9502802 |        10.9450874 |        -0.0138818 |         -0.0138851 |
|  193 | neck.conv_add.1                                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  1.0315216 |    0.0003727 |     0.0084314 |         0.0084314 |      -10.6442556 |       -10.6492510 |       11.0882082 |        11.0830431 |        -0.0110330 |         -0.0110322 |
|  194 | neck.upscale.0                                 | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer            | horizon_plugin_pytorch.nn.interpolate.autocasted_interpolate_outer      | torch.Size([156, 256, 64, 176])   | qint8         | 1.0000000 |  1.1783106 |    0.0003571 |     0.0084314 |         0.0084314 |      -10.6442556 |       -10.6492510 |       10.7696075 |        10.7658815 |        -0.0110330 |         -0.0110322 |
|  195 | neck.conv_add.2                                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.conv2d.ConvAdd2d                          | torch.Size([156, 256, 64, 176])   | qint8         | 1.0000000 |  1.1802307 |    0.0003578 |     0.0084229 |         0.0084229 |      -10.8399782 |       -10.8450584 |       10.7774515 |        10.7737284 |        -0.0066362 |         -0.0066357 |
|  196 | neck.fpn_conv.0.0                              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 64, 176])   | qint8         | 1.0000000 |  1.0512705 |    0.0002159 |     0.0060937 |         0.0060937 |       -8.3946028 |        -8.3948431 |        8.9642229 |         8.9634094 |         0.0577929 |          0.0577883 |
|  197 | neck.fpn_conv.1.0                              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 32, 88])    | qint8         | 1.0000000 |  1.0214391 |    0.0002277 |     0.0042500 |         0.0042500 |       -8.9110069 |        -8.9091206 |        9.0819302 |         9.0832720 |         0.0394749 |          0.0394743 |
|  198 | neck.fpn_conv.2.0                              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 16, 44])    | qint8         | 1.0000000 |  1.0045099 |    0.0018155 |     0.0269616 |         0.0269616 |      -79.2377167 |       -79.2264557 |       72.2690659 |        72.2583847 |        -0.0322573 |         -0.0322514 |
|  199 | neck.fpn_conv.3.0                              | torch.nn.modules.conv.Conv2d                                                  | horizon_plugin_pytorch.nn.qat.conv2d.Conv2d                             | torch.Size([156, 256, 8, 22])     | qint8         | 1.0000000 |  1.0005070 |    0.0002153 |     0.0026250 |         0.0026250 |       -2.4812112 |        -2.4818552 |        2.3407965 |         2.3411613 |         0.0002316 |          0.0002301 |
|  200 | head                                           | torch.Tensor.float                                                            | torch.Tensor.float                                                      | torch.Size([156, 256, 16, 44])    | qint8         | 1.0000000 |  1.0045099 |    0.0018155 |     0.0269616 |         0.0269616 |      -79.2377167 |       -79.2264557 |       72.2690659 |        72.2583847 |        -0.0322573 |         -0.0322514 |
|  201 | head                                           | torch.Tensor.sub                                                              | torch.Tensor.sub                                                        | torch.Size([26])                  | torch.float64 |           |  1.0000000 |    0.0000000 |     0.0000000 |                   |       -4.3000002 |        -4.3000002 |        7.6999998 |         7.6999998 |         1.6999999 |          1.6999999 |
|  202 | head                                           | torch.Tensor.to                                                               | torch.Tensor.to                                                         | torch.Size([26])                  | torch.float32 |           |  1.0000000 |    0.0000000 |     0.0000000 |                   |       -4.3000002 |        -4.3000002 |        7.6999998 |         7.6999998 |         1.6999999 |          1.6999999 |
|  203 | head                                           | torch.abs                                                                     | torch.abs                                                               | torch.Size([26])                  | torch.float32 |           |  1.0000000 |    0.0000000 |     0.0000000 |                   |        1.7000000 |         1.7000000 |        7.6999998 |         7.6999998 |         3.6000001 |          3.6000001 |
|  204 | head                                           | torch.Tensor.le                                                               | torch.Tensor.le                                                         | torch.Size([26])                  | torch.bool    |           |  1.0000002 |    0.0000000 |     0.0000000 |                   |        0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |         0.2692308 |          0.2692308 |
|  205 | head                                           | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([26, 4, 4])            | torch.float32 |           |  1.0000001 |    0.0000000 |     0.0000000 |                   |       -0.9999297 |        -0.9999297 |  3591795.7500000 |   3591795.7500000 |    271098.3437500 |     271098.3437500 |
|  206 | head                                           | torch.Tensor.to                                                               | torch.Tensor.to                                                         | torch.Size([26])                  | torch.float32 |           |  1.0000000 |    0.0000000 |     0.0000000 |                   |       -4.3000002 |        -4.3000002 |        7.6999998 |         7.6999998 |         1.6999999 |          1.6999999 |
|  207 | head                                           | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([3, 384, 26])          | torch.float32 |           |  0.4808218 |   24.5395489 |   435.3145447 |                   |     -361.1693420 |      -339.5232849 |      199.6985168 |       195.0997314 |        -6.0583863 |         -0.4504354 |
|  208 | head                                           | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([26, 384, 3])          | torch.float32 |           |  0.4808208 |   24.5395489 |   435.3145447 |                   |     -361.1693420 |      -339.5232849 |      199.6985168 |       195.0997314 |        -6.0583863 |         -0.4504354 |
|  209 | head                                           | torch.Tensor.add                                                              | torch.Tensor.add                                                        | torch.Size([26, 384, 3])          | torch.float32 |           |  0.4831755 |   33.9282074 |   509.3981934 |                   |     -387.5632629 |      -363.3916321 |      362.0098267 |       348.9443054 |        -3.5829949 |         13.4025335 |
|  210 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 384, 3, 1])       | torch.float32 |           |  0.4831755 |   33.9282074 |   509.3981934 |                   |     -387.5632629 |      -363.3916321 |      362.0098267 |       348.9443054 |        -3.5829949 |         13.4025335 |
|  211 | head                                           | torch.matmul                                                                  | torch.matmul                                                            | torch.Size([26, 384, 3, 1])       | torch.float32 |           |  0.4831755 |   33.8714485 |   509.3502808 |                   |     -387.5196228 |      -363.5214233 |      362.4449768 |       349.4360046 |        -3.7124336 |         13.2907410 |
|  212 | head                                           | torch.Tensor.squeeze                                                          | torch.Tensor.squeeze                                                    | torch.Size([26, 384, 3])          | torch.float32 |           |  0.4831755 |   33.8714485 |   509.3502808 |                   |     -387.5196228 |      -363.5214233 |      362.4449768 |       349.4360046 |        -3.7124336 |         13.2907410 |
|  213 | head                                           | torch.Tensor.add                                                              | torch.Tensor.add                                                        | torch.Size([26, 384, 3])          | torch.float32 |           |  0.7235042 |   33.8714485 |   509.3502808 |                   |     -443.6446228 |      -420.2714233 |      394.4449768 |       381.4360046 |       -21.9936848 |         -4.9905100 |
|  214 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 1])            | torch.bool    |           |  1.0000002 |    0.0000000 |     0.0000000 |                   |        0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |         0.2692308 |          0.2692308 |
|  215 | head                                           | torch.where                                                                   | torch.where                                                             | torch.Size([26, 384, 256])        | torch.float32 |           |  0.7695814 |    0.0925968 |     5.1147389 |                   |       -4.7621317 |        -3.6930170 |        3.3936565 |         3.0826545 |        -0.0011101 |         -0.0010704 |
|  216 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 1])            | torch.bool    |           |  1.0000002 |    0.0000000 |     0.0000000 |                   |        0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |         0.2692308 |          0.2692308 |
|  217 | head                                           | torch.where                                                                   | torch.where                                                             | torch.Size([26, 384, 11])         | torch.float32 |           |  0.6353844 |    1.7227440 |   198.2932281 |                   |     -147.1090393 |      -136.8962708 |       68.6145782 |        65.1100235 |        -2.7603326 |         -2.1700661 |
|  218 | head.instance_bank.anchor_quant_stub           | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 384, 11])         | qint16        | 0.0033570 |  1.0000023 |    0.0000000 |     0.0000000 |         0.0000000 |      -58.0691261 |       -58.0691261 |       98.1392365 |        98.1392365 |         2.8034270 |          2.8034270 |
|  219 | head.instance_bank.instance_feature_quant_stub | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 384, 256])        | qint8         | 1.0000000 |  0.0000000 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0000000 |         0.0000000 |        0.0000000 |         0.0000000 |         0.0000000 |          0.0000000 |
|  220 | head.instance_bank.anchor_quant_stub           | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 384, 11])         | qint16        | 0.0033570 |  0.6353844 |    1.7227440 |   198.2932281 |     59068.8486809 |     -147.1090393 |      -136.8962708 |       68.6145782 |        65.1100235 |        -2.7603326 |         -2.1700661 |
|  221 | head.instance_bank.instance_feature_quant_stub | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 384, 256])        | qint8         | 1.0000000 |  0.7695814 |    0.0925968 |     5.1147389 |         5.1147389 |       -4.7621317 |        -3.6930170 |        3.3936565 |         3.0826545 |        -0.0011101 |         -0.0010704 |
|  222 | head                                           | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 384, 11])         | qint16        | 0.0033570 |  0.6814387 |    1.3672816 |   120.0000000 |     35746.3636447 |      -60.0000000 |       -60.0000000 |       60.0000000 |        60.0000000 |        -2.4648068 |         -1.9964398 |
|  223 | head                                           | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 384, 11])         | qint16        | 0.0033570 |  1.0000038 |    0.0000000 |     0.0000000 |         0.0000000 |      -58.0691261 |       -58.0691261 |       60.0000000 |        60.0000000 |         2.4505773 |          2.4505773 |
|  224 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 128, 11])         | qint16        | 0.0033570 |  0.6179339 |    1.9171664 |   120.0000000 |     35746.3636447 |      -60.0000000 |       -60.0000000 |       60.0000000 |        60.0000000 |        -2.6708233 |         -2.0246677 |
|  225 | head.instance_bank.anchor_cat                  | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 11])         | qint16        | 0.0033570 |  0.9287475 |    0.4792916 |   120.0000000 |     35746.3636447 |      -60.0000000 |       -60.0000000 |       60.0000000 |        60.0000000 |         1.1702273 |          1.3317660 |
|  226 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 128, 256])        | qint8         | 1.0000000 |  0.7294278 |    0.1410537 |     5.0864196 |         5.0864196 |       -4.6688638 |        -3.6217675 |        3.2979729 |         3.0145388 |        -0.0016171 |         -0.0014354 |
|  227 | head.instance_bank.feature_cat                 | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7294277 |    0.0352634 |     5.0864196 |         5.0864196 |       -4.6688638 |        -3.6217675 |        3.2979729 |         3.0145388 |        -0.0004043 |         -0.0003588 |
|  228 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 256, 11])         | qint16        | 0.0033570 |  0.7196854 |    1.0923390 |   120.0000000 |     35746.3636447 |      -60.0000000 |       -60.0000000 |       60.0000000 |        60.0000000 |        -2.3617980 |         -1.9823254 |
|  229 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 256, 256])        | qint8         | 1.0000000 |  0.8045375 |    0.0683683 |     5.1147389 |         5.1147389 |       -4.7621317 |        -3.6930170 |        3.3936565 |         3.0826545 |        -0.0008566 |         -0.0008879 |
|  230 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 0.0033570 |  0.9471589 |    1.1326206 |   120.0000000 |     35746.3636447 |      -60.0000000 |       -60.0000000 |       60.0000000 |        60.0000000 |         3.7520018 |          4.2772317 |
|  231 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6752487 |    5.3245578 |    85.3954010 |        85.3954010 |      -85.3954010 |         0.0000000 |       95.2234268 |        82.0955124 |        -0.0019914 |          4.9145603 |
|  232 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9588079 |    0.3926367 |    75.3748474 |        75.3748474 |        0.0000000 |         0.0000000 |       95.2234268 |        82.0955124 |         4.9299307 |          4.9145603 |
|  233 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9880146 |    0.0206532 |     3.8550920 |         3.8550920 |       -0.9564527 |        -0.9636280 |        3.4984415 |         3.4984415 |         0.0108297 |          0.0109281 |
|  234 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6510335 |    0.5716587 |     4.6776214 |         4.6776214 |       -4.6776214 |         0.0000000 |        6.2191048 |         6.2191048 |        -0.2220723 |          0.3332709 |
|  235 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9874491 |    0.0153047 |     3.4745381 |         3.4745381 |        0.0000000 |         0.0000000 |        6.2191048 |         6.2191048 |         0.3342817 |          0.3332709 |
|  236 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9952868 |    0.0116900 |     3.0274765 |         3.0274765 |       -0.9968417 |        -0.9957706 |        6.3199091 |         6.3199096 |         0.0802947 |          0.0803769 |
|  237 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6381013 |    0.6606085 |     6.2585297 |         6.2585297 |       -6.2585297 |         0.0000000 |        5.9430876 |         5.8804641 |        -0.1073524 |          0.5448616 |
|  238 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9973606 |    0.0081159 |     3.5198281 |         3.5198281 |        0.0000000 |         0.0000000 |        5.9430876 |         5.8804641 |         0.5451403 |          0.5448616 |
|  239 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9972437 |    0.0086655 |     3.7879703 |         3.7879703 |       -0.8402692 |        -0.8402691 |        5.6774325 |         5.7180667 |         0.0266365 |          0.0266513 |
|  240 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6178544 |    0.8726757 |     6.7751975 |         6.7751975 |       -6.7751975 |         0.0000000 |        8.3544235 |         8.3544226 |        -0.3612316 |          0.5035938 |
|  241 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9987473 |    0.0070753 |     2.2195606 |         2.2195606 |        0.0000000 |         0.0000000 |        8.3544235 |         8.3544226 |         0.5043688 |          0.5035938 |
|  242 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9978697 |    0.0083687 |     2.6807127 |         2.6807127 |       -0.8682820 |        -0.8682820 |        7.4341941 |         7.4341946 |         0.0260230 |          0.0260663 |
|  243 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 0.0033570 |  0.9964873 |    0.0091426 |     2.3632681 |       703.9853518 |       -0.3981397 |         0.0000000 |        2.5814149 |         2.5828195 |         0.8552794 |          0.8597940 |
|  244 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3408780 |    0.4633699 |     2.7925727 |         2.7925727 |       -2.7925727 |         0.0000000 |        1.4924165 |         1.4791325 |        -0.3357655 |          0.1260221 |
|  245 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9968976 |    0.0013678 |     1.4577950 |         1.4577950 |        0.0000000 |         0.0000000 |        1.4924165 |         1.4791325 |         0.1262367 |          0.1260221 |
|  246 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9966773 |    0.0056610 |     4.4490724 |         4.4490724 |       -0.7236204 |        -0.7236204 |        3.9015512 |         3.8515596 |         0.0216827 |          0.0216969 |
|  247 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.5950272 |    0.2489327 |     2.5721762 |         2.5721762 |       -2.5721762 |         0.0000000 |        1.8647105 |         1.8647106 |        -0.0283336 |          0.2170663 |
|  248 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9948909 |    0.0031932 |     1.6812924 |         1.6812924 |        0.0000000 |         0.0000000 |        1.8647105 |         1.8647106 |         0.2174059 |          0.2170663 |
|  249 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9932814 |    0.0112896 |     4.0871119 |         4.0871119 |       -0.9243014 |        -0.9243014 |        3.4910169 |         3.4910169 |         0.0096238 |          0.0096528 |
|  250 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.6726562 |    0.3878947 |     3.1387539 |         3.1387539 |       -2.0991416 |         0.0000000 |        2.2952843 |         2.0579250 |        -0.0596322 |          0.3218071 |
|  251 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9940639 |    0.0061052 |     2.2952843 |         2.2952843 |        0.0000000 |         0.0000000 |        2.2952843 |         2.0579250 |         0.3221573 |          0.3218071 |
|  252 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9917926 |    0.0129797 |     4.2107410 |         4.2107410 |       -0.8183190 |        -0.8183190 |        3.5004420 |         3.4191990 |         0.0128184 |          0.0128564 |
|  253 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7418945 |    0.3142309 |     3.6902137 |         3.6902137 |       -2.5880327 |         0.0000000 |        2.9671061 |         2.9671056 |         0.1215524 |          0.4278892 |
|  254 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9939768 |    0.0084952 |     2.8502717 |         2.8502717 |        0.0000000 |         0.0000000 |        2.9671061 |         2.9671056 |         0.4272881 |          0.4278892 |
|  255 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9887185 |    0.0160253 |     4.5033283 |         4.5033283 |       -1.1608952 |        -1.1608952 |        3.9256110 |         3.9256105 |         0.0271653 |          0.0273368 |
|  256 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 2])          | qint16        | 0.0033570 |  0.9992602 |    0.0017045 |     0.4752618 |       141.5740116 |       -1.4737016 |        -1.0401944 |        1.0775599 |         1.0775599 |         0.3328766 |          0.3335277 |
|  257 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.6746050 |    0.2368124 |     1.3527542 |         1.3527542 |       -1.3527542 |         0.0000000 |        1.4309695 |         1.2433873 |        -0.0417091 |          0.1938918 |
|  258 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9994048 |    0.0008046 |     0.4077499 |         0.4077499 |        0.0000000 |         0.0000000 |        1.4309695 |         1.2433873 |         0.1942988 |          0.1938918 |
|  259 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9996691 |    0.0019397 |     0.9238408 |         0.9238408 |       -1.1665583 |        -1.1665583 |        2.9565046 |         2.9565048 |         0.0079888 |          0.0079779 |
|  260 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.5421312 |    0.6178598 |     3.4250467 |         3.4250467 |       -3.4250467 |         0.0000000 |        2.2103169 |         2.2103171 |        -0.1972965 |          0.4182642 |
|  261 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9990579 |    0.0018802 |     1.0568275 |         1.0568275 |        0.0000000 |         0.0000000 |        2.2103169 |         2.2103171 |         0.4186831 |          0.4182642 |
|  262 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9967271 |    0.0055035 |     2.6625001 |         2.6625001 |       -0.9829913 |        -0.9829912 |        3.7781157 |         3.9474814 |        -0.0025491 |         -0.0026777 |
|  263 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.5807083 |    0.3621697 |     3.4872284 |         3.4872284 |       -3.4872284 |         0.0000000 |        2.4959188 |         1.5926654 |        -0.0697694 |          0.2887949 |
|  264 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9940376 |    0.0032642 |     2.3482337 |         2.3482337 |        0.0000000 |         0.0000000 |        2.4959188 |         1.5926654 |         0.2891362 |          0.2887949 |
|  265 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9933380 |    0.0076429 |     3.9925592 |         3.9925592 |       -0.8186139 |        -0.8172662 |        3.6089237 |         2.9854064 |         0.0051749 |          0.0050811 |
|  266 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.6598777 |    0.3538444 |     4.8740706 |         4.8740706 |       -4.8740706 |         0.0000000 |        2.4997149 |         2.4966083 |        -0.1046057 |          0.2452000 |
|  267 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9948550 |    0.0036459 |     1.9453900 |         1.9453900 |        0.0000000 |         0.0000000 |        2.4997149 |         2.4966083 |         0.2455928 |          0.2452000 |
|  268 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9929635 |    0.0084959 |     4.1541677 |         4.1541677 |       -0.9122334 |        -0.8781685 |        3.8050859 |         3.8050854 |         0.0509497 |          0.0507453 |
|  269 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 0.0033570 |  0.3634686 |    0.6145029 |    56.2063751 |     16743.1127022 |      -46.8903351 |       -45.6340027 |       12.7195482 |        11.1609459 |        -0.5383656 |         -0.4762357 |
|  270 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.2547685 |    0.6714417 |    30.3511391 |        30.3511391 |      -28.2418194 |         0.0000000 |       25.6446609 |        26.5334263 |        -0.0380110 |          0.4108578 |
|  271 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.5207290 |    0.2185275 |    26.4134579 |        26.4134579 |        0.0000000 |         0.0000000 |       25.6446609 |        26.5334263 |         0.4149032 |          0.4108578 |
|  272 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9550469 |    0.0471471 |     4.2895160 |         4.2895160 |       -0.9795623 |        -0.9795624 |        3.4839587 |         3.2213397 |         0.0353742 |          0.0353079 |
|  273 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.6677377 |    0.7090391 |     3.7720475 |         3.7720475 |       -3.1788363 |         0.0000000 |        3.5685937 |         3.5685935 |        -0.0782871 |          0.6171962 |
|  274 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9957024 |    0.0135820 |     2.4611824 |         2.4611824 |        0.0000000 |         0.0000000 |        3.5685937 |         3.5685935 |         0.6171699 |          0.6171962 |
|  275 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9577805 |    0.0439299 |     4.8047657 |         4.8047657 |       -0.9141013 |        -0.8997287 |        4.1519217 |         4.1217709 |         0.0382299 |          0.0381798 |
|  276 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.6389436 |    0.6741496 |     5.0018334 |         5.0018334 |       -4.7428360 |         0.0000000 |        3.5904813 |         3.5349579 |        -0.1047535 |          0.5445386 |
|  277 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9840443 |    0.0247370 |     3.5387015 |         3.5387015 |        0.0000000 |         0.0000000 |        3.5904813 |         3.5349579 |         0.5446591 |          0.5445386 |
|  278 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9606240 |    0.0408909 |     5.5816870 |         5.5816870 |       -0.8207414 |        -0.8087579 |        5.5225973 |         4.9532070 |         0.0216362 |          0.0215897 |
|  279 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.5869448 |    0.6527246 |     6.0517836 |         6.0517836 |       -4.4516349 |         0.0000000 |        4.3155999 |         4.8672647 |        -0.3039817 |          0.3225344 |
|  280 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9665884 |    0.0253026 |     4.8672647 |         4.8672647 |        0.0000000 |         0.0000000 |        4.3155999 |         4.8672647 |         0.3234403 |          0.3225344 |
|  281 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9549683 |    0.0362368 |     6.1871166 |         6.1871166 |       -0.8051474 |        -0.7347668 |        5.2120757 |         5.7195721 |         0.0147601 |          0.0147036 |
|  282 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9862297 |    0.0163087 |     6.1871166 |         6.1871166 |       -1.1608952 |        -1.1608952 |        7.4341941 |         7.4341946 |         0.0264659 |          0.0264693 |
|  283 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 256, 3])          | qint16        | 0.0033570 |  0.7550372 |    2.6275597 |   120.0000000 |     35746.3636447 |      -60.0000000 |       -60.0000000 |       60.0000000 |        60.0000000 |        -7.1311426 |         -5.7641068 |
|  284 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.6911311 |    3.8083086 |    90.4583740 |        90.4583740 |      -90.4583740 |         0.0000000 |       99.2346878 |        89.3930664 |         0.9830660 |          3.8557763 |
|  285 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.9197150 |    0.9183975 |    74.6290970 |        74.6290970 |        0.0000000 |         0.0000000 |       99.2346878 |        89.3930664 |         3.8729773 |          3.8557763 |
|  286 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.9757586 |    0.0421201 |     3.5287254 |         3.5287254 |       -0.9604729 |        -0.9465912 |        3.1473343 |         3.1539621 |         0.0138156 |          0.0140099 |
|  287 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.7298396 |    0.8199799 |     4.6635852 |         4.6635852 |       -4.6635852 |         0.0000000 |        6.2191048 |         6.2191048 |         0.0041450 |          0.7881250 |
|  288 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.9921104 |    0.0308872 |     3.2328286 |         3.2328286 |        0.0000000 |         0.0000000 |        6.2191048 |         6.2191048 |         0.7932377 |          0.7881250 |
|  289 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.9910871 |    0.0236232 |     2.5140920 |         2.5140920 |       -0.9990538 |        -0.9990538 |        5.8657508 |         5.8657513 |         0.0759168 |          0.0761456 |
|  290 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.6756374 |    0.8882285 |     5.7557969 |         5.7557969 |       -5.7557969 |         0.0000000 |        5.9387484 |         5.8816814 |        -0.0888079 |          0.7826438 |
|  291 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.9973430 |    0.0159272 |     3.0758429 |         3.0758429 |        0.0000000 |         0.0000000 |        5.9387484 |         5.8816814 |         0.7834935 |          0.7826438 |
|  292 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.9947503 |    0.0168523 |     3.2909679 |         3.2909679 |       -0.7632550 |        -0.7632550 |        5.6850295 |         5.6877604 |         0.0253562 |          0.0253854 |
|  293 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.6122185 |    0.9273341 |     6.7751975 |         6.7751975 |       -6.7751975 |         0.0000000 |        4.8005762 |         4.8005767 |        -0.2715187 |          0.6401896 |
|  294 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.9981882 |    0.0139551 |     2.0043464 |         2.0043464 |        0.0000000 |         0.0000000 |        4.8005762 |         4.8005767 |         0.6418604 |          0.6401896 |
|  295 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 128])        | qint8         | 1.0000000 |  0.9961102 |    0.0163968 |     2.4087794 |         2.4087794 |       -0.7705165 |        -0.7801436 |        5.2373042 |         5.2219696 |         0.0316392 |          0.0317420 |
|  296 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 256, 3])          | qint16        | 0.0033570 |  0.9995733 |    0.0033817 |     0.1810752 |        53.9398463 |        0.0000000 |         0.0000000 |        1.6749278 |         1.6268276 |         0.1373676 |          0.1379064 |
|  297 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.4844211 |    0.2632743 |     1.8852482 |         1.8852482 |       -1.8852482 |         0.0000000 |        0.8708311 |         0.8309531 |        -0.1456299 |          0.1170797 |
|  298 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.9999311 |    0.0006391 |     0.1117293 |         0.1117293 |        0.0000000 |         0.0000000 |        0.8708311 |         0.8309531 |         0.1170053 |          0.1170797 |
|  299 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.9998849 |    0.0029567 |     0.4896823 |         0.4896823 |       -0.7236204 |        -0.7236204 |        3.1520441 |         3.0460107 |         0.0190500 |          0.0190508 |
|  300 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.6007088 |    0.3534976 |     2.5721762 |         2.5721762 |       -2.5721762 |         0.0000000 |        1.8647105 |         1.8647106 |        -0.0585850 |          0.2927670 |
|  301 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.9998447 |    0.0021660 |     0.2520819 |         0.2520819 |        0.0000000 |         0.0000000 |        1.8647105 |         1.8647106 |         0.2927466 |          0.2927670 |
|  302 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.9992159 |    0.0092508 |     1.1030000 |         1.1030000 |       -0.8961873 |        -0.8967292 |        3.4910169 |         3.4910169 |         0.0037595 |          0.0037676 |
|  303 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.8003339 |    0.2575790 |     1.8526061 |         1.8526061 |       -1.8526061 |         0.0000000 |        2.0579250 |         2.0579250 |         0.1499860 |          0.4022495 |
|  304 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.9992590 |    0.0051902 |     0.5979420 |         0.5979420 |        0.0000000 |         0.0000000 |        2.0579250 |         2.0579250 |         0.4023748 |          0.4022495 |
|  305 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.9984977 |    0.0117670 |     1.2670696 |         1.2670696 |       -0.8183190 |        -0.8183190 |        3.3674543 |         3.3828375 |         0.0073454 |          0.0073841 |
|  306 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.8247402 |    0.2044377 |     2.5824406 |         2.5824406 |       -2.5824406 |         0.0000000 |        2.9209232 |         2.8943639 |         0.1741895 |          0.3701067 |
|  307 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.9983909 |    0.0083237 |     0.9023107 |         0.9023107 |        0.0000000 |         0.0000000 |        2.9209232 |         2.8943639 |         0.3703036 |          0.3701067 |
|  308 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.9981325 |    0.0154763 |     1.8196689 |         1.8196689 |       -0.9346060 |        -0.9126347 |        3.6054032 |         3.5731652 |         0.0214974 |          0.0215435 |
|  309 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 256, 2])          | qint16        | 0.0033570 |  0.9999973 |    0.0001211 |     0.0191988 |         5.7190478 |       -1.0031525 |        -1.0010805 |        0.0210246 |         0.0177814 |        -0.0768302 |         -0.0768060 |
|  310 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.7362415 |    0.1331898 |     1.1527691 |         1.1527691 |       -1.1527691 |         0.0000000 |        1.1181885 |         1.1177886 |         0.0427469 |          0.1758647 |
|  311 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  1.0000114 |    0.0000594 |     0.0219637 |         0.0219637 |        0.0000000 |         0.0000000 |        1.1181885 |         1.1177886 |         0.1758774 |          0.1758647 |
|  312 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  1.0000123 |    0.0002046 |     0.0800673 |         0.0800673 |       -1.1665583 |        -1.1665583 |        2.8190818 |         2.8136566 |        -0.0008952 |         -0.0008957 |
|  313 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.4887441 |    0.7028726 |     3.4250467 |         3.4250467 |       -3.4250467 |         0.0000000 |        1.7952765 |         1.7952765 |        -0.3299037 |          0.3727604 |
|  314 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.9999660 |    0.0001931 |     0.1483561 |         0.1483561 |        0.0000000 |         0.0000000 |        1.7952765 |         1.7952765 |         0.3727758 |          0.3727604 |
|  315 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  1.0000086 |    0.0008043 |     0.6496949 |         0.6496949 |       -0.8438867 |        -0.8438867 |        3.7653768 |         3.7725699 |        -0.0004447 |         -0.0004559 |
|  316 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.5210847 |    0.4890494 |     3.4861166 |         3.4861166 |       -3.4861166 |         0.0000000 |        1.5138774 |         1.5138772 |        -0.1664152 |          0.3219565 |
|  317 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  1.0000020 |    0.0006795 |     0.2954585 |         0.2954585 |        0.0000000 |         0.0000000 |        1.5138774 |         1.5138772 |         0.3219546 |          0.3219565 |
|  318 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.9998299 |    0.0021664 |     0.8968377 |         0.8968377 |       -0.8143013 |        -0.8165641 |        2.9223914 |         2.9164608 |         0.0087050 |          0.0086884 |
|  319 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.4652149 |    0.4988323 |     4.8485236 |         4.8485236 |       -4.8485236 |         0.0000000 |        2.4990561 |         2.4968674 |        -0.2525839 |          0.2449179 |
|  320 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.9996670 |    0.0012128 |     0.9767971 |         0.9767971 |        0.0000000 |         0.0000000 |        2.4990561 |         2.4968674 |         0.2450357 |          0.2449179 |
|  321 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 32])         | qint8         | 1.0000000 |  0.9997751 |    0.0028362 |     2.2084751 |         2.2084751 |       -0.8781686 |        -0.8781685 |        3.6783609 |         3.6594253 |         0.0837314 |          0.0836776 |
|  322 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 256, 3])          | qint16        | 0.0033570 |  0.5210171 |    1.3742214 |    57.0960274 |     17008.1279766 |      -46.8948555 |       -45.5831871 |       13.6264248 |        10.2470255 |        -1.6149313 |         -1.5911230 |
|  323 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.3523083 |    1.3207480 |    33.5563583 |        33.5563583 |      -31.2906361 |         0.0000000 |       28.3400249 |        26.4226322 |        -0.0783281 |          0.7778533 |
|  324 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.6301993 |    0.4839719 |    28.3400249 |        28.3400249 |        0.0000000 |         0.0000000 |       28.3400249 |        26.4226322 |         0.7584478 |          0.7778533 |
|  325 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.9040691 |    0.0994489 |     4.0249887 |         4.0249887 |       -0.9263648 |        -0.9263648 |        3.2804203 |         3.2103815 |         0.0339559 |          0.0336675 |
|  326 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.6597624 |    0.6935688 |     3.1359468 |         3.1359468 |       -3.0925813 |         0.0000000 |        3.2098315 |         3.2098312 |        -0.0868527 |          0.5790026 |
|  327 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.9914855 |    0.0268840 |     1.8521819 |         1.8521819 |        0.0000000 |         0.0000000 |        3.2098315 |         3.2098312 |         0.5798321 |          0.5790026 |
|  328 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.9062284 |    0.0948534 |     4.8143191 |         4.8143191 |       -0.9065416 |        -0.8563132 |        4.1072788 |         4.0792074 |         0.0379423 |          0.0381266 |
|  329 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.6166244 |    0.6869309 |     4.9210701 |         4.9210701 |       -4.5759506 |         0.0000000 |        3.5576644 |         3.5293512 |        -0.1073263 |          0.5263456 |
|  330 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.9632475 |    0.0531599 |     3.5319610 |         3.5319610 |        0.0000000 |         0.0000000 |        3.5576644 |         3.5293512 |         0.5264447 |          0.5263456 |
|  331 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.9115872 |    0.0896068 |     5.5310297 |         5.5310297 |       -0.8792642 |        -0.8464630 |        5.4802580 |         4.9394164 |         0.0215724 |          0.0214489 |
|  332 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.5585136 |    0.6741419 |     6.0766106 |         6.0766106 |       -4.1192098 |         0.0000000 |        4.0692005 |         4.2264500 |        -0.2904617 |          0.3254336 |
|  333 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.9276104 |    0.0569521 |     4.2264500 |         4.2264500 |        0.0000000 |         0.0000000 |        4.0692005 |         4.2264500 |         0.3267281 |          0.3254336 |
|  334 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 256, 64])         | qint8         | 1.0000000 |  0.9004644 |    0.0825317 |     5.8518438 |         5.8518438 |       -0.7771395 |        -0.7330951 |        5.1864176 |         5.2006860 |         0.0160646 |          0.0160962 |
|  335 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 256])        | qint8         | 1.0000000 |  0.9767810 |    0.0311204 |     5.8518438 |         5.8518438 |       -0.9346060 |        -0.9126347 |        5.2373042 |         5.2219696 |         0.0329894 |          0.0330477 |
|  336 | head                                           | torch.Tensor.unbind                                                           | torch.Tensor.unbind                                                     | torch.Size([3, 256, 704])         | torch.float32 |           |  1.0006990 |    0.0000000 |     0.0000000 |                   |       -0.5625000 |        -0.5625000 |        0.7578125 |         0.7578125 |        -0.0576011 |         -0.0576011 |
|  337 | head                                           | torch.Tensor.double                                                           | torch.Tensor.double                                                     | torch.Size([156, 4, 4])           | torch.float64 |           |  1.0000005 |    0.0000000 |     0.0000000 |                   |    -2119.9514160 |     -2119.9514160 |     2768.2631836 |      2768.2631836 |        22.1685505 |         22.1685505 |
|  338 | head                                           | torch.matmul                                                                  | torch.matmul                                                            | torch.Size([156, 4, 4])           | torch.float64 |           |  0.9999996 |    0.0000000 |     0.0000000 |                   |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
|  339 | head                                           | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 4, 4])         | torch.float64 |           |  0.9999996 |    0.0000000 |     0.0000000 |                   |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
|  340 | head                                           | torch.Tensor.float                                                            | torch.Tensor.float                                                      | torch.Size([26, 6, 4, 4])         | torch.float32 |           |  0.9999996 |    0.0000000 |     0.0000000 |                   |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
|  341 | head.mat_quant_stub                            | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 6, 4, 4])         | qint16        | 1.0000000 |  0.9999996 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
|  342 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 256, 512])        | qint16        | 0.0001526 |  0.4880881 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
|  343 | head.layers.0.query_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.9705850 |    0.0257860 |     6.1871166 |     13515.7562659 |       -4.6688638 |        -3.6217675 |        7.4341941 |         7.4341946 |         0.0130308 |          0.0130552 |
|  344 | head.layers.0.key_cat                          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 512])        | qint16        | 0.0001786 |  0.9560770 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
|  345 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.9706463 |    0.0257860 |     6.1871166 |     13515.7562659 |       -4.6688638 |        -3.6217675 |        7.4341941 |         7.4341946 |         0.0130308 |          0.0130552 |
|  346 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.9560735 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
|  347 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.4880882 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
|  348 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.9706463 |    0.0257860 |     6.1871166 |     13515.7562659 |       -4.6688638 |        -3.6217675 |        7.4341941 |         7.4341946 |         0.0130308 |          0.0130552 |
|  349 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.9560735 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
|  350 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.4880882 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
|  351 | head.layers.0.attn.q_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9573963 |    0.0886089 |    15.0376673 |        15.0376673 |      -10.7254686 |       -10.1787214 |       12.5000172 |        12.4861898 |        -0.0119804 |         -0.0113162 |
|  352 | head.layers.0.attn.k_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.9661475 |    0.1460261 |     8.2902956 |         8.2902956 |       -8.8472748 |        -8.8472738 |        8.3213158 |         8.3213148 |        -0.0454942 |         -0.0488266 |
|  353 | head.layers.0.attn.v_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.4584252 |    0.0207956 |     1.3661891 |         1.3661891 |       -1.2426922 |        -0.7796572 |        1.4133743 |         0.9917387 |         0.0005401 |         -0.0002898 |
|  354 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.9573963 |    0.0886089 |    15.0376673 |        15.0376673 |      -10.7254686 |       -10.1787214 |       12.5000172 |        12.4861898 |        -0.0119804 |         -0.0113162 |
|  355 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.9573404 |    0.0886089 |    15.0376673 |        15.0376673 |      -10.7254686 |       -10.1787214 |       12.5000172 |        12.4861898 |        -0.0119804 |         -0.0113162 |
|  356 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  357 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  358 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 208, 64])        | qint8         | 1.0000000 |  0.0170839 |    1.6137675 |     8.8442860 |         8.8442860 |       -1.2426922 |        -8.8472738 |        1.4133743 |         8.3213148 |         0.0005401 |         -0.0488266 |
|  359 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 256, 64])        | qint8         | 1.0000000 |  0.0170839 |    1.6137675 |     8.8442860 |         8.8442860 |       -1.2426922 |        -8.8472738 |        1.4133743 |         8.3213148 |         0.0005401 |         -0.0488266 |
|  360 | head.layers.0.attn                             | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.9573404 |    0.0110761 |     1.8797084 |        15.0376673 |       -1.3406836 |        -1.2723402 |        1.5625021 |         1.5607737 |        -0.0014975 |         -0.0014145 |
|  361 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  362 | head.layers.0.attn.matmul                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.9128351 |    1.1594319 |    81.8950195 |        81.8950195 |      -74.2665100 |       -62.2857971 |       81.6839905 |        70.3851089 |         2.8816738 |          2.9050593 |
|  363 | head.layers.0.attn.softmax                     | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.3378074 |    0.0007460 |     0.9968415 |         0.9968415 |        0.0000000 |         0.0000000 |        0.9972149 |         0.0312500 |         0.0039062 |          0.0035822 |
|  364 | head.layers.0.attn.attention_drop              | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.3378074 |    0.0007460 |     0.9968415 |         0.9968415 |        0.0000000 |         0.0000000 |        0.9972149 |         0.0312500 |         0.0039062 |          0.0035822 |
|  365 | head.layers.0.attn.attn_matmul                 | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.4859112 |    0.0188431 |     1.1504283 |         1.1504283 |       -0.9807625 |        -0.6218082 |        1.2657390 |         0.4991706 |         0.0007277 |          0.0000377 |
|  366 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  367 | head.layers.0.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  368 | head.layers.0.attn.out_proj                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.8195761 |    0.0316194 |     1.3577690 |         1.3577690 |       -1.3993481 |        -0.7508144 |        1.4694084 |         0.7515438 |         0.0317873 |          0.0269390 |
|  369 | head.layers.0.attn                             | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 256])     | qint8         | 1.0000000 |  0.3378074 |    0.0007460 |     0.9968415 |         0.9968415 |        0.0000000 |         0.0000000 |        0.9972149 |         0.0312500 |         0.0039062 |          0.0035822 |
|  370 | head.layers.0.attn.attn_weights_mean           | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5557061 |    0.0005829 |     0.3673103 |         0.3673103 |        0.0000001 |         0.0000015 |        0.3776586 |         0.0202796 |         0.0039062 |          0.0035822 |
|  371 | head.layers.0.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  372 | head.layers.0.dropout                          | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.8196130 |    0.0316194 |     1.3577690 |         1.3577690 |       -1.3993481 |        -0.7508144 |        1.4694084 |         0.7515438 |         0.0317873 |          0.0269390 |
|  373 | head.layers.0.add                              | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.9578370 |    0.0532878 |     6.1871166 |         6.1871166 |       -4.4804893 |        -3.6870725 |        7.5674262 |         7.6152420 |         0.0448182 |          0.0399942 |
|  374 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.9332867 |    0.0843816 |     6.5506282 |      4292.9541822 |       -9.4988213 |        -8.9909668 |       11.7926273 |        11.0750303 |        -0.0055782 |         -0.0060602 |
|  375 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 512])        | qint16        | 0.0001526 |  0.9235080 |    0.0219737 |     2.7184289 |     17815.2234738 |       -2.9217002 |        -2.7372544 |        2.7272012 |         2.7272007 |        -0.0007934 |          0.0000701 |
|  376 | head.layers.1.query_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.9634972 |    0.0503451 |     6.5506282 |     14309.8472741 |       -9.4988213 |        -8.9909668 |       11.7926273 |        11.0750303 |         0.0104439 |          0.0102046 |
|  377 | head.layers.1.key_cat                          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0002719 |  0.9634972 |    0.0503451 |     6.5506282 |     24090.8936344 |       -9.4988213 |        -8.9909668 |       11.7926273 |        11.0750303 |         0.0104439 |          0.0102046 |
|  378 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.9636245 |    0.0503451 |     6.5506282 |     14309.8472741 |       -9.4988213 |        -8.9909668 |       11.7926273 |        11.0750303 |         0.0104439 |          0.0102046 |
|  379 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0002719 |  0.9636245 |    0.0503451 |     6.5506282 |     24090.8936344 |       -9.4988213 |        -8.9909668 |       11.7926273 |        11.0750303 |         0.0104439 |          0.0102046 |
|  380 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.9235489 |    0.0219737 |     2.7184289 |     17815.2234738 |       -2.9217002 |        -2.7372544 |        2.7272012 |         2.7272007 |        -0.0007934 |          0.0000701 |
|  381 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.9636245 |    0.0503451 |     6.5506282 |     14309.8472741 |       -9.4988213 |        -8.9909668 |       11.7926273 |        11.0750303 |         0.0104439 |          0.0102046 |
|  382 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0002719 |  0.9636245 |    0.0503451 |     6.5506282 |     24090.8936344 |       -9.4988213 |        -8.9909668 |       11.7926273 |        11.0750303 |         0.0104439 |          0.0102046 |
|  383 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.9235489 |    0.0219737 |     2.7184289 |     17815.2234738 |       -2.9217002 |        -2.7372544 |        2.7272012 |         2.7272007 |        -0.0007934 |          0.0000701 |
|  384 | head.layers.1.attn.q_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9762602 |    0.1031476 |    11.0896740 |        11.0896740 |      -11.8151674 |       -11.8330431 |        9.3400583 |         8.9275665 |         0.0029727 |          0.0034952 |
|  385 | head.layers.1.attn.k_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9837078 |    0.0824467 |     6.7914200 |         6.7914200 |      -10.8036757 |       -10.7936611 |        9.0473127 |         8.0183516 |         0.0862893 |          0.0850909 |
|  386 | head.layers.1.attn.v_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9362864 |    0.0217736 |     1.6451427 |         1.6451427 |       -2.2660267 |        -2.2760546 |        3.2513134 |         3.2596700 |        -0.0015872 |         -0.0020604 |
|  387 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.9762602 |    0.1031476 |    11.0896740 |        11.0896740 |      -11.8151674 |       -11.8330431 |        9.3400583 |         8.9275665 |         0.0029727 |          0.0034952 |
|  388 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.9762316 |    0.1031476 |    11.0896740 |        11.0896740 |      -11.8151674 |       -11.8330431 |        9.3400583 |         8.9275665 |         0.0029727 |          0.0034952 |
|  389 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.1181333 |    1.5623894 |    16.1577148 |        16.1577148 |      -10.8036757 |       -11.8330431 |        9.0473127 |         8.9275665 |         0.0862893 |          0.0034952 |
|  390 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.1181357 |    1.5623894 |    16.1577148 |        16.1577148 |      -10.8036757 |       -11.8330431 |        9.0473127 |         8.9275665 |         0.0862893 |          0.0034952 |
|  391 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 | -0.0115077 |    1.1979215 |    10.9358635 |        10.9358635 |       -2.2660267 |       -10.7936611 |        3.2513134 |         8.0183516 |        -0.0015872 |          0.0850909 |
|  392 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 | -0.0115075 |    1.1979215 |    10.9358635 |        10.9358635 |       -2.2660267 |       -10.7936611 |        3.2513134 |         8.0183516 |        -0.0015872 |          0.0850909 |
|  393 | head.layers.1.attn                             | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.9762316 |    0.0128934 |     1.3862092 |        11.0896740 |       -1.4768959 |        -1.4791304 |        1.1675073 |         1.1159458 |         0.0003716 |          0.0004369 |
|  394 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  395 | head.layers.1.attn.matmul                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.9809060 |    0.5188645 |    43.3367996 |        43.3367996 |      -28.6396580 |       -30.0778618 |       43.6046295 |        38.4882812 |         0.1181128 |          0.0125563 |
|  396 | head.layers.1.attn.softmax                     | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.6541999 |    0.0010802 |     0.7912492 |         0.7912492 |        0.0000000 |         0.0000000 |        0.8224992 |         0.0312500 |         0.0019531 |          0.0010386 |
|  397 | head.layers.1.attn.attention_drop              | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.6541999 |    0.0010802 |     0.7912492 |         0.7912492 |        0.0000000 |         0.0000000 |        0.8224992 |         0.0312500 |         0.0019531 |          0.0010386 |
|  398 | head.layers.1.attn.attn_matmul                 | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.7876620 |    0.0574071 |     3.0335746 |         3.0335746 |       -1.5795343 |        -1.4824377 |        2.5954871 |         2.7972846 |        -0.0035035 |         -0.0072414 |
|  399 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  400 | head.layers.1.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  401 | head.layers.1.attn.out_proj                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.8320386 |    0.1142471 |     1.8036324 |         1.8036324 |       -1.8764789 |        -1.4573822 |        1.6173148 |         1.1917670 |        -0.0024889 |         -0.0045059 |
|  402 | head.layers.1.attn                             | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 512])     | qint8         | 1.0000000 |  0.6541999 |    0.0010802 |     0.7912492 |         0.7912492 |        0.0000000 |         0.0000000 |        0.8224992 |         0.0312500 |         0.0019531 |          0.0010386 |
|  403 | head.layers.1.attn.attn_weights_mean           | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7422159 |    0.0010079 |     0.1491858 |         0.1491858 |        0.0000001 |         0.0000000 |        0.1492765 |         0.0156065 |         0.0019531 |          0.0010386 |
|  404 | head.layers.1.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  405 | head.layers.1.dropout                          | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.8320078 |    0.1142471 |     1.8036324 |         1.8036324 |       -1.8764789 |        -1.4573822 |        1.6173148 |         1.1917670 |        -0.0024889 |         -0.0045059 |
|  406 | head.layers.1.add                              | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.9552199 |    0.1475713 |     6.1595340 |         6.1595340 |       -9.8107061 |        -9.0309029 |       11.5833797 |        11.2033987 |         0.0079550 |          0.0056987 |
|  407 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.9892223 |    0.2347843 |    12.4214039 |      8140.3670379 |      -27.8301163 |       -27.6242905 |       25.7133026 |        25.4957027 |        -0.0262194 |         -0.0213153 |
|  408 | head.layers.2                                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9822513 |    0.0797844 |     5.3369217 |         5.3369217 |       -6.8867168 |        -6.8401561 |        6.0398593 |         6.0635576 |         0.0016712 |          0.0016562 |
|  409 | head.layers.3.kps_generator.offset             | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 24])         | qint16        | 1.0000000 |  0.9657794 |    0.1165690 |     7.1076565 |         7.1076565 |       -6.2773666 |        -6.1698513 |        4.8905149 |         4.9908652 |        -0.4656415 |         -0.4826177 |
|  410 | head.layers.3.kps_generator                    | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.9657794 |    0.1165690 |     7.1076565 |         7.1076565 |       -6.2773666 |        -6.1698513 |        4.8905149 |         4.9908652 |        -0.4656415 |         -0.4826177 |
|  411 | head.layers.3.kps_generator                    | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 3])       | qint16        | 0.0033570 |  0.9471589 |    1.1326206 |   120.0000000 |     35746.3636447 |      -60.0000000 |       -60.0000000 |       60.0000000 |        60.0000000 |         3.7520018 |          4.2772317 |
|  412 | head.layers.3.kps_generator.keypoints_add      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.9439847 |    1.2298825 |   126.7126770 |       126.7126770 |      -65.5407867 |       -65.2177963 |       63.7471046 |        63.3697128 |         3.2863598 |          3.7946143 |
|  413 | head.layers.3.weight_add                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9829261 |    0.0887248 |     7.2468548 |         7.2468548 |       -6.6051917 |        -6.5574422 |        7.7817221 |         7.7505913 |         0.0281371 |          0.0281254 |
|  414 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 3, 4])         | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3522723 |          0.3522723 |
|  415 | head.layers.3                                  | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 6, 12])           | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3522723 |          0.3522723 |
|  416 | head.layers.3.camera_encoder.0                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.6545496 |    0.4474951 |     5.1108408 |         5.1108408 |       -5.1108408 |         0.0000000 |        6.5803881 |         6.5803881 |        -0.1209187 |          0.3265764 |
|  417 | head.layers.3.camera_encoder.1                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.9999998 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0000000 |         0.0000000 |        6.5803881 |         6.5803881 |         0.3265764 |          0.3265764 |
|  418 | head.layers.3.camera_encoder.2                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000007 |         0.0000007 |       -0.7927587 |        -0.7927587 |        4.0335946 |         4.0335946 |         0.0087450 |          0.0087450 |
|  419 | head.layers.3.camera_encoder.3                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.9094028 |    1.0674332 |     9.4951468 |         9.4951468 |       -9.4951468 |         0.0000000 |       27.9104614 |        27.9104614 |         0.1394914 |          1.2069248 |
|  420 | head.layers.3.camera_encoder.4                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000038 |         0.0000038 |        0.0000000 |         0.0000000 |       27.9104614 |        27.9104614 |         1.2069247 |          1.2069248 |
|  421 | head.layers.3.camera_encoder.5                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000013 |    0.0000000 |     0.0000014 |         0.0000014 |       -0.9630507 |        -0.9630507 |        7.4379067 |         7.4379067 |         0.0216157 |          0.0216157 |
|  422 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 256])     | qint8         | 1.0000000 |  0.9829261 |    0.0887248 |     7.2468548 |         7.2468548 |       -6.6051917 |        -6.5574422 |        7.7817221 |         7.7505913 |         0.0281371 |          0.0281254 |
|  423 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 6, 256])       | qint8         | 1.0000000 |  1.0000013 |    0.0000000 |     0.0000014 |         0.0000014 |       -0.9630507 |        -0.9630507 |        7.4379067 |         7.4379067 |         0.0216157 |          0.0216157 |
|  424 | head.layers.3.cam_add                          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 6, 256])     | qint8         | 1.0000000 |  0.9839371 |    0.0887248 |     7.2468548 |         7.2468548 |       -4.2570229 |        -3.9187441 |       11.8472252 |        12.4701557 |         0.0497528 |          0.0497411 |
|  425 | head.layers.3.weights_fc                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 6, 64])      | qint8         | 1.0000000 |  0.9918643 |    0.0964628 |     8.1140127 |         8.1140127 |       -9.7471733 |        -9.4304638 |        8.4070711 |         8.7059288 |        -0.2733597 |         -0.2722460 |
|  426 | head.layers.3                                  | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.9918643 |    0.0964628 |     8.1140127 |         8.1140127 |       -9.7471733 |        -9.4304638 |        8.4070711 |         8.7059288 |        -0.2733597 |         -0.2722460 |
|  427 | head.layers.3.weight_softmax                   | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.8796669 |    0.0171823 |     0.9198031 |         0.9198031 |        0.0000004 |         0.0000001 |        0.9510531 |         0.0312500 |         0.0208333 |          0.0036638 |
|  428 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  0.9342651 |    2.5082586 |   126.7126770 |       126.7126770 |      -65.5407867 |       -65.2177963 |       63.7471046 |        63.3697128 |         6.4062395 |          7.7390103 |
|  429 | head.layers.3                                  | torch.ones_like                                                               | torch.ones_like                                                         | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
|  430 | head.layers.3.point_quant_stub                 | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
|  431 | head.layers.3.point_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 8, 4])       | qint16        | 1.0000000 |  0.9440026 |    0.9224119 |   126.7126770 |       126.7126770 |      -65.5407867 |       -65.2177963 |       63.7471046 |        63.3697128 |         2.7147698 |          3.0959609 |
|  432 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 1, 1, 4, 4])   | qint16        | 1.0000000 |  0.9999996 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
|  433 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 512, 8, 1, 4]) | qint16        | 1.0000000 |  0.9440026 |    0.9224119 |   126.7126770 |       126.7126770 |      -65.5407867 |       -65.2177963 |       63.7471046 |        63.3697128 |         2.7147698 |          3.0959609 |
|  434 | head.layers.3.point_matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 4, 4]) | qint16        | 1.0000000 |  0.9670749 |    0.4104742 |   239.5508728 |       239.5508728 |     -180.1776428 |      -179.6567841 |      606.9921875 |       583.2266235 |         1.1784286 |          1.1851066 |
|  435 | head.layers.3.point_sum                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 6, 512, 8, 4])    | qint16        | 0.0244144 |  0.9674411 |    1.3790113 |   268.7078552 |     11006.1058102 |     -182.2431488 |      -181.7289581 |      654.6254883 |       653.0628052 |         4.7137146 |          4.7404265 |
|  436 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.9365268 |    1.7551959 |   126.4968719 |      5181.2328157 |      -71.4127197 |       -71.0981064 |       77.9507141 |        77.9854126 |         1.3122821 |          1.3150772 |
|  437 | head.layers.3                                  | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.9594836 |    0.9583024 |    76.6138458 |      3138.0552421 |        0.0100000 |         0.0100000 |       77.9507141 |        77.9854126 |        11.6852636 |         11.6398125 |
|  438 | head.layers.3.reciprocal_op                    | horizon_plugin_pytorch.nn.reciprocal.Reciprocal                               | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.9500440 |   41.2797012 |    99.9833221 |     32762.0350910 |        0.0128286 |         0.0128229 |      100.0000000 |         1.2799804 |        41.9189339 |          0.6512136 |
|  439 | head.layers.3                                  | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.9500440 |   41.2797012 |    99.9833221 |     32762.0350910 |        0.0128286 |         0.0128229 |      100.0000000 |         1.2799804 |        41.9189339 |          0.6512136 |
|  440 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0244144 |  0.9727643 |    1.8804247 |   268.7078552 |     11006.1058102 |     -182.2431488 |      -181.7289581 |      654.6254883 |       653.0628052 |         8.2712889 |          8.3233147 |
|  441 | head.layers.3.point_mul                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.9003043 | 1516.4719238 | 64773.9492188 | 212248038.1519569 |   -18224.3144531 |      -232.6095123 |    65462.5468750 |       835.9075928 |      1104.8475342 |         13.7234764 |
|  442 | head.layers.3                                  | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.5824059 |  172.2905426 |   603.0393066 |   1976009.0484947 |     -500.0000000 |      -232.6095123 |      500.0000000 |       500.0000000 |        98.8541946 |         13.2191381 |
|  443 | head.layers.3                                  | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([156, 512, 8, 2])      | qint16        | 0.0003052 |  0.5824059 |  172.2905426 |   603.0393066 |   1976009.0484947 |     -500.0000000 |      -232.6095123 |      500.0000000 |       500.0000000 |        98.8541946 |         13.2191381 |
|  444 | head.layers.3                                  | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer            | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer      | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.9401089 |    0.1562481 |    26.3993797 |        26.3993797 |      -58.8512230 |       -60.1261597 |       51.1228828 |        54.5745811 |         0.0122911 |          0.0121862 |
|  445 | head.layers.3.feat_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.9401089 |    0.1562481 |    26.3993797 |        26.3993797 |      -58.8512230 |       -60.1261597 |       51.1228828 |        54.5745811 |         0.0122911 |          0.0121862 |
|  446 | head.layers.3                                  | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 256, 512, 8])  | qint8         | 1.0000000 |  0.9401089 |    0.1562481 |    26.3993797 |        26.3993797 |      -58.8512230 |       -60.1261597 |       51.1228828 |        54.5745811 |         0.0122911 |          0.0121862 |
|  447 | head.layers.3                                  | torch.Tensor.permute                                                          | torch.Tensor.permute                                                    | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.9397404 |    0.1562481 |    26.3993797 |        26.3993797 |      -58.8512230 |       -60.1261597 |       51.1228828 |        54.5745811 |         0.0122911 |          0.0121862 |
|  448 | head.layers.3                                  | torch.Tensor.contiguous                                                       | torch.Tensor.contiguous                                                 | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.9397404 |    0.1562481 |    26.3993797 |        26.3993797 |      -58.8512230 |       -60.1261597 |       51.1228828 |        54.5745811 |         0.0122911 |          0.0121862 |
|  449 | head.layers.3                                  | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.9397404 |    0.1562481 |    26.3993797 |        26.3993797 |      -58.8512230 |       -60.1261597 |       51.1228828 |        54.5745811 |         0.0122911 |          0.0121862 |
|  450 | head.layers.3                                  | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 48, 8, 1])   | qint8         | 1.0000000 |  0.8796669 |    0.0171823 |     0.9198031 |         0.9198031 |        0.0000004 |         0.0000001 |        0.9510531 |         0.0312500 |         0.0208333 |          0.0036638 |
|  451 | head.layers.3                                  | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.9397404 |    0.1562481 |    26.3993797 |        26.3993797 |      -58.8512230 |       -60.1261597 |       51.1228828 |        54.5745811 |         0.0122911 |          0.0121862 |
|  452 | head.layers.3.feat_mul                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.7282604 |    0.0134455 |    11.2700348 |        11.2700348 |      -11.3507252 |        -0.6010366 |       10.4234571 |         0.6549326 |        -0.0000508 |         -0.0000074 |
|  453 | head.layers.3                                  | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.7282604 |    0.0134455 |    11.2700348 |        11.2700348 |      -11.3507252 |        -0.6010366 |       10.4234571 |         0.6549326 |        -0.0000508 |         -0.0000074 |
|  454 | head.layers.3.feat_sum                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8148810 |    0.4868473 |    11.5639229 |        11.5639229 |      -11.9421082 |        -1.5200412 |       10.8336067 |         1.5494598 |        -0.0024389 |         -0.0003535 |
|  455 | head.layers.3.output_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8045335 |    0.6278993 |    12.2120943 |        12.2120943 |      -12.8853130 |        -1.5582557 |       12.0169115 |         1.5636065 |        -0.0059419 |         -0.0001058 |
|  456 | head.layers.3.proj_drop                        | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8045335 |    0.6278993 |    12.2120943 |        12.2120943 |      -12.8853130 |        -1.5582557 |       12.0169115 |         1.5636065 |        -0.0059419 |         -0.0001058 |
|  457 | head.layers.3.residual_op                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7067685 |    0.3538418 |    12.2120943 |        12.2120943 |      -12.8853130 |        -6.8401561 |       12.0169115 |         6.0635576 |        -0.0021353 |          0.0007752 |
|  458 | head.layers.4.pre_norm                         | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.8145329 |    0.3420337 |     4.9615607 |         4.9615607 |       -9.0094814 |        -8.9336929 |        8.7609596 |         8.7618694 |         0.0036715 |          0.0043386 |
|  459 | head.layers.4.layers.0.0                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.0563407 |    2.8507552 |    12.6593199 |        12.6593199 |      -12.6593199 |         0.0000000 |        8.9196358 |        13.2594967 |        -2.5977914 |          0.1032498 |
|  460 | head.layers.4.layers.0.2                       | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.4858984 |    0.1360229 |    10.0456924 |        10.0456924 |        0.0000000 |         0.0000000 |        8.9196358 |        13.2594967 |         0.1169410 |          0.1032498 |
|  461 | head.layers.4.layers.1                         | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5786317 |    0.9827929 |    31.6561985 |        31.6561985 |      -26.5986443 |       -31.0260658 |       23.9734993 |        37.1871872 |        -0.0168957 |          0.0049495 |
|  462 | head.layers.4.layers.2                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5786317 |    0.9827929 |    31.6561985 |        31.6561985 |      -26.5986443 |       -31.0260658 |       23.9734993 |        37.1871872 |        -0.0168957 |          0.0049495 |
|  463 | head.layers.4.identity_fc                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  464 | head.layers.4.short_add                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.linear.LinearAdd                          | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6695786 |    1.6378667 |    34.8319359 |        34.8319359 |      -32.1355705 |       -36.3552246 |       27.8818092 |        42.1912842 |         0.0922882 |          0.1353871 |
|  465 | head.layers.5                                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8199940 |    0.3912810 |     3.3959465 |         3.3959465 |       -4.3179140 |        -4.0036111 |        4.0199656 |         3.5889032 |        -0.0022418 |         -0.0019619 |
|  466 | head.layers.6.add1                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8868071 |    0.3985295 |     6.9235735 |         6.9235735 |       -4.2747202 |        -4.0604792 |        7.6857972 |         8.0752201 |         0.0242241 |          0.0245074 |
|  467 | head.layers.6.layers.0                         | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4467622 |    1.2364522 |     8.9762716 |         8.9762716 |       -8.9762716 |         0.0000000 |        9.8269501 |        11.3216791 |        -0.6031027 |          0.4254977 |
|  468 | head.layers.6.layers.1                         | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8971077 |    0.2016577 |     6.1760902 |         6.1760902 |        0.0000000 |         0.0000000 |        9.8269501 |        11.3216791 |         0.4316919 |          0.4254977 |
|  469 | head.layers.6.layers.2                         | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.3832656 |    1.0964776 |    11.6461449 |        11.6461449 |      -11.6461449 |         0.0000000 |        9.1985331 |         9.2542639 |        -0.6216595 |          0.3355536 |
|  470 | head.layers.6.layers.3                         | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9079265 |    0.1405595 |     6.2273617 |         6.2273617 |        0.0000000 |         0.0000000 |        9.1985331 |         9.2542639 |         0.3342586 |          0.3355536 |
|  471 | head.layers.6.layers.4                         | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8787584 |    0.2359258 |     5.4224763 |         5.4224763 |       -0.8026833 |        -0.8149379 |        6.8791180 |         6.7964530 |         0.0422724 |          0.0426529 |
|  472 | head.layers.6.layers.5                         | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5182598 |    1.1588393 |     9.0074520 |         9.0074520 |       -9.0074520 |         0.0000000 |        8.8219967 |         8.8410015 |        -0.6207373 |          0.3995560 |
|  473 | head.layers.6.layers.6                         | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9313390 |    0.1486652 |     4.3952360 |         4.3952360 |        0.0000000 |         0.0000000 |        8.8219967 |         8.8410015 |         0.3894369 |          0.3995560 |
|  474 | head.layers.6.layers.7                         | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7063416 |    1.1662850 |    13.8720932 |        13.8720932 |      -10.1906776 |         0.0000000 |       19.4642372 |        20.4338226 |        -0.3674979 |          0.6746929 |
|  475 | head.layers.6.layers.8                         | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9750653 |    0.1441057 |    13.8720932 |        13.8720932 |        0.0000000 |         0.0000000 |       19.4642372 |        20.4338226 |         0.6546814 |          0.6746929 |
|  476 | head.layers.6.layers.9                         | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9619088 |    0.1275766 |     7.1653953 |         7.1653953 |       -0.8133879 |        -0.8235244 |       11.4733353 |        11.5317402 |         0.0225951 |          0.0222691 |
|  477 | head.layers.6.layers.10                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.9867991 |    0.3898105 |    38.2871094 |        38.2871094 |      -35.6905022 |       -35.6563454 |       12.2108717 |        10.9203081 |        -1.1116499 |         -1.0164976 |
|  478 | head.layers.6.layers.11.scale_quant_stub       | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([11])                  | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        0.1426757 |         0.1426757 |        1.2617201 |         1.2617201 |         0.6749083 |          0.6749083 |
|  479 | head.layers.6.layers.11.mul                    | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.9898757 |    0.3061559 |    48.3076172 |        48.3076172 |      -45.0314217 |       -44.9883270 |       15.4067020 |        13.7783718 |        -1.1465173 |         -1.0759916 |
|  480 | head.layers.6.add2                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.9425889 |    0.7740717 |   120.1791687 |       120.1791687 |      -60.2012711 |       -59.8793411 |       61.9898529 |        62.3292618 |         0.0237099 |          0.2557743 |
|  481 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.9425889 |    0.7740717 |   120.1791687 |                   |      -60.2012711 |       -59.8793411 |       61.9898529 |        62.3292618 |         0.0237099 |          0.2557743 |
|  482 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9460701 |    1.4478467 |   120.1791687 |       120.1791687 |      -60.2012711 |       -59.8793411 |       61.9898529 |        62.3292618 |         3.5636420 |          4.1674547 |
|  483 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6687818 |    5.3935633 |    81.7082443 |        81.7082443 |      -81.7082443 |         0.0000000 |       91.0530396 |        77.9847870 |        -0.0551873 |          4.8415089 |
|  484 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9565916 |    0.4858595 |    74.9974594 |        74.9974594 |        0.0000000 |         0.0000000 |       91.0530396 |        77.9847870 |         4.8525171 |          4.8415089 |
|  485 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9676985 |    0.0784026 |     3.9400728 |         3.9400728 |       -0.9529521 |        -0.9635340 |        3.7555425 |         3.5175545 |         0.0094533 |          0.0108318 |
|  486 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6025185 |    0.6170646 |     4.6609898 |         4.6609898 |       -4.6609898 |         0.0000000 |        5.5209126 |         5.6703563 |        -0.2194609 |          0.3528237 |
|  487 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9531882 |    0.0739127 |     3.6846211 |         3.6846211 |        0.0000000 |         0.0000000 |        5.5209126 |         5.6703563 |         0.3236910 |          0.3528237 |
|  488 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9762341 |    0.0805815 |     3.3143322 |         3.3143322 |       -0.9986548 |        -0.9973401 |        6.3530989 |         6.3522310 |         0.0797903 |          0.0799723 |
|  489 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6172203 |    0.7172099 |     6.1559653 |         6.1559653 |       -6.1559653 |         0.0000000 |        5.9463263 |         5.8820558 |        -0.0975678 |          0.5220918 |
|  490 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9683074 |    0.0935523 |     3.8008163 |         3.8008163 |        0.0000000 |         0.0000000 |        5.9463263 |         5.8820558 |         0.5260898 |          0.5220918 |
|  491 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9701153 |    0.1058662 |     4.1520443 |         4.1520443 |       -0.8666688 |        -0.8463172 |        5.6761312 |         5.7111058 |         0.0260872 |          0.0258068 |
|  492 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6053320 |    0.9282867 |     5.7916360 |         5.7916360 |       -5.7916360 |         0.0000000 |        8.3891630 |         8.3778944 |        -0.3530769 |          0.4824632 |
|  493 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9728584 |    0.0844792 |     2.4078877 |         2.4078877 |        0.0000000 |         0.0000000 |        8.3891630 |         8.3778944 |         0.4907306 |          0.4824632 |
|  494 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9647815 |    0.0929187 |     2.9072654 |         2.9072654 |       -0.8381624 |        -0.8404440 |        7.4429507 |         7.4205079 |         0.0259398 |          0.0276428 |
|  495 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9675612 |    0.1275643 |     1.8435464 |         1.8435464 |       -0.5099061 |        -0.1294766 |        2.5511291 |         2.6214242 |         0.7584461 |          0.8554866 |
|  496 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3592180 |    0.4709699 |     2.7405734 |         2.7405734 |       -2.7405734 |         0.0000000 |        1.4560659 |         1.5490676 |        -0.3177033 |          0.1321013 |
|  497 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9756768 |    0.0196795 |     1.4235895 |         1.4235895 |        0.0000000 |         0.0000000 |        1.4560659 |         1.5490676 |         0.1335870 |          0.1321013 |
|  498 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9758608 |    0.0805989 |     4.3484135 |         4.3484135 |       -0.7148008 |        -0.6180097 |        3.8868599 |         3.8740594 |         0.0213137 |          0.0214574 |
|  499 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.5870000 |    0.2731075 |     2.0988464 |         2.0988464 |       -1.8812773 |         0.0000000 |        1.7048047 |         1.3229266 |        -0.0077304 |          0.2237134 |
|  500 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9571294 |    0.0449324 |     1.4686910 |         1.4686910 |        0.0000000 |         0.0000000 |        1.7048047 |         1.3229266 |         0.2204447 |          0.2237134 |
|  501 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9531175 |    0.1514301 |     3.9145236 |         3.9145236 |       -0.9239969 |        -0.9535928 |        3.4350216 |         3.4117222 |         0.0078074 |          0.0081977 |
|  502 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.6597993 |    0.4309227 |     2.9342542 |         2.9342542 |       -2.0812595 |         0.0000000 |        2.3169436 |         1.9085125 |        -0.0322892 |          0.3102325 |
|  503 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9524519 |    0.0763257 |     2.1555324 |         2.1555324 |        0.0000000 |         0.0000000 |        2.3169436 |         1.9085125 |         0.3223077 |          0.3102325 |
|  504 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9384519 |    0.1620405 |     4.2369642 |         4.2369642 |       -0.8368621 |        -0.7748329 |        3.5812027 |         3.4511120 |         0.0123763 |          0.0130382 |
|  505 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7377976 |    0.3869334 |     3.3357501 |         3.3357501 |       -2.7004492 |         0.0000000 |        3.0260520 |         3.0085726 |         0.1491585 |          0.4408410 |
|  506 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9560063 |    0.1023284 |     2.8834965 |         2.8834965 |        0.0000000 |         0.0000000 |        3.0260520 |         3.0085726 |         0.4337634 |          0.4408410 |
|  507 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8946969 |    0.2122964 |     4.4853816 |         4.4853816 |       -1.1688405 |        -1.1301080 |        3.9186156 |         3.9352450 |         0.0303347 |          0.0369511 |
|  508 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 2])          | qint16        | 1.0000000 |  0.9923154 |    0.0591474 |     0.6000530 |         0.6000530 |       -1.6604499 |        -1.5106233 |        0.2374119 |         0.1762468 |        -0.5388125 |         -0.5544522 |
|  509 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7982717 |    0.1663424 |     1.4413574 |         1.4413574 |       -1.4413574 |         0.0000000 |        1.5450802 |         1.4554715 |         0.1383821 |          0.2756356 |
|  510 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9891455 |    0.0276265 |     0.4287889 |         0.4287889 |        0.0000000 |         0.0000000 |        1.5450802 |         1.4554715 |         0.2770980 |          0.2756356 |
|  511 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9844681 |    0.0787024 |     1.7309463 |         1.7309463 |       -1.2628902 |        -1.2586420 |        3.0882084 |         3.0160570 |        -0.0003910 |         -0.0003762 |
|  512 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.4871357 |    0.4670845 |     3.4389715 |         3.4389715 |       -3.4389715 |         0.0000000 |        1.9853449 |         1.8928635 |        -0.1658400 |          0.2301256 |
|  513 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9233390 |    0.0631658 |     1.3305652 |         1.3305652 |        0.0000000 |         0.0000000 |        1.9853449 |         1.8928635 |         0.2380787 |          0.2301256 |
|  514 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9306988 |    0.1828471 |     4.3569779 |         4.3569779 |       -0.9142864 |        -0.9146953 |        4.1331739 |         4.1272311 |         0.0106486 |          0.0096045 |
|  515 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.4446027 |    0.6324486 |     3.7635970 |         3.7635970 |       -3.7635970 |         0.0000000 |        2.5310853 |         2.4088964 |        -0.2264952 |          0.3132562 |
|  516 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9119045 |    0.0988735 |     2.1799779 |         2.1799779 |        0.0000000 |         0.0000000 |        2.5310853 |         2.4088964 |         0.3070799 |          0.3132562 |
|  517 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8950887 |    0.2318364 |     4.5955820 |         4.5955820 |       -0.9122784 |        -0.9105399 |        3.8240960 |         3.8244026 |         0.0143306 |          0.0152145 |
|  518 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.4028522 |    0.8011768 |     5.1871018 |         5.1871018 |       -5.1871018 |         0.0000000 |        2.7141762 |         2.7142148 |        -0.4262188 |          0.2851115 |
|  519 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9282050 |    0.0916307 |     2.6942010 |         2.6942010 |        0.0000000 |         0.0000000 |        2.7141762 |         2.7142148 |         0.2833274 |          0.2851115 |
|  520 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8860699 |    0.2056133 |     5.4635277 |         5.4635277 |       -1.0186919 |        -0.9560211 |        4.9357939 |         4.9248233 |         0.0932441 |          0.0945115 |
|  521 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9326332 |    1.2234204 |    55.8301582 |        55.8301582 |      -46.6687965 |       -45.5312538 |       15.3827620 |        13.8693819 |        -3.8759441 |         -3.7154675 |
|  522 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.6613048 |    3.0270598 |    30.3210964 |        30.3210964 |      -27.5495911 |         0.0000000 |       25.0330143 |        25.9292850 |        -0.0848193 |          2.5790296 |
|  523 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9490836 |    0.4160823 |    25.8239670 |        25.8239670 |        0.0000000 |         0.0000000 |       25.0330143 |        25.9292850 |         2.5261579 |          2.5790296 |
|  524 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9076468 |    0.1475030 |     4.3318715 |         4.3318715 |       -0.9912395 |        -0.9989652 |        3.5123115 |         3.4252367 |         0.0154046 |          0.0144219 |
|  525 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.5599109 |    0.4123211 |     4.0404220 |         4.0404220 |       -3.3068333 |         0.0000000 |        3.5574033 |         3.5163221 |        -0.0965956 |          0.2382592 |
|  526 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8984283 |    0.0779190 |     2.8177009 |         2.8177009 |        0.0000000 |         0.0000000 |        3.5574033 |         3.5163221 |         0.2378065 |          0.2382592 |
|  527 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9023335 |    0.1742384 |     4.7818570 |         4.7818570 |       -0.8998557 |        -0.9073886 |        4.3759332 |         4.1356034 |         0.0319674 |          0.0300645 |
|  528 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.6045172 |    0.6074607 |     5.0013347 |         5.0013347 |       -4.7643490 |         0.0000000 |        3.6646638 |         3.6138196 |        -0.0982479 |          0.3850853 |
|  529 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9114336 |    0.1147033 |     3.5434918 |         3.5434918 |        0.0000000 |         0.0000000 |        3.6646638 |         3.6138196 |         0.3945096 |          0.3850853 |
|  530 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8905487 |    0.1855620 |     5.5812383 |         5.5812383 |       -0.8632938 |        -0.8359565 |        5.4808736 |         4.9391265 |         0.0206255 |          0.0201123 |
|  531 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.6075380 |    0.6324660 |     5.9548202 |         5.9548202 |       -4.9982600 |         0.0000000 |        4.8160152 |         4.7645726 |        -0.2087371 |          0.3200701 |
|  532 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9185174 |    0.1031641 |     4.3283901 |         4.3283901 |        0.0000000 |         0.0000000 |        4.8160152 |         4.7645726 |         0.3205648 |          0.3200701 |
|  533 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8992679 |    0.1561580 |     5.8512607 |         5.8512607 |       -0.8030222 |        -0.7274202 |        5.6586938 |         5.6084604 |         0.0261348 |          0.0280713 |
|  534 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9288604 |    0.1377376 |     5.8512607 |         5.8512607 |       -1.1688405 |        -1.1301080 |        7.4429507 |         7.4205079 |         0.0349509 |          0.0372721 |
|  535 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 256, 512])        | qint16        | 0.0001526 |  0.4880881 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
|  536 | head.layers.7.query_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.8774846 |    0.2645093 |     5.8512607 |     12782.0789193 |       -4.3179140 |        -4.0036111 |        7.4429507 |         7.4205079 |         0.0163546 |          0.0176551 |
|  537 | head.layers.7.key_cat                          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 512])        | qint16        | 0.0001786 |  0.9560770 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
|  538 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8775001 |    0.2645093 |     5.8512607 |     12782.0789193 |       -4.3179140 |        -4.0036111 |        7.4429507 |         7.4205079 |         0.0163546 |          0.0176551 |
|  539 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.9560735 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
|  540 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.4880882 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
|  541 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8775001 |    0.2645093 |     5.8512607 |     12782.0789193 |       -4.3179140 |        -4.0036111 |        7.4429507 |         7.4205079 |         0.0163546 |          0.0176551 |
|  542 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.9560735 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
|  543 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.4880882 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
|  544 | head.layers.7.attn.q_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9214144 |    0.6228931 |     6.0161047 |         6.0161047 |      -10.7316799 |        -9.0360117 |        9.8683872 |         9.1133337 |        -0.0033241 |         -0.0137091 |
|  545 | head.layers.7.attn.k_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.9755846 |    0.1413739 |     9.8856306 |         9.8856306 |       -9.0129938 |        -8.4652958 |        8.6787100 |         8.6800261 |         0.0041813 |          0.0045817 |
|  546 | head.layers.7.attn.v_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.4570197 |    0.0162922 |     2.0041575 |         2.0041575 |       -1.2572778 |        -0.7027175 |        1.8435107 |         0.7677081 |        -0.0009063 |         -0.0001427 |
|  547 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.9214144 |    0.6228931 |     6.0161047 |         6.0161047 |      -10.7316799 |        -9.0360117 |        9.8683872 |         9.1133337 |        -0.0033241 |         -0.0137091 |
|  548 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.9214509 |    0.6228931 |     6.0161047 |         6.0161047 |      -10.7316799 |        -9.0360117 |        9.8683872 |         9.1133337 |        -0.0033241 |         -0.0137091 |
|  549 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  550 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  551 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 208, 64])        | qint8         | 1.0000000 |  0.0215776 |    1.7948388 |     8.8138027 |         8.8138027 |       -1.2572778 |        -8.4652958 |        1.8435107 |         8.6800261 |        -0.0009063 |          0.0045817 |
|  552 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 256, 64])        | qint8         | 1.0000000 |  0.0215791 |    1.7948388 |     8.8138027 |         8.8138027 |       -1.2572778 |        -8.4652958 |        1.8435107 |         8.6800261 |        -0.0009063 |          0.0045817 |
|  553 | head.layers.7.attn                             | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.9214509 |    0.0778616 |     0.7520131 |         6.0161047 |       -1.3414600 |        -1.1295015 |        1.2335484 |         1.1391667 |        -0.0004155 |         -0.0017136 |
|  554 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  555 | head.layers.7.attn.matmul                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.9484959 |    4.6410646 |    94.5825348 |        94.5825348 |      -71.4770584 |       -68.3372421 |      116.8532867 |        81.1547699 |         8.9380531 |          9.8135471 |
|  556 | head.layers.7.attn.softmax                     | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.2455102 |    0.0007121 |     0.9999318 |         0.9999318 |        0.0000000 |         0.0000000 |        0.9999318 |         0.0312500 |         0.0039062 |          0.0034966 |
|  557 | head.layers.7.attn.attention_drop              | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.2455102 |    0.0007121 |     0.9999318 |         0.9999318 |        0.0000000 |         0.0000000 |        0.9999318 |         0.0312500 |         0.0039062 |          0.0034966 |
|  558 | head.layers.7.attn.attn_matmul                 | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.3820741 |    0.0135582 |     1.5999056 |         1.5999056 |       -0.9219981 |        -0.3997336 |        1.6806213 |         0.4198489 |        -0.0000047 |         -0.0001069 |
|  559 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  560 | head.layers.7.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  561 | head.layers.7.attn.out_proj                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.8172125 |    0.0188777 |     0.9493764 |         0.9493764 |       -0.8156293 |        -0.3363274 |        1.0391617 |         0.4233098 |         0.0170811 |          0.0155008 |
|  562 | head.layers.7.attn                             | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 256])     | qint8         | 1.0000000 |  0.2455102 |    0.0007121 |     0.9999318 |         0.9999318 |        0.0000000 |         0.0000000 |        0.9999318 |         0.0312500 |         0.0039062 |          0.0034966 |
|  563 | head.layers.7.attn.attn_weights_mean           | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4117149 |    0.0006231 |     0.6018641 |         0.6018641 |        0.0000000 |         0.0000000 |        0.6150175 |         0.0168142 |         0.0039062 |          0.0034966 |
|  564 | head.layers.7.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  565 | head.layers.7.dropout                          | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.8172476 |    0.0188777 |     0.9493764 |         0.9493764 |       -0.8156293 |        -0.3363274 |        1.0391617 |         0.4233098 |         0.0170811 |          0.0155008 |
|  566 | head.layers.7.add                              | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.8697897 |    0.2730882 |     5.8107181 |         5.8107181 |       -4.1936011 |        -3.9428821 |        7.3567367 |         7.3609724 |         0.0334357 |          0.0331559 |
|  567 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.8112146 |    0.3468094 |     7.5781040 |      4966.3104701 |       -9.8939438 |        -8.3149939 |       10.9066572 |         8.8802614 |        -0.0014595 |          0.0035957 |
|  568 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 512])        | qint16        | 0.0001526 |  0.7521880 |    0.1091376 |     2.2986481 |     15064.1904512 |       -4.6113548 |        -3.5012121 |        3.3126488 |         3.2703350 |        -0.0007868 |         -0.0040117 |
|  569 | head.layers.8.query_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.8785297 |    0.2422735 |     7.5781040 |     16554.3682337 |       -9.8939438 |        -8.3149939 |       10.9066572 |         8.8802614 |         0.0167457 |          0.0204339 |
|  570 | head.layers.8.key_cat                          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0002277 |  0.8785297 |    0.2422735 |     7.5781040 |     33278.4410974 |       -9.8939438 |        -8.3149939 |       10.9066572 |         8.8802614 |         0.0167457 |          0.0204339 |
|  571 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8785267 |    0.2422735 |     7.5781040 |     16554.3682337 |       -9.8939438 |        -8.3149939 |       10.9066572 |         8.8802614 |         0.0167457 |          0.0204339 |
|  572 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0002277 |  0.8785267 |    0.2422735 |     7.5781040 |     33278.4410974 |       -9.8939438 |        -8.3149939 |       10.9066572 |         8.8802614 |         0.0167457 |          0.0204339 |
|  573 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.7522588 |    0.1091376 |     2.2986481 |     15064.1904512 |       -4.6113548 |        -3.5012121 |        3.3126488 |         3.2703350 |        -0.0007868 |         -0.0040117 |
|  574 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8785267 |    0.2422735 |     7.5781040 |     16554.3682337 |       -9.8939438 |        -8.3149939 |       10.9066572 |         8.8802614 |         0.0167457 |          0.0204339 |
|  575 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0002277 |  0.8785267 |    0.2422735 |     7.5781040 |     33278.4410974 |       -9.8939438 |        -8.3149939 |       10.9066572 |         8.8802614 |         0.0167457 |          0.0204339 |
|  576 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.7522588 |    0.1091376 |     2.2986481 |     15064.1904512 |       -4.6113548 |        -3.5012121 |        3.3126488 |         3.2703350 |        -0.0007868 |         -0.0040117 |
|  577 | head.layers.8.attn.q_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.8861620 |    0.4257702 |     9.8893299 |         9.8893299 |       -7.9566884 |        -8.1380062 |        7.0627093 |         6.7857327 |        -0.0000523 |          0.0061518 |
|  578 | head.layers.8.attn.k_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9165121 |    0.3811377 |     6.8654442 |         6.8654442 |       -8.0827169 |        -8.1008368 |        6.5631824 |         6.5268774 |        -0.0281485 |         -0.0340303 |
|  579 | head.layers.8.attn.v_proj                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.7252963 |    0.1093622 |     3.1979313 |         3.1979313 |       -2.3207362 |        -2.1736395 |        2.5576859 |         1.7158074 |        -0.0022641 |          0.0014306 |
|  580 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.8861620 |    0.4257702 |     9.8893299 |         9.8893299 |       -7.9566884 |        -8.1380062 |        7.0627093 |         6.7857327 |        -0.0000523 |          0.0061518 |
|  581 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.8861404 |    0.4257702 |     9.8893299 |         9.8893299 |       -7.9566884 |        -8.1380062 |        7.0627093 |         6.7857327 |        -0.0000523 |          0.0061518 |
|  582 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.1582372 |    1.3315473 |    11.3100986 |        11.3100986 |       -8.0827169 |        -8.1380062 |        6.5631824 |         6.7857327 |        -0.0281485 |          0.0061518 |
|  583 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.1582372 |    1.3315473 |    11.3100986 |        11.3100986 |       -8.0827169 |        -8.1380062 |        6.5631824 |         6.7857327 |        -0.0281485 |          0.0061518 |
|  584 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0172403 |    1.0704658 |     8.2518225 |         8.2518225 |       -2.3207362 |        -8.1008368 |        2.5576859 |         6.5268774 |        -0.0022641 |         -0.0340303 |
|  585 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0172396 |    1.0704658 |     8.2518225 |         8.2518225 |       -2.3207362 |        -8.1008368 |        2.5576859 |         6.5268774 |        -0.0022641 |         -0.0340303 |
|  586 | head.layers.8.attn                             | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.8861404 |    0.0532213 |     1.2361662 |         9.8893299 |       -0.9945861 |        -1.0172508 |        0.8828387 |         0.8482166 |        -0.0000065 |          0.0007690 |
|  587 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  588 | head.layers.8.attn.matmul                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.8962533 |    1.6350690 |    54.3311005 |        54.3311005 |      -37.0222511 |       -32.7715454 |       39.7528267 |        34.3293266 |        -0.1405244 |         -0.0856300 |
|  589 | head.layers.8.attn.softmax                     | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.6274948 |    0.0013655 |     0.9257808 |         0.9257808 |        0.0000000 |         0.0000000 |        0.9470170 |         0.0312500 |         0.0019531 |          0.0012027 |
|  590 | head.layers.8.attn.attention_drop              | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.6274948 |    0.0013655 |     0.9257808 |         0.9257808 |        0.0000000 |         0.0000000 |        0.9470170 |         0.0312500 |         0.0019531 |          0.0012027 |
|  591 | head.layers.8.attn.attn_matmul                 | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.5429764 |    0.0930799 |     1.5763983 |         1.5763983 |       -1.6161245 |        -1.3688954 |        1.4762830 |         1.0067104 |         0.0010339 |          0.0032144 |
|  592 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  593 | head.layers.8.attn                             | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  594 | head.layers.8.attn.out_proj                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.7082310 |    0.1323519 |     1.0312486 |         1.0312486 |       -1.1506648 |        -0.7674384 |        1.2472575 |         0.9126895 |         0.0024769 |         -0.0005373 |
|  595 | head.layers.8.attn                             | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 512])     | qint8         | 1.0000000 |  0.6274948 |    0.0013655 |     0.9257808 |         0.9257808 |        0.0000000 |         0.0000000 |        0.9470170 |         0.0312500 |         0.0019531 |          0.0012027 |
|  596 | head.layers.8.attn.attn_weights_mean           | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7523798 |    0.0010318 |     0.1423571 |         0.1423571 |        0.0000005 |         0.0000004 |        0.1490078 |         0.0156911 |         0.0019531 |          0.0012027 |
|  597 | head.layers.8.attn                             | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  598 | head.layers.8.dropout                          | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7082614 |    0.1323519 |     1.0312486 |         1.0312486 |       -1.1506648 |        -0.7674384 |        1.2472575 |         0.9126895 |         0.0024769 |         -0.0005373 |
|  599 | head.layers.8.add                              | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.8734762 |    0.3016709 |     7.2395649 |         7.2395649 |       -9.4361763 |        -8.1293354 |       10.8056126 |         8.8152819 |         0.0192227 |          0.0198966 |
|  600 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.9707167 |    0.4435801 |    10.4567852 |      6852.8541837 |      -33.1856079 |       -29.7496567 |       26.4051628 |        24.3639832 |        -0.0209108 |         -0.0191098 |
|  601 | head.layers.9                                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9664434 |    0.1545118 |     3.8107133 |         3.8107133 |       -6.9571629 |        -6.8754935 |        6.2081108 |         6.1977339 |         0.0010322 |          0.0011346 |
|  602 | head.layers.10.kps_generator.offset            | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 24])         | qint16        | 1.0000000 |  0.9599976 |    0.1967417 |     5.9037848 |         5.9037848 |       -7.5058212 |        -7.9785299 |        3.5014927 |         3.6445334 |        -0.4765211 |         -0.5169594 |
|  603 | head.layers.10.kps_generator                   | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.9599976 |    0.1967417 |     5.9037848 |         5.9037848 |       -7.5058212 |        -7.9785299 |        3.5014927 |         3.6445334 |        -0.4765211 |         -0.5169594 |
|  604 | head.layers.10.kps_generator                   | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 3])       | qint16        | 1.0000000 |  0.9460701 |    1.4478467 |   120.1791687 |       120.1791687 |      -60.2012711 |       -59.8793411 |       61.9898529 |        62.3292618 |         3.5636420 |          4.1674547 |
|  605 | head.layers.10.kps_generator.keypoints_add     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.9441921 |    1.4744868 |   124.5980453 |       124.5980453 |      -65.1219482 |       -65.6089172 |       64.2417145 |        64.4162521 |         3.0871210 |          3.6504948 |
|  606 | head.layers.10.weight_add                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9425431 |    0.2432194 |     7.0040874 |         7.0040874 |       -7.3956013 |        -7.3266501 |        7.8758183 |         7.7198687 |         0.0359832 |          0.0384067 |
|  607 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 3, 4])         | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3522723 |          0.3522723 |
|  608 | head.layers.10                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 6, 12])           | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3522723 |          0.3522723 |
|  609 | head.layers.10.camera_encoder.0                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.6490881 |    0.3997014 |     6.5923667 |         6.5923667 |       -6.5923667 |         0.0000000 |        5.3010874 |         5.3010874 |        -0.0614436 |          0.3382578 |
|  610 | head.layers.10.camera_encoder.1                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000008 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0000000 |         0.0000000 |        5.3010874 |         5.3010874 |         0.3382578 |          0.3382578 |
|  611 | head.layers.10.camera_encoder.2                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000005 |         0.0000005 |       -0.8844092 |        -0.8844092 |        5.0326171 |         5.0326171 |         0.0176572 |          0.0176572 |
|  612 | head.layers.10.camera_encoder.3                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.8652490 |    1.1241816 |    13.8601160 |        13.8601160 |      -13.8601160 |         0.0000000 |       24.4463539 |        24.4463539 |        -0.0114386 |          1.1127431 |
|  613 | head.layers.10.camera_encoder.4                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000011 |    0.0000001 |     0.0000029 |         0.0000029 |        0.0000000 |         0.0000000 |       24.4463539 |        24.4463539 |         1.1127431 |          1.1127431 |
|  614 | head.layers.10.camera_encoder.5                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.9999990 |    0.0000000 |     0.0000014 |         0.0000014 |       -1.0743425 |        -1.0743425 |        7.1053290 |         7.1053276 |         0.0232620 |          0.0232620 |
|  615 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 256])     | qint8         | 1.0000000 |  0.9425431 |    0.2432194 |     7.0040874 |         7.0040874 |       -7.3956013 |        -7.3266501 |        7.8758183 |         7.7198687 |         0.0359832 |          0.0384067 |
|  616 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 6, 256])       | qint8         | 1.0000000 |  0.9999990 |    0.0000000 |     0.0000014 |         0.0000014 |       -1.0743425 |        -1.0743425 |        7.1053290 |         7.1053276 |         0.0232620 |          0.0232620 |
|  617 | head.layers.10.cam_add                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 6, 256])     | qint8         | 1.0000000 |  0.9538236 |    0.2432193 |     7.0040874 |         7.0040874 |       -7.3040352 |        -7.2350841 |       10.5447502 |        10.2262230 |         0.0592451 |          0.0616686 |
|  618 | head.layers.10.weights_fc                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 6, 64])      | qint8         | 1.0000000 |  0.9853848 |    0.2554873 |     5.9332809 |         5.9332809 |       -9.9536963 |        -9.8963404 |        7.9539552 |         7.4060040 |        -0.3042734 |         -0.3423431 |
|  619 | head.layers.10                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.9853848 |    0.2554873 |     5.9332809 |         5.9332809 |       -9.9536963 |        -9.8963404 |        7.9539552 |         7.4060040 |        -0.3042734 |         -0.3423431 |
|  620 | head.layers.10.weight_softmax                  | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.8678191 |    0.0181528 |     0.7718112 |         0.7718112 |        0.0000002 |         0.0000000 |        0.8030612 |         0.0312500 |         0.0208333 |          0.0026860 |
|  621 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  0.9349905 |    2.8491516 |   124.5980453 |       124.5980453 |      -65.1219482 |       -65.6089172 |       64.2417145 |        64.4162521 |         5.2858710 |          6.8377376 |
|  622 | head.layers.10                                 | torch.ones_like                                                               | torch.ones_like                                                         | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
|  623 | head.layers.10.point_quant_stub                | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
|  624 | head.layers.10.point_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 8, 4])       | qint16        | 1.0000000 |  0.9441876 |    1.1058651 |   124.5980453 |       124.5980453 |      -65.1219482 |       -65.6089172 |       64.2417145 |        64.4162521 |         2.5653405 |          2.9878714 |
|  625 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 1, 1, 4, 4])   | qint16        | 1.0000000 |  0.9999996 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
|  626 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 512, 8, 1, 4]) | qint16        | 1.0000000 |  0.9441876 |    1.1058651 |   124.5980453 |       124.5980453 |      -65.1219482 |       -65.6089172 |       64.2417145 |        64.4162521 |         2.5653405 |          2.9878714 |
|  627 | head.layers.10.point_matmul                    | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 4, 4]) | qint16        | 1.0000000 |  0.9659038 |    0.4918157 |   232.7046967 |       232.7046967 |     -179.2633667 |      -178.9048462 |      587.5925903 |       568.5257568 |         1.1048276 |          1.1204882 |
|  628 | head.layers.10.point_sum                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 6, 512, 8, 4])    | qint16        | 0.0244144 |  0.9664087 |    1.6366658 |   252.8135376 |     10355.0844939 |     -181.4197083 |      -181.0998077 |      635.4523315 |       637.8849487 |         4.4193106 |          4.4819527 |
|  629 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.9373393 |    2.0166016 |   124.3936539 |      5095.0863177 |      -70.6170502 |       -70.9330521 |       79.5761414 |        76.3817749 |         1.2066154 |          1.2216843 |
|  630 | head.layers.10                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.9598373 |    1.0973952 |    76.8187103 |      3146.4463640 |        0.0100000 |         0.0100000 |       79.5761414 |        76.3817749 |        11.6032057 |         11.5431137 |
|  631 | head.layers.10.reciprocal_op                   | horizon_plugin_pytorch.nn.reciprocal.Reciprocal                               | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.9463385 |   41.1409988 |    99.9839554 |     32762.2425879 |        0.0125666 |         0.0130921 |      100.0000000 |         1.2799804 |        41.7714844 |          0.6551647 |
|  632 | head.layers.10                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.9463385 |   41.1409988 |    99.9839554 |     32762.2425879 |        0.0125666 |         0.0130921 |      100.0000000 |         1.2799804 |        41.7714844 |          0.6551647 |
|  633 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0244144 |  0.9716863 |    2.2650311 |   252.8135376 |     10355.0844939 |     -181.4197083 |      -181.0998077 |      635.4523315 |       637.8849487 |         7.7353139 |          7.8530636 |
|  634 | head.layers.10.point_mul                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.9013314 | 1492.9350586 | 62875.5742188 | 206027537.8692586 |   -18141.9707031 |      -231.8042145 |    63545.2343750 |       816.4802246 |      1068.9968262 |         13.2878265 |
|  635 | head.layers.10                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.5815741 |  173.5552826 |   602.7880859 |   1975185.8610556 |     -500.0000000 |      -231.8042145 |      500.0000000 |       500.0000000 |       102.9184799 |         12.8910007 |
|  636 | head.layers.10                                 | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([156, 512, 8, 2])      | qint16        | 0.0003052 |  0.5815741 |  173.5552826 |   602.7880859 |   1975185.8610556 |     -500.0000000 |      -231.8042145 |      500.0000000 |       500.0000000 |       102.9184799 |         12.8910007 |
|  637 | head.layers.10                                 | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer            | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer      | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.8246062 |    0.3689581 |    54.5476303 |        54.5476303 |      -59.3225403 |       -61.9247437 |       53.5170670 |        60.8896332 |         0.0248967 |          0.0259415 |
|  638 | head.layers.10.feat_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.8246062 |    0.3689581 |    54.5476303 |        54.5476303 |      -59.3225403 |       -61.9247437 |       53.5170670 |        60.8896332 |         0.0248967 |          0.0259415 |
|  639 | head.layers.10                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 256, 512, 8])  | qint8         | 1.0000000 |  0.8246062 |    0.3689581 |    54.5476303 |        54.5476303 |      -59.3225403 |       -61.9247437 |       53.5170670 |        60.8896332 |         0.0248967 |          0.0259415 |
|  640 | head.layers.10                                 | torch.Tensor.permute                                                          | torch.Tensor.permute                                                    | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.8245628 |    0.3689581 |    54.5476303 |        54.5476303 |      -59.3225403 |       -61.9247437 |       53.5170670 |        60.8896332 |         0.0248967 |          0.0259415 |
|  641 | head.layers.10                                 | torch.Tensor.contiguous                                                       | torch.Tensor.contiguous                                                 | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.8245628 |    0.3689581 |    54.5476303 |        54.5476303 |      -59.3225403 |       -61.9247437 |       53.5170670 |        60.8896332 |         0.0248967 |          0.0259415 |
|  642 | head.layers.10                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.8245628 |    0.3689581 |    54.5476303 |        54.5476303 |      -59.3225403 |       -61.9247437 |       53.5170670 |        60.8896332 |         0.0248967 |          0.0259415 |
|  643 | head.layers.10                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 48, 8, 1])   | qint8         | 1.0000000 |  0.8678191 |    0.0181528 |     0.7718112 |         0.7718112 |        0.0000002 |         0.0000000 |        0.8030612 |         0.0312500 |         0.0208333 |          0.0026860 |
|  644 | head.layers.10                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.8245628 |    0.3689581 |    54.5476303 |        54.5476303 |      -59.3225403 |       -61.9247437 |       53.5170670 |        60.8896332 |         0.0248967 |          0.0259415 |
|  645 | head.layers.10.feat_mul                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.6228739 |    0.0122016 |    10.8177338 |        10.8177338 |       -7.8028436 |        -0.5865604 |       10.9194040 |         0.6829248 |        -0.0000240 |          0.0000186 |
|  646 | head.layers.10                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.6228739 |    0.0122016 |    10.8177338 |        10.8177338 |       -7.8028436 |        -0.5865604 |       10.9194040 |         0.6829248 |        -0.0000240 |          0.0000186 |
|  647 | head.layers.10.feat_sum                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7399966 |    0.4588974 |    11.5994158 |        11.5994158 |      -10.5072956 |        -1.2340865 |       11.7250528 |         1.6652162 |        -0.0011513 |          0.0008921 |
|  648 | head.layers.10.output_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7761183 |    0.6584607 |    13.3343582 |        13.3343582 |      -14.2071524 |        -1.5940818 |       11.6652822 |         1.2792654 |        -0.0284591 |         -0.0047637 |
|  649 | head.layers.10.proj_drop                       | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7761183 |    0.6584607 |    13.3343582 |        13.3343582 |      -14.2071524 |        -1.5940818 |       11.6652822 |         1.2792654 |        -0.0284591 |         -0.0047637 |
|  650 | head.layers.10.residual_op                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.6605441 |    0.4064863 |    13.3343582 |        13.3343582 |      -14.2071524 |        -6.8754935 |       11.6652822 |         6.1977339 |        -0.0137134 |         -0.0018145 |
|  651 | head.layers.11.pre_norm                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7913144 |    0.3590399 |     4.8501329 |         4.8501329 |       -8.1613941 |        -8.0638924 |        7.7375455 |         8.0675459 |         0.0018073 |          0.0015279 |
|  652 | head.layers.11.layers.0.0                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.1714725 |    2.1114109 |    15.7945251 |        15.7945251 |      -15.7945251 |         0.0000000 |       10.6564369 |        12.5870972 |        -1.5357649 |          0.2073035 |
|  653 | head.layers.11.layers.0.2                      | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.5696912 |    0.2720455 |    12.3671293 |        12.3671293 |        0.0000000 |         0.0000000 |       10.6564369 |        12.5870972 |         0.3036004 |          0.2073035 |
|  654 | head.layers.11.layers.1                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6630629 |    1.9154540 |    36.1587448 |        36.1587448 |      -41.2506638 |       -48.6083984 |       45.0989685 |        50.9397278 |         0.0040454 |         -0.0149579 |
|  655 | head.layers.11.layers.2                        | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6630629 |    1.9154540 |    36.1587448 |        36.1587448 |      -41.2506638 |       -48.6083984 |       45.0989685 |        50.9397278 |         0.0040454 |         -0.0149579 |
|  656 | head.layers.11.identity_fc                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  657 | head.layers.11.short_add                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.linear.LinearAdd                          | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6901577 |    2.4176052 |    43.9299545 |        43.9299545 |      -42.9901657 |       -55.2390289 |       45.6833801 |        54.4379921 |         0.0351060 |          0.0157583 |
|  658 | head.layers.12                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8266587 |    0.3964716 |     3.2360337 |         3.2360337 |       -4.2579255 |        -4.1427016 |        3.7801514 |         3.4720094 |        -0.0019784 |         -0.0004309 |
|  659 | head.layers.13.add1                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8507078 |    0.4629726 |     6.0583677 |         6.0583677 |       -4.3276134 |        -4.2843990 |        8.6802197 |         7.5915990 |         0.0329726 |          0.0368412 |
|  660 | head.layers.13.layers.0                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4094916 |    1.2950554 |     9.5145416 |         9.5145416 |       -9.2014322 |         0.0000000 |        7.3352132 |         8.1798534 |        -0.5504948 |          0.4855922 |
|  661 | head.layers.13.layers.1                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8230097 |    0.2885431 |     6.2550435 |         6.2550435 |        0.0000000 |         0.0000000 |        7.3352132 |         8.1798534 |         0.4560175 |          0.4855922 |
|  662 | head.layers.13.layers.2                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4284372 |    1.0185007 |     8.9602518 |         8.9602518 |       -8.9602518 |         0.0000000 |        6.2453818 |         7.8412991 |        -0.3298496 |          0.4493437 |
|  663 | head.layers.13.layers.3                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8107232 |    0.2598087 |     6.9005384 |         6.9005384 |        0.0000000 |         0.0000000 |        6.2453818 |         7.8412991 |         0.4288423 |          0.4493437 |
|  664 | head.layers.13.layers.4                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7807872 |    0.3514844 |     5.5285506 |         5.5285506 |       -0.8241576 |        -0.8079963 |        7.3116388 |         6.9406590 |         0.0353564 |          0.0373288 |
|  665 | head.layers.13.layers.5                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.3888572 |    1.2116154 |     7.8285022 |         7.8285022 |       -7.8285022 |         0.0000000 |        7.9768009 |         7.4653091 |        -0.6059223 |          0.3615786 |
|  666 | head.layers.13.layers.6                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7988403 |    0.2406383 |     5.8105278 |         5.8105278 |        0.0000000 |         0.0000000 |        7.9768009 |         7.4653091 |         0.3650548 |          0.3615786 |
|  667 | head.layers.13.layers.7                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5042981 |    1.0121477 |     8.5419750 |         8.5419750 |       -7.2733197 |         0.0000000 |       15.0167561 |        13.9757986 |        -0.4171226 |          0.3693733 |
|  668 | head.layers.13.layers.8                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8596807 |    0.2268837 |     8.5419750 |         8.5419750 |        0.0000000 |         0.0000000 |       15.0167561 |        13.9757986 |         0.3681414 |          0.3693733 |
|  669 | head.layers.13.layers.9                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7297463 |    0.3401862 |     7.6818371 |         7.6818371 |       -0.8859449 |        -0.7736182 |       10.8888092 |        10.2766991 |         0.0282080 |          0.0273939 |
|  670 | head.layers.13.layers.10                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.6012841 |    0.8197422 |     9.0646534 |         9.0646534 |       -6.1032748 |        -5.6867905 |        7.6909628 |         7.1137953 |        -0.1843948 |          0.0936221 |
|  671 | head.layers.13.layers.11.scale_quant_stub      | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([11])                  | qint16        | 1.0000000 |  1.0000000 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0353014 |         0.0353014 |        0.9935754 |         0.9935754 |         0.2909352 |          0.2909352 |
|  672 | head.layers.13.layers.11.mul                   | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.8954685 |    0.2627181 |     5.9053993 |         5.9053993 |       -5.9016981 |        -5.3545003 |        5.0104737 |         4.6344633 |        -0.2612101 |         -0.1945487 |
|  673 | head.layers.13.add2                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.9413078 |    0.9122968 |   119.9288483 |       119.9288483 |      -61.2288322 |       -60.4901466 |       63.4121399 |        62.5821571 |        -0.2375002 |          0.0612256 |
|  674 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.9413078 |    0.9122968 |   119.9288483 |                   |      -61.2288322 |       -60.4901466 |       63.4121399 |        62.5821571 |        -0.2375002 |          0.0612256 |
|  675 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9461623 |    1.5340674 |   119.9288483 |       119.9288483 |      -61.2288322 |       -60.4901466 |       63.4121399 |        62.5821571 |         3.5132000 |          4.0605092 |
|  676 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6671675 |    5.4293270 |    81.2277756 |        81.2277756 |      -81.2277756 |         0.0000000 |       90.4889832 |        77.6622086 |        -0.0709056 |          4.8222046 |
|  677 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9563415 |    0.5087884 |    74.6361465 |        74.6361465 |        0.0000000 |         0.0000000 |       90.4889832 |        77.6622086 |         4.8496327 |          4.8222046 |
|  678 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9473625 |    0.1042933 |     3.9744313 |         3.9744313 |       -0.9522926 |        -0.9632884 |        3.9762886 |         3.5001223 |         0.0107542 |          0.0108715 |
|  679 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6278782 |    0.6087430 |     4.6625757 |         4.6625757 |       -4.6625757 |         0.0000000 |        5.5066195 |         5.6562891 |        -0.1940463 |          0.3530036 |
|  680 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9602932 |    0.0692113 |     3.7402427 |         3.7402427 |        0.0000000 |         0.0000000 |        5.5066195 |         5.6562891 |         0.3454855 |          0.3530036 |
|  681 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9799864 |    0.0735115 |     3.3604505 |         3.3604505 |       -0.9991107 |        -0.9980868 |        6.3517904 |         6.3540111 |         0.0790222 |          0.0804197 |
|  682 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6304200 |    0.6695691 |     6.1258192 |         6.1258192 |       -6.1258192 |         0.0000000 |        5.9462080 |         5.8741956 |        -0.0835280 |          0.5224553 |
|  683 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9836190 |    0.0675904 |     3.8306897 |         3.8306897 |        0.0000000 |         0.0000000 |        5.9462080 |         5.8741956 |         0.5184509 |          0.5224553 |
|  684 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9831796 |    0.0807369 |     4.2020764 |         4.2020764 |       -0.8707660 |        -0.8418366 |        5.6899581 |         5.7126255 |         0.0259139 |          0.0259324 |
|  685 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6268103 |    0.8747202 |     5.6220007 |         5.6220007 |       -5.6220007 |         0.0000000 |        8.4006329 |         8.3814392 |        -0.3284784 |          0.4831057 |
|  686 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9885095 |    0.0623918 |     2.7908075 |         2.7908075 |        0.0000000 |         0.0000000 |        8.4006329 |         8.3814392 |         0.4838501 |          0.4831057 |
|  687 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9839010 |    0.0706804 |     2.9900944 |         2.9900944 |       -0.8277794 |        -0.8347309 |        7.4495039 |         7.4234867 |         0.0272943 |          0.0270297 |
|  688 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9388550 |    0.1868148 |     1.8912069 |         1.8912069 |       -0.5464319 |        -0.0141040 |        2.6162450 |         2.6324100 |         0.7646925 |          0.9147825 |
|  689 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3474701 |    0.4801856 |     2.7581103 |         2.7581103 |       -2.7581103 |         0.0000000 |        1.4622414 |         1.4518390 |        -0.3174664 |          0.1290731 |
|  690 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9521167 |    0.0276754 |     1.4333545 |         1.4333545 |        0.0000000 |         0.0000000 |        1.4622414 |         1.4518390 |         0.1350438 |          0.1290731 |
|  691 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9513769 |    0.1100637 |     4.2827458 |         4.2827458 |       -0.7228440 |        -0.6342425 |        3.8913584 |         3.9368865 |         0.0216696 |          0.0220584 |
|  692 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.5528548 |    0.2919432 |     2.0663271 |         2.0663271 |       -1.8059421 |         0.0000000 |        1.6251451 |         1.0799669 |        -0.0189920 |          0.2098786 |
|  693 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9124849 |    0.0615676 |     1.4454505 |         1.4454505 |        0.0000000 |         0.0000000 |        1.6251451 |         1.0799669 |         0.2113836 |          0.2098786 |
|  694 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9124060 |    0.2098577 |     3.6547203 |         3.6547203 |       -0.9186025 |        -0.9364325 |        3.3718150 |         3.3350952 |         0.0079017 |          0.0089353 |
|  695 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.6231980 |    0.4708045 |     2.8602338 |         2.8602338 |       -2.0915556 |         0.0000000 |        2.2784817 |         1.7804329 |        -0.0433837 |          0.3051595 |
|  696 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9209056 |    0.1047874 |     2.1439850 |         2.1439850 |        0.0000000 |         0.0000000 |        2.2784817 |         1.7804329 |         0.3226334 |          0.3051595 |
|  697 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8989801 |    0.2208747 |     4.2476926 |         4.2476926 |       -0.7985746 |        -0.7851589 |        3.5890164 |         3.4368558 |         0.0114186 |          0.0127270 |
|  698 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7076653 |    0.4251493 |     3.6162395 |         3.6162395 |       -2.5844393 |         0.0000000 |        2.9754043 |         2.9894173 |         0.1329567 |          0.4341532 |
|  699 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9267235 |    0.1357739 |     2.7599297 |         2.7599297 |        0.0000000 |         0.0000000 |        2.9754043 |         2.9894173 |         0.4223321 |          0.4341532 |
|  700 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8159190 |    0.2853814 |     4.3487692 |         4.3487692 |       -1.0876920 |        -1.0979377 |        3.9652030 |         3.9328833 |         0.0273770 |          0.0361786 |
|  701 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 2])          | qint16        | 1.0000000 |  0.9771133 |    0.0886255 |     0.5709612 |         0.5709612 |       -1.6839497 |        -1.2335992 |        0.2178700 |         0.1352192 |        -0.5333836 |         -0.5105293 |
|  702 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7870626 |    0.1732617 |     1.4430871 |         1.4430871 |       -1.4430871 |         0.0000000 |        1.5734828 |         1.3364344 |         0.1374090 |          0.2584700 |
|  703 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9762573 |    0.0360801 |     0.5527335 |         0.5527335 |        0.0000000 |         0.0000000 |        1.5734828 |         1.3364344 |         0.2745906 |          0.2584700 |
|  704 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9768466 |    0.0870173 |     1.1895604 |         1.1895604 |       -1.2617612 |        -1.2145599 |        3.0454416 |         3.0171630 |        -0.0004305 |         -0.0006270 |
|  705 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.4489885 |    0.4817568 |     3.2784445 |         3.2784445 |       -3.2784445 |         0.0000000 |        1.8475673 |         1.9136672 |        -0.1674129 |          0.2148397 |
|  706 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8678802 |    0.0823300 |     1.1747248 |         1.1747248 |        0.0000000 |         0.0000000 |        1.8475673 |         1.9136672 |         0.2320139 |          0.2148397 |
|  707 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8827002 |    0.2281844 |     3.4903438 |         3.4903438 |       -0.9146271 |        -0.9138878 |        4.0644321 |         4.1201916 |         0.0087560 |          0.0056796 |
|  708 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3689743 |    0.6626659 |     3.7709863 |         3.7709863 |       -3.7709863 |         0.0000000 |        2.5383365 |         2.1152964 |        -0.2358205 |          0.2786312 |
|  709 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8051730 |    0.1279934 |     2.0380871 |         2.0380871 |        0.0000000 |         0.0000000 |        2.5383365 |         2.1152964 |         0.2988519 |          0.2786312 |
|  710 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8112699 |    0.2996655 |     4.4585404 |         4.4585404 |       -0.9109438 |        -0.9107445 |        3.7920723 |         3.7311146 |         0.0124275 |          0.0103905 |
|  711 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3467321 |    0.8665622 |     5.1873498 |         5.1873498 |       -5.1873498 |         0.0000000 |        2.6718163 |         2.6573081 |        -0.4685702 |          0.2511249 |
|  712 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8674985 |    0.1297525 |     2.6572256 |         2.6572256 |        0.0000000 |         0.0000000 |        2.6718163 |         2.6573081 |         0.2682395 |          0.2511249 |
|  713 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7825811 |    0.2898282 |     4.7069674 |         4.7069674 |       -0.9925305 |        -0.8749314 |        4.9367743 |         4.9025941 |         0.0864746 |          0.0802271 |
|  714 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9272401 |    1.5651225 |    56.1994209 |        56.1994209 |      -47.4792290 |       -44.8239098 |       14.7914333 |        12.0109043 |        -4.7931376 |         -4.4104447 |
|  715 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.6488178 |    3.2480695 |    31.3728447 |        31.3728447 |      -28.3697357 |         0.0000000 |       25.7480602 |        26.6680717 |        -0.2468670 |          2.5187070 |
|  716 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9437660 |    0.4939511 |    26.1964684 |        26.1964684 |        0.0000000 |         0.0000000 |       25.7480602 |        26.6680717 |         2.5072515 |          2.5187070 |
|  717 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8901136 |    0.1798230 |     4.2682166 |         4.2682166 |       -0.8931753 |        -0.8927098 |        3.5567503 |         3.5660436 |         0.0190192 |          0.0173419 |
|  718 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.4712663 |    0.4419120 |     3.5126357 |         3.5126357 |       -2.9767330 |         0.0000000 |        3.0539699 |         3.2575307 |        -0.1385484 |          0.2207356 |
|  719 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8609933 |    0.0891549 |     2.8849823 |         2.8849823 |        0.0000000 |         0.0000000 |        3.0539699 |         3.2575307 |         0.2142087 |          0.2207356 |
|  720 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8527972 |    0.2296948 |     4.8520255 |         4.8520255 |       -0.9140434 |        -0.8969926 |        4.7490230 |         4.2121830 |         0.0361232 |          0.0330460 |
|  721 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.5461804 |    0.7046979 |     5.1627192 |         5.1627192 |       -4.8113146 |         0.0000000 |        3.5884485 |         3.6267841 |        -0.1636669 |          0.3791220 |
|  722 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8721459 |    0.1509086 |     3.6267841 |         3.6267841 |        0.0000000 |         0.0000000 |        3.5884485 |         3.6267841 |         0.3901224 |          0.3791220 |
|  723 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8439759 |    0.2344827 |     5.7563848 |         5.7563848 |       -0.8648613 |        -0.8640830 |        5.3526421 |         5.0154991 |         0.0206223 |          0.0202108 |
|  724 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.5813733 |    0.6940547 |     5.8178005 |         5.8178005 |       -4.8300200 |         0.0000000 |        4.5202107 |         4.7200894 |        -0.2062706 |          0.3368580 |
|  725 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8813016 |    0.1399565 |     4.3815265 |         4.3815265 |        0.0000000 |         0.0000000 |        4.5202107 |         4.7200894 |         0.3478276 |          0.3368580 |
|  726 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8441011 |    0.2088944 |     5.8336248 |         5.8336248 |       -0.8044084 |        -0.7224922 |        5.4330044 |         5.5621414 |         0.0246362 |          0.0271589 |
|  727 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9013314 |    0.1594650 |     5.8336248 |         5.8336248 |       -1.0876920 |        -1.0979377 |        7.4495039 |         7.4234867 |         0.0340376 |          0.0348553 |
|  728 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 256, 512])        | qint16        | 0.0001526 |  0.4880881 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
|  729 | head.layers.14.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.8656500 |    0.2779683 |     5.8336248 |     12743.5534655 |       -4.2579255 |        -4.1427016 |        7.4495039 |         7.4234867 |         0.0160296 |          0.0172122 |
|  730 | head.layers.14.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 512])        | qint16        | 0.0001786 |  0.9560770 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
|  731 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8657053 |    0.2779683 |     5.8336248 |     12743.5534655 |       -4.2579255 |        -4.1427016 |        7.4495039 |         7.4234867 |         0.0160296 |          0.0172122 |
|  732 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.9560735 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
|  733 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.4880882 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
|  734 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8657053 |    0.2779683 |     5.8336248 |     12743.5534655 |       -4.2579255 |        -4.1427016 |        7.4495039 |         7.4234867 |         0.0160296 |          0.0172122 |
|  735 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.9560735 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
|  736 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.4880882 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
|  737 | head.layers.14.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9186697 |    0.6260327 |     6.7954264 |         6.7954264 |       -9.6720219 |        -8.5568504 |        9.9622250 |         9.0663939 |        -0.0369027 |         -0.0662642 |
|  738 | head.layers.14.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.9612759 |    0.1577078 |     9.3025351 |         9.3025351 |       -8.5375614 |        -7.8128047 |        8.0426655 |         7.7073140 |        -0.0193341 |         -0.0114971 |
|  739 | head.layers.14.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.5411231 |    0.0167763 |     1.3766992 |         1.3766992 |       -1.3209248 |        -0.8245173 |        1.1304812 |         0.7680836 |        -0.0014911 |         -0.0010245 |
|  740 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.9186697 |    0.6260327 |     6.7954264 |         6.7954264 |       -9.6720219 |        -8.5568504 |        9.9622250 |         9.0663939 |        -0.0369027 |         -0.0662642 |
|  741 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.9186657 |    0.6260327 |     6.7954264 |         6.7954264 |       -9.6720219 |        -8.5568504 |        9.9622250 |         9.0663939 |        -0.0369027 |         -0.0662642 |
|  742 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  743 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  744 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 208, 64])        | qint8         | 1.0000000 |  0.0329336 |    1.5158150 |     7.7487197 |         7.7487197 |       -1.3209248 |        -7.8128047 |        1.1304812 |         7.7073140 |        -0.0014911 |         -0.0114971 |
|  745 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 256, 64])        | qint8         | 1.0000000 |  0.0329331 |    1.5158150 |     7.7487197 |         7.7487197 |       -1.3209248 |        -7.8128047 |        1.1304812 |         7.7073140 |        -0.0014911 |         -0.0114971 |
|  746 | head.layers.14.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.9186657 |    0.0782541 |     0.8494283 |         6.7954264 |       -1.2090027 |        -1.0696063 |        1.2452781 |         1.1332992 |        -0.0046128 |         -0.0082830 |
|  747 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  748 | head.layers.14.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.9176192 |    3.6920114 |    85.7539673 |        85.7539673 |      -97.6130371 |       -75.6735306 |       93.3405838 |        61.4923820 |         1.6228878 |          1.1700131 |
|  749 | head.layers.14.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.2823446 |    0.0007014 |     0.9999987 |         0.9999987 |        0.0000000 |         0.0000000 |        1.0000000 |         0.0312500 |         0.0039062 |          0.0034968 |
|  750 | head.layers.14.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.2823446 |    0.0007014 |     0.9999987 |         0.9999987 |        0.0000000 |         0.0000000 |        1.0000000 |         0.0312500 |         0.0039062 |          0.0034968 |
|  751 | head.layers.14.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.4617642 |    0.0148506 |     1.0182521 |         1.0182521 |       -1.0506469 |        -0.4812885 |        0.8939777 |         0.5153387 |        -0.0005594 |         -0.0004960 |
|  752 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  753 | head.layers.14.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  754 | head.layers.14.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.7649240 |    0.0214377 |     1.2521191 |         1.2521191 |       -1.0594592 |        -0.4814988 |        1.2084419 |         0.4444591 |         0.0114629 |          0.0110400 |
|  755 | head.layers.14.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 256])     | qint8         | 1.0000000 |  0.2823446 |    0.0007014 |     0.9999987 |         0.9999987 |        0.0000000 |         0.0000000 |        1.0000000 |         0.0312500 |         0.0039062 |          0.0034968 |
|  756 | head.layers.14.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4548263 |    0.0005851 |     0.6175427 |         0.6175427 |        0.0000000 |         0.0000000 |        0.6315945 |         0.0265775 |         0.0039062 |          0.0034968 |
|  757 | head.layers.14.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  758 | head.layers.14.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7649294 |    0.0214377 |     1.2521191 |         1.2521191 |       -1.0594592 |        -0.4814988 |        1.2084419 |         0.4444591 |         0.0114629 |          0.0110400 |
|  759 | head.layers.14.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.8598316 |    0.2869329 |     5.9678125 |         5.9678125 |       -4.0995378 |        -3.9945893 |        7.3682256 |         7.3515611 |         0.0274925 |          0.0282522 |
|  760 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.8350473 |    0.3510733 |     6.6387796 |      4350.7242382 |      -12.5623875 |       -10.4409933 |       11.2689934 |         8.1019659 |         0.0336760 |          0.0462299 |
|  761 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 512])        | qint16        | 0.0001526 |  0.7625700 |    0.1095507 |     2.2225032 |     14565.1746281 |       -4.5123205 |        -4.1548786 |        3.9001446 |         3.2075975 |         0.0061143 |          0.0047533 |
|  762 | head.layers.15.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.8697361 |    0.2552692 |     6.6387796 |     14502.4141274 |      -12.5623875 |       -10.4409933 |       11.2689934 |         8.1019659 |         0.0338568 |          0.0405426 |
|  763 | head.layers.15.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0002464 |  0.8697361 |    0.2552692 |     6.6387796 |     26942.0952820 |      -12.5623875 |       -10.4409933 |       11.2689934 |         8.1019659 |         0.0338568 |          0.0405426 |
|  764 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8698345 |    0.2552692 |     6.6387796 |     14502.4141274 |      -12.5623875 |       -10.4409933 |       11.2689934 |         8.1019659 |         0.0338568 |          0.0405426 |
|  765 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0002464 |  0.8698345 |    0.2552692 |     6.6387796 |     26942.0952820 |      -12.5623875 |       -10.4409933 |       11.2689934 |         8.1019659 |         0.0338568 |          0.0405426 |
|  766 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.7626358 |    0.1095507 |     2.2225032 |     14565.1746281 |       -4.5123205 |        -4.1548786 |        3.9001446 |         3.2075975 |         0.0061143 |          0.0047533 |
|  767 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8698345 |    0.2552692 |     6.6387796 |     14502.4141274 |      -12.5623875 |       -10.4409933 |       11.2689934 |         8.1019659 |         0.0338568 |          0.0405426 |
|  768 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0002464 |  0.8698345 |    0.2552692 |     6.6387796 |     26942.0952820 |      -12.5623875 |       -10.4409933 |       11.2689934 |         8.1019659 |         0.0338568 |          0.0405426 |
|  769 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.7626358 |    0.1095507 |     2.2225032 |     14565.1746281 |       -4.5123205 |        -4.1548786 |        3.9001446 |         3.2075975 |         0.0061143 |          0.0047533 |
|  770 | head.layers.15.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9029770 |    0.4515842 |     7.7827320 |         7.7827320 |      -10.1272621 |        -9.5493288 |        7.5038614 |         7.6999025 |        -0.0336891 |         -0.0443064 |
|  771 | head.layers.15.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9445459 |    0.4091109 |     8.2986975 |         8.2986975 |      -11.5882988 |       -11.7330418 |       10.2617607 |         9.8789310 |         0.0310188 |          0.0245303 |
|  772 | head.layers.15.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.7314115 |    0.1325273 |     2.6105626 |         2.6105626 |       -2.3040166 |        -2.2447858 |        2.6856863 |         2.9505084 |         0.0006179 |          0.0004390 |
|  773 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.9029770 |    0.4515842 |     7.7827320 |         7.7827320 |      -10.1272621 |        -9.5493288 |        7.5038614 |         7.6999025 |        -0.0336891 |         -0.0443064 |
|  774 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.9029801 |    0.4515842 |     7.7827320 |         7.7827320 |      -10.1272621 |        -9.5493288 |        7.5038614 |         7.6999025 |        -0.0336891 |         -0.0443064 |
|  775 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.2779858 |    1.5602758 |    12.9059792 |        12.9059792 |      -11.5882988 |        -9.5493288 |       10.2617607 |         7.6999025 |         0.0310188 |         -0.0443064 |
|  776 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.2779896 |    1.5602758 |    12.9059792 |        12.9059792 |      -11.5882988 |        -9.5493288 |       10.2617607 |         7.6999025 |         0.0310188 |         -0.0443064 |
|  777 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 | -0.0180311 |    1.3816082 |    12.0046329 |        12.0046329 |       -2.3040166 |       -11.7330418 |        2.6856863 |         9.8789310 |         0.0006179 |          0.0245303 |
|  778 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 | -0.0180313 |    1.3816082 |    12.0046329 |        12.0046329 |       -2.3040166 |       -11.7330418 |        2.6856863 |         9.8789310 |         0.0006179 |          0.0245303 |
|  779 | head.layers.15.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.9029801 |    0.0564480 |     0.9728415 |         7.7827320 |       -1.2659078 |        -1.1936661 |        0.9379827 |         0.9624878 |        -0.0042111 |         -0.0055383 |
|  780 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  781 | head.layers.15.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.9337420 |    1.9612998 |    32.9754562 |        32.9754562 |      -47.4592781 |       -48.3080940 |       56.8442841 |        53.9152222 |        -0.3387333 |         -0.4633927 |
|  782 | head.layers.15.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.5115929 |    0.0014994 |     0.9837905 |         0.9837905 |        0.0000000 |         0.0000000 |        0.9994463 |         0.0312500 |         0.0019531 |          0.0012474 |
|  783 | head.layers.15.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.5115929 |    0.0014994 |     0.9837905 |         0.9837905 |        0.0000000 |         0.0000000 |        0.9994463 |         0.0312500 |         0.0019531 |          0.0012474 |
|  784 | head.layers.15.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.5226707 |    0.1158266 |     1.8423491 |         1.8423491 |       -1.9106425 |        -2.0219560 |        1.8993245 |         1.1613604 |        -0.0008279 |          0.0022558 |
|  785 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  786 | head.layers.15.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  787 | head.layers.15.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.5910118 |    0.1847938 |     1.8044307 |         1.8044307 |       -2.1017518 |        -0.8618459 |        1.6588008 |         0.9384975 |         0.0069889 |         -0.0014031 |
|  788 | head.layers.15.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 512])     | qint8         | 1.0000000 |  0.5115929 |    0.0014994 |     0.9837905 |         0.9837905 |        0.0000000 |         0.0000000 |        0.9994463 |         0.0312500 |         0.0019531 |          0.0012474 |
|  789 | head.layers.15.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.5726824 |    0.0012286 |     0.4948308 |         0.4948308 |        0.0000004 |         0.0000007 |        0.5123326 |         0.0197913 |         0.0019531 |          0.0012474 |
|  790 | head.layers.15.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  791 | head.layers.15.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.5910023 |    0.1847938 |     1.8044307 |         1.8044307 |       -2.1017518 |        -0.8618459 |        1.6588008 |         0.9384975 |         0.0069889 |         -0.0014031 |
|  792 | head.layers.15.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.8580623 |    0.3380980 |     6.6454453 |         6.6454453 |      -11.4353132 |       -10.1956377 |       11.2056570 |         8.0947256 |         0.0408457 |          0.0391395 |
|  793 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.9463784 |    0.5183176 |    11.2045908 |      7342.9285808 |      -30.0701771 |       -27.8019238 |       26.0134602 |        25.0212307 |        -0.0028451 |         -0.0070810 |
|  794 | head.layers.16                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9456731 |    0.1997769 |     3.6647043 |         3.6647043 |       -6.8994412 |        -6.5884376 |        6.3058200 |         6.3722897 |         0.0017012 |          0.0020097 |
|  795 | head.layers.17.kps_generator.offset            | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 24])         | qint16        | 1.0000000 |  0.9650872 |    0.2189352 |     3.9222484 |         3.9222484 |       -5.6545467 |        -5.1992030 |        4.3219333 |         5.7352042 |        -0.0785837 |         -0.0654964 |
|  796 | head.layers.17.kps_generator                   | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.9650872 |    0.2189352 |     3.9222484 |         3.9222484 |       -5.6545467 |        -5.1992030 |        4.3219333 |         5.7352042 |        -0.0785837 |         -0.0654964 |
|  797 | head.layers.17.kps_generator                   | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 3])       | qint16        | 1.0000000 |  0.9461623 |    1.5340674 |   119.9288483 |       119.9288483 |      -61.2288322 |       -60.4901466 |       63.4121399 |        62.5821571 |         3.5132000 |          4.0605092 |
|  798 | head.layers.17.kps_generator.keypoints_add     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.9449272 |    1.5901867 |   123.4970398 |       123.4970398 |      -65.2660751 |       -64.1395035 |       65.8225250 |        67.1462021 |         3.4346161 |          3.9950123 |
|  799 | head.layers.17.weight_add                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9166571 |    0.2947314 |     6.5795946 |         6.5795946 |       -7.3033996 |        -7.1121898 |        7.7890606 |         7.5996494 |         0.0357389 |          0.0368650 |
|  800 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 3, 4])         | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3522723 |          0.3522723 |
|  801 | head.layers.17                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 6, 12])           | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3522723 |          0.3522723 |
|  802 | head.layers.17.camera_encoder.0                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.6280841 |    0.4363261 |     6.1943283 |         6.1943283 |       -6.1943283 |         0.0000000 |        5.5422120 |         5.5422120 |        -0.1244024 |          0.3119237 |
|  803 | head.layers.17.camera_encoder.1                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000002 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0000000 |         0.0000000 |        5.5422120 |         5.5422120 |         0.3119237 |          0.3119237 |
|  804 | head.layers.17.camera_encoder.2                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000010 |    0.0000001 |     0.0000007 |         0.0000007 |       -0.8844011 |        -0.8844013 |        4.6364288 |         4.6364288 |         0.0187403 |          0.0187403 |
|  805 | head.layers.17.camera_encoder.3                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.8456172 |    1.2005212 |    15.0362740 |        15.0362740 |      -15.0362740 |         0.0000000 |       33.1644135 |        33.1644096 |        -0.2752628 |          0.9252583 |
|  806 | head.layers.17.camera_encoder.4                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.9999997 |    0.0000001 |     0.0000038 |         0.0000038 |        0.0000000 |         0.0000000 |       33.1644135 |        33.1644096 |         0.9252583 |          0.9252583 |
|  807 | head.layers.17.camera_encoder.5                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000013 |    0.0000000 |     0.0000010 |         0.0000010 |       -0.8831643 |        -0.8831642 |        8.0350962 |         8.0350971 |         0.0272648 |          0.0272648 |
|  808 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 256])     | qint8         | 1.0000000 |  0.9166571 |    0.2947314 |     6.5795946 |         6.5795946 |       -7.3033996 |        -7.1121898 |        7.7890606 |         7.5996494 |         0.0357389 |          0.0368650 |
|  809 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 6, 256])       | qint8         | 1.0000000 |  1.0000013 |    0.0000000 |     0.0000010 |         0.0000010 |       -0.8831643 |        -0.8831642 |        8.0350962 |         8.0350971 |         0.0272648 |          0.0272648 |
|  810 | head.layers.17.cam_add                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 6, 256])     | qint8         | 1.0000000 |  0.9321718 |    0.2947313 |     6.5795951 |         6.5795951 |       -6.3705730 |        -6.3093061 |       14.1512852 |        14.2693596 |         0.0630037 |          0.0641298 |
|  811 | head.layers.17.weights_fc                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 6, 64])      | qint8         | 1.0000000 |  0.9777009 |    0.3409520 |     5.4201908 |         5.4201908 |      -10.8920231 |       -11.2572441 |        9.9132204 |        10.0129881 |        -0.6145036 |         -0.6987212 |
|  812 | head.layers.17                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.9777009 |    0.3409520 |     5.4201908 |         5.4201908 |      -10.8920231 |       -11.2572441 |        9.9132204 |        10.0129881 |        -0.6145036 |         -0.6987212 |
|  813 | head.layers.17.weight_softmax                  | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.8378586 |    0.0183049 |     0.7978753 |         0.7978753 |        0.0000002 |         0.0000000 |        0.8291253 |         0.0312500 |         0.0208333 |          0.0025499 |
|  814 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  0.9368299 |    2.9927866 |   123.4970398 |       123.4970398 |      -65.2660751 |       -64.1395035 |       65.8225250 |        67.1462021 |         6.0511255 |          7.4459867 |
|  815 | head.layers.17                                 | torch.ones_like                                                               | torch.ones_like                                                         | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
|  816 | head.layers.17.point_quant_stub                | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
|  817 | head.layers.17.point_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 8, 4])       | qint16        | 1.0000000 |  0.9449250 |    1.1926399 |   123.4970398 |       123.4970398 |      -65.2660751 |       -64.1395035 |       65.8225250 |        67.1462021 |         2.8259621 |          3.2462595 |
|  818 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 1, 1, 4, 4])   | qint16        | 1.0000000 |  0.9999996 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
|  819 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 512, 8, 1, 4]) | qint16        | 1.0000000 |  0.9449250 |    1.1926399 |   123.4970398 |       123.4970398 |      -65.2660751 |       -64.1395035 |       65.8225250 |        67.1462021 |         2.8259621 |          3.2462595 |
|  820 | head.layers.17.point_matmul                    | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 4, 4]) | qint16        | 1.0000000 |  0.9660568 |    0.5295860 |   242.3570404 |       242.3570404 |     -175.3843079 |      -174.3983002 |      580.3463745 |       566.1422729 |         1.0055325 |          1.0141692 |
|  821 | head.layers.17.point_sum                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 6, 512, 8, 4])    | qint16        | 0.0244144 |  0.9660092 |    1.7625415 |   254.8898621 |     10440.1294463 |     -177.6119995 |      -176.7485962 |      628.8800659 |       634.9107056 |         4.0221300 |          4.0566769 |
|  822 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.9386753 |    2.1289709 |   123.2946243 |      5050.0707545 |      -71.4022293 |       -71.8462601 |       80.1010284 |        76.2051239 |         1.0270514 |          1.0353438 |
|  823 | head.layers.17                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.9605545 |    1.1588666 |    76.4619217 |      3131.8325245 |        0.0100000 |         0.0100000 |       80.1010284 |        76.2051239 |        11.3885317 |         11.3717737 |
|  824 | head.layers.17.reciprocal_op                   | horizon_plugin_pytorch.nn.reciprocal.Reciprocal                               | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.9365040 |   42.5349579 |    99.9840012 |     32762.2575876 |        0.0124842 |         0.0131225 |      100.0000000 |         1.2799804 |        43.2009277 |          0.6870760 |
|  825 | head.layers.17                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.9365040 |   42.5349579 |    99.9840012 |     32762.2575876 |        0.0124842 |         0.0131225 |      100.0000000 |         1.2799804 |        43.2009277 |          0.6870760 |
|  826 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0244144 |  0.9710517 |    2.4605978 |   254.8898621 |     10440.1294463 |     -177.6119995 |      -176.7485962 |      628.8800659 |       634.9107056 |         7.0307350 |          7.0956831 |
|  827 | head.layers.17.point_mul                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.8995845 | 1454.6943359 | 62221.5703125 | 203884530.5689549 |   -17761.1992188 |      -226.2347412 |    62888.0078125 |       812.6732788 |      1027.7742920 |         12.7107229 |
|  828 | head.layers.17                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.5681419 |  176.5191345 |   603.1536865 |   1976383.8427758 |     -500.0000000 |      -226.2347412 |      500.0000000 |       500.0000000 |       103.0093155 |         12.3209076 |
|  829 | head.layers.17                                 | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([156, 512, 8, 2])      | qint16        | 0.0003052 |  0.5681419 |  176.5191345 |   603.1536865 |   1976383.8427758 |     -500.0000000 |      -226.2347412 |      500.0000000 |       500.0000000 |       103.0093155 |         12.3209076 |
|  830 | head.layers.17                                 | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer            | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer      | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.7453675 |    0.3929844 |    33.3775406 |        33.3775406 |      -59.6232224 |       -55.0571556 |       48.0160789 |        52.4262428 |         0.0093888 |          0.0147048 |
|  831 | head.layers.17.feat_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.7453675 |    0.3929844 |    33.3775406 |        33.3775406 |      -59.6232224 |       -55.0571556 |       48.0160789 |        52.4262428 |         0.0093888 |          0.0147048 |
|  832 | head.layers.17                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 256, 512, 8])  | qint8         | 1.0000000 |  0.7453675 |    0.3929844 |    33.3775406 |        33.3775406 |      -59.6232224 |       -55.0571556 |       48.0160789 |        52.4262428 |         0.0093888 |          0.0147048 |
|  833 | head.layers.17                                 | torch.Tensor.permute                                                          | torch.Tensor.permute                                                    | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.7452183 |    0.3929844 |    33.3775406 |        33.3775406 |      -59.6232224 |       -55.0571556 |       48.0160789 |        52.4262428 |         0.0093888 |          0.0147048 |
|  834 | head.layers.17                                 | torch.Tensor.contiguous                                                       | torch.Tensor.contiguous                                                 | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.7452183 |    0.3929843 |    33.3775406 |        33.3775406 |      -59.6232224 |       -55.0571556 |       48.0160789 |        52.4262428 |         0.0093888 |          0.0147048 |
|  835 | head.layers.17                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.7452183 |    0.3929843 |    33.3775406 |        33.3775406 |      -59.6232224 |       -55.0571556 |       48.0160789 |        52.4262428 |         0.0093888 |          0.0147048 |
|  836 | head.layers.17                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 48, 8, 1])   | qint8         | 1.0000000 |  0.8378586 |    0.0183049 |     0.7978753 |         0.7978753 |        0.0000002 |         0.0000000 |        0.8291253 |         0.0312500 |         0.0208333 |          0.0025499 |
|  837 | head.layers.17                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.7452183 |    0.3929843 |    33.3775406 |        33.3775406 |      -59.6232224 |       -55.0571556 |       48.0160789 |        52.4262428 |         0.0093888 |          0.0147048 |
|  838 | head.layers.17.feat_mul                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.5189161 |    0.0073291 |     6.2103500 |         6.2103500 |       -6.2166033 |        -0.5640429 |        5.2191348 |         0.5658653 |        -0.0000183 |          0.0000014 |
|  839 | head.layers.17                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.5189161 |    0.0073291 |     6.2103500 |         6.2103500 |       -6.2166033 |        -0.5640429 |        5.2191348 |         0.5658653 |        -0.0000183 |          0.0000014 |
|  840 | head.layers.17.feat_sum                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6737654 |    0.2695101 |     7.1271253 |         7.1271253 |       -7.2123656 |        -1.2277672 |        7.6106024 |         1.7451510 |        -0.0008780 |          0.0000651 |
|  841 | head.layers.17.output_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6411375 |    0.3731322 |     9.0030403 |         9.0030403 |       -9.4673042 |        -1.3134321 |        8.6754427 |         1.2705754 |         0.0221906 |          0.0047271 |
|  842 | head.layers.17.proj_drop                       | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6411375 |    0.3731322 |     9.0030403 |         9.0030403 |       -9.4673042 |        -1.3134321 |        8.6754427 |         1.2705754 |         0.0221906 |          0.0047271 |
|  843 | head.layers.17.residual_op                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7811755 |    0.2864545 |     9.0030403 |         9.0030403 |       -9.4673042 |        -6.5884376 |        8.6754427 |         6.3722897 |         0.0119459 |          0.0033684 |
|  844 | head.layers.18.pre_norm                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.8297296 |    0.3417283 |     5.2205825 |         5.2205825 |       -8.0386229 |        -7.9011259 |        7.9814348 |         8.0651674 |        -0.0014659 |         -0.0002232 |
|  845 | head.layers.18.layers.0.0                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.1458487 |    2.1651165 |    11.3476715 |        11.3476715 |      -11.3476715 |         0.0000000 |       10.0940132 |        11.4789734 |        -1.6479913 |          0.2025129 |
|  846 | head.layers.18.layers.0.2                      | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.5421840 |    0.2535576 |    10.9276590 |        10.9276590 |        0.0000000 |         0.0000000 |       10.0940132 |        11.4789734 |         0.2635678 |          0.2025129 |
|  847 | head.layers.18.layers.1                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5830857 |    1.8703417 |    30.8988991 |        30.8988991 |      -26.4594040 |       -37.7168274 |       27.9108200 |        36.0714493 |         0.0709837 |          0.0225694 |
|  848 | head.layers.18.layers.2                        | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5830857 |    1.8703417 |    30.8988991 |        30.8988991 |      -26.4594040 |       -37.7168274 |       27.9108200 |        36.0714493 |         0.0709837 |          0.0225694 |
|  849 | head.layers.18.identity_fc                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  850 | head.layers.18.short_add                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.linear.LinearAdd                          | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6481836 |    2.3475428 |    37.8020515 |        37.8020515 |      -27.4008598 |       -37.1474380 |       28.1620750 |        39.3196640 |         0.0755827 |          0.0385601 |
|  851 | head.layers.19                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8386769 |    0.3988826 |     4.1153102 |         4.1153102 |       -4.3299904 |        -4.5285997 |        4.2881312 |         4.1192503 |        -0.0036711 |         -0.0033507 |
|  852 | head.layers.20.add1                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8475578 |    0.4632587 |     7.3131447 |         7.3131447 |       -4.0825052 |        -4.2122469 |        7.9972210 |         7.4748197 |         0.0303665 |          0.0315046 |
|  853 | head.layers.20.layers.0                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4261815 |    1.1542869 |    10.5964794 |        10.5964794 |       -8.1629314 |         0.0000000 |        6.9345350 |         6.4255567 |        -0.4508908 |          0.4340766 |
|  854 | head.layers.20.layers.1                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8123557 |    0.2731821 |     5.6255698 |         5.6255698 |        0.0000000 |         0.0000000 |        6.9345350 |         6.4255567 |         0.4302140 |          0.4340766 |
|  855 | head.layers.20.layers.2                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4147983 |    0.8654157 |     8.3728600 |         8.3728600 |       -8.3728600 |         0.0000000 |        6.4940114 |         5.4712787 |        -0.3396150 |          0.3023591 |
|  856 | head.layers.20.layers.3                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8078114 |    0.2021117 |     6.3753042 |         6.3753042 |        0.0000000 |         0.0000000 |        6.4940114 |         5.4712787 |         0.3236890 |          0.3023591 |
|  857 | head.layers.20.layers.4                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7645888 |    0.3531571 |     6.9489250 |         6.9489250 |       -0.7820206 |        -0.7725231 |        7.1311121 |         7.3163223 |         0.0461468 |          0.0477376 |
|  858 | head.layers.20.layers.5                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4010863 |    1.2059869 |     7.8079782 |         7.8079782 |       -7.8079782 |         0.0000000 |        7.1935453 |         6.6699982 |        -0.5328702 |          0.3765273 |
|  859 | head.layers.20.layers.6                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7899072 |    0.2705198 |     6.0775542 |         6.0775542 |        0.0000000 |         0.0000000 |        7.1935453 |         6.6699982 |         0.4025970 |          0.3765273 |
|  860 | head.layers.20.layers.7                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4853362 |    1.1444564 |     8.6574078 |         8.6574078 |       -6.3946404 |         0.0000000 |       19.1877708 |        13.9219446 |        -0.6597867 |          0.2863911 |
|  861 | head.layers.20.layers.8                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8828164 |    0.1804820 |     8.6574078 |         8.6574078 |        0.0000000 |         0.0000000 |       19.1877708 |        13.9219446 |         0.3041877 |          0.2863911 |
|  862 | head.layers.20.layers.9                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8221852 |    0.2554404 |     9.2097301 |         9.2097301 |       -0.8038625 |        -0.8235870 |       11.5821629 |        10.7275743 |         0.0331788 |          0.0319185 |
|  863 | head.layers.20.layers.10                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.4679327 |    0.7535705 |     8.7926998 |         8.7926998 |       -8.2503700 |        -4.8412566 |        6.9291296 |         6.1741776 |        -0.1666361 |          0.0110634 |
|  864 | head.layers.20.layers.11.scale_quant_stub      | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([11])                  | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0143771 |         0.0143771 |        1.0027264 |         1.0027264 |         0.2091208 |          0.2091208 |
|  865 | head.layers.20.layers.11.mul                   | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.6743839 |    0.1140160 |     3.1468191 |         3.1468191 |       -3.3436906 |        -3.3931222 |        1.6222110 |         1.4050461 |        -0.0436542 |         -0.0460995 |
|  866 | head.layers.20.add2                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.9408759 |    0.9499465 |   120.3426056 |       120.3426056 |      -61.4114532 |       -61.2451820 |       63.8156509 |        63.4035683 |        -0.2811545 |          0.0151261 |
|  867 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.9408759 |    0.9499465 |   120.3426056 |                   |      -61.4114532 |       -61.2451820 |       63.8156509 |        63.4035683 |        -0.2811545 |          0.0151261 |
|  868 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9457929 |    1.6067276 |   120.3426056 |       120.3426056 |      -61.4114532 |       -61.2451820 |       63.8156509 |        63.4035683 |         3.4511399 |          3.9868870 |
|  869 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6662903 |    5.4413862 |    81.2708969 |        81.2708969 |      -81.2708969 |         0.0000000 |       90.3814316 |        77.5268631 |        -0.0707683 |          4.8230281 |
|  870 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9559772 |    0.5279758 |    74.6690292 |        74.6690292 |        0.0000000 |         0.0000000 |       90.3814316 |        77.5268631 |         4.8426423 |          4.8230281 |
|  871 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9287415 |    0.1212320 |     3.9822969 |         3.9822969 |       -0.9516843 |        -0.9631965 |        3.9192297 |         3.4822176 |         0.0103544 |          0.0102417 |
|  872 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6157752 |    0.6060709 |     4.6541924 |         4.6541924 |       -4.6541924 |         0.0000000 |        5.5000858 |         5.6533303 |        -0.1940164 |          0.3398240 |
|  873 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9483148 |    0.0783730 |     3.7482383 |         3.7482383 |        0.0000000 |         0.0000000 |        5.5000858 |         5.6533303 |         0.3336815 |          0.3398240 |
|  874 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9737279 |    0.0845373 |     3.3746800 |         3.3746800 |       -0.9994006 |        -0.9982221 |        6.3527932 |         6.3497281 |         0.0788118 |          0.0803683 |
|  875 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6245921 |    0.6869606 |     6.1278539 |         6.1278539 |       -6.1278539 |         0.0000000 |        5.9457550 |         5.8732820 |        -0.0873979 |          0.5225943 |
|  876 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9786042 |    0.0774935 |     3.8415279 |         3.8415279 |        0.0000000 |         0.0000000 |        5.9457550 |         5.8732820 |         0.5220692 |          0.5225943 |
|  877 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9789562 |    0.0908957 |     4.2194066 |         4.2194066 |       -0.8710874 |        -0.8392401 |        5.6882563 |         5.7133956 |         0.0260892 |          0.0259493 |
|  878 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6193979 |    0.8951263 |     5.5694566 |         5.5694566 |       -5.5694566 |         0.0000000 |        8.3998251 |         8.3798695 |        -0.3400991 |          0.4856344 |
|  879 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9865916 |    0.0671846 |     3.0188124 |         3.0188124 |        0.0000000 |         0.0000000 |        8.3998251 |         8.3798695 |         0.4878426 |          0.4856344 |
|  880 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9816399 |    0.0749932 |     3.8585458 |         3.8585458 |       -0.8259857 |        -0.8309129 |        7.4459457 |         7.4205937 |         0.0266029 |          0.0264513 |
|  881 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9146768 |    0.2227771 |     1.8436557 |         1.8436557 |       -0.5917235 |         0.1163842 |        2.6177289 |         2.6139724 |         0.7545840 |          0.9343166 |
|  882 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3398853 |    0.4837066 |     2.7918489 |         2.7918489 |       -2.7918489 |         0.0000000 |        1.4943837 |         1.4838407 |        -0.3150853 |          0.1270864 |
|  883 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9308348 |    0.0328701 |     1.4239479 |         1.4239479 |        0.0000000 |         0.0000000 |        1.4943837 |         1.4838407 |         0.1357512 |          0.1270864 |
|  884 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9309498 |    0.1274294 |     4.2712789 |         4.2712789 |       -0.7245209 |        -0.6214619 |        3.9325361 |         3.9253945 |         0.0216174 |          0.0221447 |
|  885 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.5254207 |    0.3000733 |     2.0490496 |         2.0490496 |       -1.7734113 |         0.0000000 |        1.6176540 |         1.1328273 |        -0.0188177 |          0.2053949 |
|  886 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8742398 |    0.0700332 |     1.4277484 |         1.4277484 |        0.0000000 |         0.0000000 |        1.6176540 |         1.1328273 |         0.2112224 |          0.2053949 |
|  887 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8746064 |    0.2376021 |     3.8898766 |         3.8898766 |       -0.9084286 |        -0.9232904 |        3.3141389 |         3.2634864 |         0.0077311 |          0.0093573 |
|  888 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.5987380 |    0.4837955 |     2.8362179 |         2.8362179 |       -2.1092327 |         0.0000000 |        2.2705166 |         1.7658021 |        -0.0414776 |          0.3058710 |
|  889 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8933774 |    0.1189839 |     2.1266274 |         2.1266274 |        0.0000000 |         0.0000000 |        2.2705166 |         1.7658021 |         0.3233340 |          0.3058710 |
|  890 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8621680 |    0.2499082 |     4.2342815 |         4.2342815 |       -0.7934900 |        -0.7577391 |        3.5861773 |         3.4018760 |         0.0113513 |          0.0124808 |
|  891 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.6822186 |    0.4519136 |     3.6402483 |         3.6402483 |       -2.5896089 |         0.0000000 |        2.9839382 |         2.9447837 |         0.1368447 |          0.4463114 |
|  892 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9028763 |    0.1635347 |     2.8441770 |         2.8441770 |        0.0000000 |         0.0000000 |        2.9839382 |         2.9447837 |         0.4252236 |          0.4463114 |
|  893 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7579433 |    0.3395864 |     4.5583711 |         4.5583711 |       -1.1276072 |        -1.1708668 |        3.9533150 |         3.8964591 |         0.0269888 |          0.0342139 |
|  894 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 2])          | qint16        | 1.0000000 |  0.9812081 |    0.0749857 |     0.5302731 |         0.5302731 |       -1.5940011 |        -1.1709018 |        0.1751247 |         0.1162078 |        -0.5374374 |         -0.5164018 |
|  895 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7899683 |    0.1672976 |     1.4104571 |         1.4104571 |       -1.4104571 |         0.0000000 |        1.4950578 |         1.2849262 |         0.1378799 |          0.2603890 |
|  896 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9809589 |    0.0307144 |     0.4921702 |         0.4921702 |        0.0000000 |         0.0000000 |        1.4950578 |         1.2849262 |         0.2744632 |          0.2603890 |
|  897 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9804043 |    0.0741193 |     1.1067793 |         1.1067793 |       -1.2578577 |        -1.2197527 |        3.0007057 |         2.9929183 |        -0.0002069 |         -0.0004603 |
|  898 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.4464929 |    0.4656790 |     3.2632475 |         3.2632475 |       -3.2632475 |         0.0000000 |        1.8493295 |         1.9062634 |        -0.1651650 |          0.2150241 |
|  899 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8823533 |    0.0731931 |     1.0485559 |         1.0485559 |        0.0000000 |         0.0000000 |        1.8493295 |         1.9062634 |         0.2273209 |          0.2150241 |
|  900 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8920458 |    0.2090399 |     3.2659729 |         3.2659729 |       -0.9142419 |        -0.8787364 |        4.0649195 |         4.0889006 |         0.0089576 |          0.0063147 |
|  901 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3539858 |    0.6571617 |     3.7682583 |         3.7682583 |       -3.7682583 |         0.0000000 |        2.5286319 |         1.9377078 |        -0.2451560 |          0.2636159 |
|  902 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7956181 |    0.1240438 |     2.2386844 |         2.2386844 |        0.0000000 |         0.0000000 |        2.5286319 |         1.9377078 |         0.2879619 |          0.2636159 |
|  903 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8061285 |    0.2951278 |     4.2963595 |         4.2963595 |       -0.9107977 |        -0.9075544 |        3.6726184 |         3.5626533 |         0.0131945 |          0.0110467 |
|  904 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3347160 |    0.8986202 |     5.1870527 |         5.1870527 |       -5.1870527 |         0.0000000 |        2.4999964 |         2.6191106 |        -0.4997087 |          0.2458751 |
|  905 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8590425 |    0.1333440 |     2.3649197 |         2.3649197 |        0.0000000 |         0.0000000 |        2.4999964 |         2.6191106 |         0.2655676 |          0.2458751 |
|  906 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7479300 |    0.2961020 |     4.3707428 |         4.3707428 |       -0.9791188 |        -0.8451878 |        4.9246840 |         4.9035611 |         0.0789727 |          0.0693044 |
|  907 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9267346 |    1.6036417 |    56.4928474 |        56.4928474 |      -47.7116699 |       -44.7710266 |       14.7879381 |        11.9606915 |        -4.8783312 |         -4.5214729 |
|  908 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.6471172 |    3.2675962 |    31.0316238 |        31.0316238 |      -28.3913002 |         0.0000000 |       25.7778225 |        26.4156742 |        -0.2606266 |          2.5081594 |
|  909 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9427844 |    0.5040427 |    25.9349976 |        25.9349976 |        0.0000000 |         0.0000000 |       25.7778225 |        26.4156742 |         2.5029268 |          2.5081594 |
|  910 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8808650 |    0.1897332 |     4.2446985 |         4.2446985 |       -0.9007550 |        -0.8914598 |        3.5551860 |         3.5342040 |         0.0194083 |          0.0176342 |
|  911 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.4578798 |    0.4464912 |     3.4469709 |         3.4469709 |       -2.8872979 |         0.0000000 |        3.1711819 |         2.8406754 |        -0.1362938 |          0.2173219 |
|  912 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8434453 |    0.0952109 |     2.9284244 |         2.9284244 |        0.0000000 |         0.0000000 |        3.1711819 |         2.8406754 |         0.2149865 |          0.2173219 |
|  913 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8371875 |    0.2508913 |     4.8940840 |         4.8940840 |       -0.9139361 |        -0.8778539 |        4.7337422 |         4.1780982 |         0.0365555 |          0.0332260 |
|  914 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.5354795 |    0.7211871 |     5.1821218 |         5.1821218 |       -4.7908802 |         0.0000000 |        3.6071508 |         3.6171730 |        -0.1640110 |          0.3819852 |
|  915 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8573098 |    0.1650359 |     3.6171730 |         3.6171730 |        0.0000000 |         0.0000000 |        3.6071508 |         3.6171730 |         0.3921402 |          0.3819852 |
|  916 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8261048 |    0.2561660 |     5.6455603 |         5.6455603 |       -0.8897209 |        -0.8696393 |        5.2060728 |         4.9442825 |         0.0209293 |          0.0200726 |
|  917 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.5696287 |    0.7112867 |     5.9017711 |         5.9017711 |       -4.7296329 |         0.0000000 |        4.4680443 |         4.8385744 |        -0.2079535 |          0.3399459 |
|  918 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8722224 |    0.1517266 |     4.4076200 |         4.4076200 |        0.0000000 |         0.0000000 |        4.4680443 |         4.8385744 |         0.3516066 |          0.3399459 |
|  919 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8320302 |    0.2301921 |     5.7950344 |         5.7950344 |       -0.7911164 |        -0.7159860 |        5.3595438 |         5.7064395 |         0.0252123 |          0.0275159 |
|  920 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8860939 |    0.1745057 |     5.7950344 |         5.7950344 |       -1.1276072 |        -1.1708668 |        7.4459457 |         7.4205937 |         0.0328497 |          0.0330444 |
|  921 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 256, 512])        | qint16        | 0.0001526 |  0.4880881 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
|  922 | head.layers.21.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.8628103 |    0.2866941 |     5.7950344 |     12659.2526685 |       -4.3299904 |        -4.5285997 |        7.4459457 |         7.4205937 |         0.0145893 |          0.0148469 |
|  923 | head.layers.21.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 512])        | qint16        | 0.0001786 |  0.9560770 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
|  924 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8628223 |    0.2866941 |     5.7950344 |     12659.2526685 |       -4.3299904 |        -4.5285997 |        7.4459457 |         7.4205937 |         0.0145893 |          0.0148469 |
|  925 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.9560735 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
|  926 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.4880882 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
|  927 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8628223 |    0.2866941 |     5.7950344 |     12659.2526685 |       -4.3299904 |        -4.5285997 |        7.4459457 |         7.4205937 |         0.0145893 |          0.0148469 |
|  928 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.9560735 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
|  929 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.4880882 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
|  930 | head.layers.21.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9229110 |    0.6831219 |     7.8726835 |         7.8726835 |       -9.7280598 |        -9.1553411 |       12.3472891 |        11.6906776 |         0.0724805 |          0.0830354 |
|  931 | head.layers.21.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.9446145 |    0.1713149 |     9.9460716 |         9.9460716 |      -10.4969025 |        -7.0630894 |        9.8413448 |         9.3779888 |        -0.0799089 |         -0.0812439 |
|  932 | head.layers.21.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.5036409 |    0.0175903 |     1.1606131 |         1.1606131 |       -1.1415849 |        -0.6970952 |        1.1397589 |         0.6307871 |        -0.0002788 |         -0.0002424 |
|  933 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.9229110 |    0.6831219 |     7.8726835 |         7.8726835 |       -9.7280598 |        -9.1553411 |       12.3472891 |        11.6906776 |         0.0724805 |          0.0830354 |
|  934 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.9229633 |    0.6831219 |     7.8726835 |         7.8726835 |       -9.7280598 |        -9.1553411 |       12.3472891 |        11.6906776 |         0.0724805 |          0.0830354 |
|  935 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  936 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  937 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 208, 64])        | qint8         | 1.0000000 |  0.0110092 |    1.4320352 |     9.4638205 |         9.4638205 |       -1.1415849 |        -7.0630894 |        1.1397589 |         9.3779888 |        -0.0002788 |         -0.0812439 |
|  938 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 256, 64])        | qint8         | 1.0000000 |  0.0110101 |    1.4320352 |     9.4638205 |         9.4638205 |       -1.1415849 |        -7.0630894 |        1.1397589 |         9.3779888 |        -0.0002788 |         -0.0812439 |
|  939 | head.layers.21.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.9229633 |    0.0853902 |     0.9840854 |         7.8726835 |       -1.2160075 |        -1.1444176 |        1.5434111 |         1.4613347 |         0.0090601 |          0.0103794 |
|  940 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  941 | head.layers.21.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.8220216 |    4.4715476 |   165.6216278 |       165.6216278 |      -92.0512543 |       -87.6348724 |      150.9754791 |        84.2201462 |         1.8143772 |          1.3761357 |
|  942 | head.layers.21.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.2375555 |    0.0007153 |     0.9999915 |         0.9999915 |        0.0000000 |         0.0000000 |        0.9999915 |         0.0312500 |         0.0039062 |          0.0035063 |
|  943 | head.layers.21.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.2375555 |    0.0007153 |     0.9999915 |         0.9999915 |        0.0000000 |         0.0000000 |        0.9999915 |         0.0312500 |         0.0039062 |          0.0035063 |
|  944 | head.layers.21.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.4302758 |    0.0151486 |     0.8621557 |         0.8621557 |       -0.7978977 |        -0.4857439 |        0.8341381 |         0.4596162 |        -0.0000765 |         -0.0006610 |
|  945 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  946 | head.layers.21.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  947 | head.layers.21.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.7601025 |    0.0210867 |     0.8331994 |         0.8331994 |       -0.8459857 |        -0.4365775 |        1.0013692 |         0.4269225 |         0.0062259 |          0.0066074 |
|  948 | head.layers.21.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 256])     | qint8         | 1.0000000 |  0.2375555 |    0.0007153 |     0.9999915 |         0.9999915 |        0.0000000 |         0.0000000 |        0.9999915 |         0.0312500 |         0.0039062 |          0.0035063 |
|  949 | head.layers.21.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4376347 |    0.0006100 |     0.5414749 |         0.5414749 |        0.0000000 |         0.0000000 |        0.5453861 |         0.0144771 |         0.0039062 |          0.0035063 |
|  950 | head.layers.21.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  951 | head.layers.21.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7602104 |    0.0210867 |     0.8331994 |         0.8331994 |       -0.8459857 |        -0.4365775 |        1.0013692 |         0.4269225 |         0.0062259 |          0.0066074 |
|  952 | head.layers.21.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.8580279 |    0.2957306 |     5.7922440 |         5.7922440 |       -4.7941599 |        -4.6880941 |        7.4059176 |         7.4592676 |         0.0208152 |          0.0214543 |
|  953 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.8486995 |    0.3500187 |     5.5700531 |      3650.3343003 |      -11.7414236 |       -10.4825926 |       12.3587284 |         9.8582373 |         0.0120853 |          0.0098125 |
|  954 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 512])        | qint16        | 0.0001526 |  0.7488167 |    0.1168981 |     2.2436492 |     14703.7553260 |       -3.8849745 |        -3.4000278 |        3.4935579 |         3.1680784 |        -0.0003597 |          0.0028658 |
|  955 | head.layers.22.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.8689835 |    0.2622622 |     5.7950344 |     12659.2526685 |      -11.7414236 |       -10.4825926 |       12.3587284 |         9.8582373 |         0.0224675 |          0.0214285 |
|  956 | head.layers.22.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0002278 |  0.8689835 |    0.2622622 |     5.7950344 |     25444.5123636 |      -11.7414236 |       -10.4825926 |       12.3587284 |         9.8582373 |         0.0224675 |          0.0214285 |
|  957 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8690383 |    0.2622622 |     5.7950344 |     12659.2526685 |      -11.7414236 |       -10.4825926 |       12.3587284 |         9.8582373 |         0.0224675 |          0.0214285 |
|  958 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0002278 |  0.8690383 |    0.2622622 |     5.7950344 |     25444.5123636 |      -11.7414236 |       -10.4825926 |       12.3587284 |         9.8582373 |         0.0224675 |          0.0214285 |
|  959 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.7488836 |    0.1168981 |     2.2436492 |     14703.7553260 |       -3.8849745 |        -3.4000278 |        3.4935579 |         3.1680784 |        -0.0003597 |          0.0028658 |
|  960 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8690383 |    0.2622622 |     5.7950344 |     12659.2526685 |      -11.7414236 |       -10.4825926 |       12.3587284 |         9.8582373 |         0.0224675 |          0.0214285 |
|  961 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0002278 |  0.8690383 |    0.2622622 |     5.7950344 |     25444.5123636 |      -11.7414236 |       -10.4825926 |       12.3587284 |         9.8582373 |         0.0224675 |          0.0214285 |
|  962 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.7488836 |    0.1168981 |     2.2436492 |     14703.7553260 |       -3.8849745 |        -3.4000278 |        3.4935579 |         3.1680784 |        -0.0003597 |          0.0028658 |
|  963 | head.layers.22.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9117771 |    0.4625884 |     6.7102079 |         6.7102079 |       -9.8667269 |        -9.7368813 |        9.0970459 |         8.7560349 |         0.0302589 |          0.0200406 |
|  964 | head.layers.22.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9441436 |    0.4477644 |     7.4589677 |         7.4589677 |      -14.4415436 |       -14.6856642 |       14.7910118 |        14.5077915 |         0.0326098 |          0.0154725 |
|  965 | head.layers.22.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.7332408 |    0.1329738 |     3.9711611 |         3.9711611 |       -4.0362515 |        -2.5306711 |        3.2902493 |         2.7986026 |         0.0067824 |          0.0081619 |
|  966 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.9117771 |    0.4625884 |     6.7102079 |         6.7102079 |       -9.8667269 |        -9.7368813 |        9.0970459 |         8.7560349 |         0.0302589 |          0.0200406 |
|  967 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.9117553 |    0.4625884 |     6.7102079 |         6.7102079 |       -9.8667269 |        -9.7368813 |        9.0970459 |         8.7560349 |         0.0302589 |          0.0200406 |
|  968 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.3893849 |    1.5494993 |    12.5896330 |        12.5896330 |      -14.4415436 |        -9.7368813 |       14.7910118 |         8.7560349 |         0.0326098 |          0.0200406 |
|  969 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.3893942 |    1.5494993 |    12.5896330 |        12.5896330 |      -14.4415436 |        -9.7368813 |       14.7910118 |         8.7560349 |         0.0326098 |          0.0200406 |
|  970 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0041621 |    1.4198445 |    15.3922443 |        15.3922443 |       -4.0362515 |       -14.6856642 |        3.2902493 |        14.5077915 |         0.0067824 |          0.0154725 |
|  971 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0041620 |    1.4198445 |    15.3922443 |        15.3922443 |       -4.0362515 |       -14.6856642 |        3.2902493 |        14.5077915 |         0.0067824 |          0.0154725 |
|  972 | head.layers.22.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.9117553 |    0.0578236 |     0.8387760 |         6.7102079 |       -1.2333409 |        -1.2171102 |        1.1371307 |         1.0945044 |         0.0037824 |          0.0025051 |
|  973 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  974 | head.layers.22.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.9738427 |    2.3080120 |    54.6063995 |        54.6063995 |      -80.0654144 |       -74.4190750 |      126.3224258 |       120.4988708 |         1.3015550 |          1.7933092 |
|  975 | head.layers.22.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.4731957 |    0.0014770 |     0.9891287 |         0.9891287 |        0.0000000 |         0.0000000 |        0.9997348 |         0.0312500 |         0.0019531 |          0.0012285 |
|  976 | head.layers.22.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.4731957 |    0.0014770 |     0.9891287 |         0.9891287 |        0.0000000 |         0.0000000 |        0.9997348 |         0.0312500 |         0.0019531 |          0.0012285 |
|  977 | head.layers.22.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.5122814 |    0.1252349 |     1.7641037 |         1.7641037 |       -2.1900775 |        -1.5012517 |        1.6629556 |         2.1263106 |         0.0048753 |          0.0047139 |
|  978 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  979 | head.layers.22.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  980 | head.layers.22.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.5935094 |    0.1945754 |     1.5094202 |         1.5094202 |       -1.6154630 |        -1.1449928 |        1.5701588 |         1.1167853 |         0.0083483 |          0.0014632 |
|  981 | head.layers.22.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 512])     | qint8         | 1.0000000 |  0.4731957 |    0.0014770 |     0.9891287 |         0.9891287 |        0.0000000 |         0.0000000 |        0.9997348 |         0.0312500 |         0.0019531 |          0.0012285 |
|  982 | head.layers.22.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.6146889 |    0.0011336 |     0.2300375 |         0.2300375 |        0.0000001 |         0.0000001 |        0.2454242 |         0.0165723 |         0.0019531 |          0.0012285 |
|  983 | head.layers.22.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
|  984 | head.layers.22.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.5935187 |    0.1945754 |     1.5094202 |         1.5094202 |       -1.6154630 |        -1.1449928 |        1.5701588 |         1.1167853 |         0.0083483 |          0.0014632 |
|  985 | head.layers.22.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.8524272 |    0.3492086 |     6.1359644 |         6.1359644 |      -11.4163361 |       -10.2843170 |       11.8673887 |         9.0332108 |         0.0308158 |          0.0228916 |
|  986 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.9460892 |    0.5186262 |    10.4608965 |      6855.5485176 |      -30.0457478 |       -26.5911045 |       24.0933380 |        23.3509464 |        -0.0269025 |         -0.0337257 |
|  987 | head.layers.23                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9481273 |    0.2025692 |     3.5312738 |         3.5312738 |       -6.8424716 |        -6.6440630 |        5.5381179 |         5.4979649 |        -0.0016154 |         -0.0021584 |
|  988 | head.layers.24.kps_generator.offset            | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 24])         | qint16        | 1.0000000 |  0.9713609 |    0.1969279 |     4.1172953 |         4.1172953 |       -5.8927641 |        -7.3424683 |        4.7394843 |         5.0581098 |        -0.2875610 |         -0.3128073 |
|  989 | head.layers.24.kps_generator                   | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.9713609 |    0.1969279 |     4.1172953 |         4.1172953 |       -5.8927641 |        -7.3424683 |        4.7394843 |         5.0581098 |        -0.2875610 |         -0.3128073 |
|  990 | head.layers.24.kps_generator                   | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 3])       | qint16        | 1.0000000 |  0.9457929 |    1.6067276 |   120.3426056 |       120.3426056 |      -61.4114532 |       -61.2451820 |       63.8156509 |        63.4035683 |         3.4511399 |          3.9868870 |
|  991 | head.layers.24.kps_generator.keypoints_add     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.9444564 |    1.6543185 |   123.8740082 |       123.8740082 |      -65.4252319 |       -66.7167435 |       66.4458466 |        66.2905426 |         3.1635785 |          3.6740794 |
|  992 | head.layers.24.weight_add                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9082689 |    0.3136970 |     6.6604209 |         6.6604209 |       -7.3953581 |        -7.1717091 |        7.7906666 |         7.5441785 |         0.0312343 |          0.0308860 |
|  993 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 3, 4])         | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3522723 |          0.3522723 |
|  994 | head.layers.24                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 6, 12])           | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3522723 |          0.3522723 |
|  995 | head.layers.24.camera_encoder.0                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.6457888 |    0.3940585 |     6.2766805 |         6.2766805 |       -6.2766805 |         0.0000000 |        6.1695027 |         6.1695027 |        -0.0835410 |          0.3105174 |
|  996 | head.layers.24.camera_encoder.1                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000010 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0000000 |         0.0000000 |        6.1695027 |         6.1695027 |         0.3105174 |          0.3105174 |
|  997 | head.layers.24.camera_encoder.2                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000005 |    0.0000000 |     0.0000005 |         0.0000005 |       -0.8781247 |        -0.8781247 |        4.2228565 |         4.2228570 |         0.0260929 |          0.0260929 |
|  998 | head.layers.24.camera_encoder.3                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.8347589 |    1.2374737 |    14.0334568 |        14.0334568 |      -14.0334568 |         0.0000000 |       33.1354103 |        33.1354103 |        -0.2934401 |          0.9440337 |
|  999 | head.layers.24.camera_encoder.4                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.9999996 |    0.0000000 |     0.0000029 |         0.0000029 |        0.0000000 |         0.0000000 |       33.1354103 |        33.1354103 |         0.9440337 |          0.9440337 |
| 1000 | head.layers.24.camera_encoder.5                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.9999995 |    0.0000000 |     0.0000010 |         0.0000010 |       -1.1285971 |        -1.1285970 |        7.7228999 |         7.7228999 |         0.0302501 |          0.0302501 |
| 1001 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 256])     | qint8         | 1.0000000 |  0.9082689 |    0.3136970 |     6.6604209 |         6.6604209 |       -7.3953581 |        -7.1717091 |        7.7906666 |         7.5441785 |         0.0312343 |          0.0308860 |
| 1002 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 6, 256])       | qint8         | 1.0000000 |  0.9999995 |    0.0000000 |     0.0000010 |         0.0000010 |       -1.1285971 |        -1.1285970 |        7.7228999 |         7.7228999 |         0.0302501 |          0.0302501 |
| 1003 | head.layers.24.cam_add                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 6, 256])     | qint8         | 1.0000000 |  0.9243074 |    0.3136970 |     6.6604209 |         6.6604209 |       -5.9532933 |        -5.7296443 |       14.7214832 |        14.5108013 |         0.0614844 |          0.0611361 |
| 1004 | head.layers.24.weights_fc                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 6, 64])      | qint8         | 1.0000000 |  0.9628303 |    0.3627266 |     5.8058138 |         5.8058138 |      -10.4819231 |       -10.0961800 |        7.9159746 |         8.0423632 |        -0.3607157 |         -0.4477729 |
| 1005 | head.layers.24                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.9628303 |    0.3627266 |     5.8058138 |         5.8058138 |      -10.4819231 |       -10.0961800 |        7.9159746 |         8.0423632 |        -0.3607157 |         -0.4477729 |
| 1006 | head.layers.24.weight_softmax                  | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.8430026 |    0.0184553 |     0.7398361 |         0.7398361 |        0.0000001 |         0.0000000 |        0.7710861 |         0.0312500 |         0.0208333 |          0.0024418 |
| 1007 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  0.9359101 |    3.1231716 |   123.8740082 |       123.8740082 |      -65.4252319 |       -66.7167435 |       66.4458466 |        66.2905426 |         5.3505440 |          6.5968666 |
| 1008 | head.layers.24                                 | torch.ones_like                                                               | torch.ones_like                                                         | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
| 1009 | head.layers.24.point_quant_stub                | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
| 1010 | head.layers.24.point_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 8, 4])       | qint16        | 1.0000000 |  0.9444600 |    1.2407387 |   123.8740082 |       123.8740082 |      -65.4252319 |       -66.7167435 |       66.4458466 |        66.2905426 |         2.6226840 |          3.0055597 |
| 1011 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 1, 1, 4, 4])   | qint16        | 1.0000000 |  0.9999996 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
| 1012 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 512, 8, 1, 4]) | qint16        | 1.0000000 |  0.9444600 |    1.2407387 |   123.8740082 |       123.8740082 |      -65.4252319 |       -66.7167435 |       66.4458466 |        66.2905426 |         2.6226840 |          3.0055597 |
| 1013 | head.layers.24.point_matmul                    | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 4, 4]) | qint16        | 1.0000000 |  0.9653527 |    0.5477552 |   234.4601440 |       234.4601440 |     -178.1869659 |      -176.7801361 |      574.7498779 |       551.1538696 |         1.1166037 |          1.1233448 |
| 1014 | head.layers.24.point_sum                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 6, 512, 8, 4])    | qint16        | 0.0244144 |  0.9660270 |    1.8166327 |   247.4318237 |     10134.6528575 |     -180.2084503 |      -178.8238525 |      621.8524780 |       619.4092407 |         4.4664145 |          4.4933791 |
| 1015 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.9381572 |    2.2129130 |   123.6708374 |      5065.4802069 |      -71.6775208 |       -72.9383316 |       78.7740479 |        75.6960602 |         1.2317526 |          1.2382271 |
| 1016 | head.layers.24                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.9602607 |    1.2004343 |    75.9844818 |      3112.2768854 |        0.0100000 |         0.0100000 |       78.7740479 |        75.6960602 |        11.5530539 |         11.5829239 |
| 1017 | head.layers.24.reciprocal_op                   | horizon_plugin_pytorch.nn.reciprocal.Reciprocal                               | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.9532312 |   40.9982643 |    99.9836731 |     32762.1500893 |        0.0126945 |         0.0132107 |      100.0000000 |         1.2799804 |        41.6254539 |          0.6453857 |
| 1018 | head.layers.24                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.9532312 |   40.9982643 |    99.9836731 |     32762.1500893 |        0.0126945 |         0.0132107 |      100.0000000 |         1.2799804 |        41.6254539 |          0.6453857 |
| 1019 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0244144 |  0.9711590 |    2.5268090 |   247.4318237 |     10134.6528575 |     -180.2084503 |      -178.8238525 |      621.8524780 |       619.4092407 |         7.8169532 |          7.8676448 |
| 1020 | head.layers.24.point_mul                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.9019801 | 1497.9995117 | 61538.2421875 | 201645435.1348399 |   -18020.8457031 |      -228.8910370 |    62185.2460938 |       792.8317261 |      1069.8862305 |         13.2730150 |
| 1021 | head.layers.24                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.5843124 |  174.1085052 |   603.5291748 |   1977614.2240017 |     -500.0000000 |      -228.8910370 |      500.0000000 |       500.0000000 |       104.2830048 |         12.9153614 |
| 1022 | head.layers.24                                 | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([156, 512, 8, 2])      | qint16        | 0.0003052 |  0.5843124 |  174.1085052 |   603.5291748 |   1977614.2240017 |     -500.0000000 |      -228.8910370 |      500.0000000 |       500.0000000 |       104.2830048 |         12.9153614 |
| 1023 | head.layers.24                                 | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer            | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer      | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.6976991 |    0.4900137 |    40.5277824 |        40.5277824 |      -58.8192863 |       -61.6767960 |       47.2005920 |        49.6896744 |         0.0207807 |          0.0232080 |
| 1024 | head.layers.24.feat_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.6976991 |    0.4900137 |    40.5277824 |        40.5277824 |      -58.8192863 |       -61.6767960 |       47.2005920 |        49.6896744 |         0.0207807 |          0.0232080 |
| 1025 | head.layers.24                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 256, 512, 8])  | qint8         | 1.0000000 |  0.6976991 |    0.4900137 |    40.5277824 |        40.5277824 |      -58.8192863 |       -61.6767960 |       47.2005920 |        49.6896744 |         0.0207807 |          0.0232080 |
| 1026 | head.layers.24                                 | torch.Tensor.permute                                                          | torch.Tensor.permute                                                    | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.6975061 |    0.4900137 |    40.5277824 |        40.5277824 |      -58.8192863 |       -61.6767960 |       47.2005920 |        49.6896744 |         0.0207807 |          0.0232080 |
| 1027 | head.layers.24                                 | torch.Tensor.contiguous                                                       | torch.Tensor.contiguous                                                 | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.6975061 |    0.4900137 |    40.5277824 |        40.5277824 |      -58.8192863 |       -61.6767960 |       47.2005920 |        49.6896744 |         0.0207807 |          0.0232080 |
| 1028 | head.layers.24                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.6975061 |    0.4900137 |    40.5277824 |        40.5277824 |      -58.8192863 |       -61.6767960 |       47.2005920 |        49.6896744 |         0.0207807 |          0.0232080 |
| 1029 | head.layers.24                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 48, 8, 1])   | qint8         | 1.0000000 |  0.8430026 |    0.0184553 |     0.7398361 |         0.7398361 |        0.0000001 |         0.0000000 |        0.7710861 |         0.0312500 |         0.0208333 |          0.0024418 |
| 1030 | head.layers.24                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.6975061 |    0.4900137 |    40.5277824 |        40.5277824 |      -58.8192863 |       -61.6767960 |       47.2005920 |        49.6896744 |         0.0207807 |          0.0232080 |
| 1031 | head.layers.24.feat_mul                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.5715830 |    0.0091128 |     6.1726732 |         6.1726732 |       -6.6026678 |        -0.8209206 |        6.0349498 |         0.8734278 |         0.0001409 |          0.0000204 |
| 1032 | head.layers.24                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.5715830 |    0.0091128 |     6.1726732 |         6.1726732 |       -6.6026678 |        -0.8209206 |        6.0349498 |         0.8734278 |         0.0001409 |          0.0000204 |
| 1033 | head.layers.24.feat_sum                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7054950 |    0.3336383 |     7.8804269 |         7.8804269 |       -7.6226215 |        -1.4775844 |        8.1485882 |         1.5359006 |         0.0067653 |          0.0009809 |
| 1034 | head.layers.24.output_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6914578 |    0.4817154 |     8.9963198 |         8.9963198 |       -9.4317131 |        -1.4590271 |        9.1399231 |         1.5073469 |        -0.0066487 |         -0.0014726 |
| 1035 | head.layers.24.proj_drop                       | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6914578 |    0.4817154 |     8.9963198 |         8.9963198 |       -9.4317131 |        -1.4590271 |        9.1399231 |         1.5073469 |        -0.0066487 |         -0.0014726 |
| 1036 | head.layers.24.residual_op                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7351879 |    0.3421423 |     8.9963198 |         8.9963198 |       -9.4317131 |        -6.6440630 |        9.1399231 |         5.4979649 |        -0.0041320 |         -0.0018155 |
| 1037 | head.layers.25.pre_norm                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7883964 |    0.3726358 |     5.5854321 |         5.5854321 |       -6.7459044 |        -6.7396555 |        7.6086082 |         7.3051763 |         0.0015096 |          0.0008075 |
| 1038 | head.layers.25.layers.0.0                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.2532082 |    2.0745215 |    12.3273811 |        12.3273811 |      -12.3273811 |         0.0000000 |       13.0906343 |        13.7491446 |        -1.2311640 |          0.2826326 |
| 1039 | head.layers.25.layers.0.2                      | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.6030374 |    0.3886211 |    11.6022387 |        11.6022387 |        0.0000000 |         0.0000000 |       13.0906343 |        13.7491446 |         0.4547364 |          0.2826326 |
| 1040 | head.layers.25.layers.1                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7081113 |    3.4183612 |    35.5951958 |        35.5951958 |      -51.7966499 |       -59.4333954 |       48.4732132 |        57.9318047 |        -0.0617741 |         -0.0589359 |
| 1041 | head.layers.25.layers.2                        | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7081113 |    3.4183612 |    35.5951958 |        35.5951958 |      -51.7966499 |       -59.4333954 |       48.4732132 |        57.9318047 |        -0.0617741 |         -0.0589359 |
| 1042 | head.layers.25.identity_fc                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1043 | head.layers.25.short_add                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.linear.LinearAdd                          | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7157453 |    3.8905892 |    40.8215065 |        40.8215065 |      -53.9970856 |       -63.4938736 |       48.4311714 |        58.8109055 |        -0.0646309 |         -0.0415624 |
| 1044 | head.layers.26                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8640293 |    0.3813644 |     3.5540786 |         3.5540786 |       -5.0196371 |        -3.9441435 |        3.6160355 |         3.5104215 |        -0.0025196 |         -0.0009779 |
| 1045 | head.layers.27.add1                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8622503 |    0.4642262 |     5.7193537 |         5.7193537 |       -4.3831224 |        -3.5577016 |        8.0424423 |         7.4530840 |         0.0303301 |          0.0320665 |
| 1046 | head.layers.27.layers.0                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.3616081 |    1.2446103 |     7.7046127 |         7.7046127 |       -7.7046127 |         0.0000000 |        7.0916977 |         6.3396339 |        -0.6096495 |          0.3840659 |
| 1047 | head.layers.27.layers.1                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7968118 |    0.2495722 |     6.5265598 |         6.5265598 |        0.0000000 |         0.0000000 |        7.0916977 |         6.3396339 |         0.3853886 |          0.3840659 |
| 1048 | head.layers.27.layers.2                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.3995671 |    0.7988170 |     7.0394697 |         7.0394697 |       -7.0394697 |         0.0000000 |        5.3205628 |         5.3443007 |        -0.2734977 |          0.3038733 |
| 1049 | head.layers.27.layers.3                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7773287 |    0.2094914 |     4.3749471 |         4.3749471 |        0.0000000 |         0.0000000 |        5.3205628 |         5.3443007 |         0.3158279 |          0.3038733 |
| 1050 | head.layers.27.layers.4                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7193449 |    0.3987950 |     6.2699280 |         6.2699280 |       -0.7943422 |        -0.7816149 |        7.3012543 |         6.5059786 |         0.0450760 |          0.0437139 |
| 1051 | head.layers.27.layers.5                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.3206095 |    1.1657423 |     7.8192716 |         7.8192716 |       -7.8192716 |         0.0000000 |        8.7558870 |         7.3163943 |        -0.5356244 |          0.3543956 |
| 1052 | head.layers.27.layers.6                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7140211 |    0.2858213 |     7.1856537 |         7.1856537 |        0.0000000 |         0.0000000 |        8.7558870 |         7.3163943 |         0.3442967 |          0.3543956 |
| 1053 | head.layers.27.layers.7                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4090248 |    1.0320758 |    11.0203085 |        11.0203085 |       -5.2401104 |         0.0000000 |       16.0399685 |        14.3179665 |        -0.5904789 |          0.2509447 |
| 1054 | head.layers.27.layers.8                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7967022 |    0.1950804 |    11.0203085 |        11.0203085 |        0.0000000 |         0.0000000 |       16.0399685 |        14.3179665 |         0.2465165 |          0.2509447 |
| 1055 | head.layers.27.layers.9                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7425171 |    0.3377547 |    11.2313919 |        11.2313919 |       -0.7326477 |        -0.7439281 |       12.6764984 |        12.3648987 |         0.0311046 |          0.0319105 |
| 1056 | head.layers.27.layers.10                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.5109836 |    0.8438687 |     9.0116100 |         9.0116100 |       -6.8275924 |        -5.2525949 |        5.6514730 |         5.2942429 |        -0.3400244 |         -0.2386942 |
| 1057 | head.layers.27.layers.11.scale_quant_stub      | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([11])                  | qint16        | 1.0000000 |  1.0000000 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0068504 |         0.0068504 |        0.9011661 |         0.9011661 |         0.1431663 |          0.1431663 |
| 1058 | head.layers.27.layers.11.mul                   | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.7373483 |    0.0801154 |     2.2562373 |         2.2562373 |       -2.4859557 |        -1.8711957 |        1.0939751 |         0.9202507 |        -0.0663723 |         -0.0900743 |
| 1059 | head.layers.27.add2                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.9412441 |    0.9475492 |   120.2425308 |       120.2425308 |      -61.3783417 |       -61.3296623 |       64.8984909 |        63.4776192 |        -0.3475268 |         -0.0749481 |
| 1060 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.9412441 |    0.9475492 |   120.2425308 |                   |      -61.3783417 |       -61.3296623 |       64.8984909 |        63.4776192 |        -0.3475268 |         -0.0749481 |
| 1061 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9456981 |    1.6420337 |   120.2425308 |       120.2425308 |      -61.3783417 |       -61.3296623 |       64.8984909 |        63.4776192 |         3.3694890 |          3.8756702 |
| 1062 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6662900 |    5.4459262 |    81.2101517 |        81.2101517 |      -81.2101517 |         0.0000000 |       90.3240509 |        77.4562149 |        -0.0692197 |          4.8209376 |
| 1063 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9559175 |    0.5359901 |    74.6021500 |        74.6021500 |        0.0000000 |         0.0000000 |       90.3240509 |        77.4562149 |         4.8407168 |          4.8209376 |
| 1064 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9184077 |    0.1286298 |     3.9834266 |         3.9834266 |       -0.9513882 |        -0.9630733 |        3.8195558 |         3.4846458 |         0.0100532 |          0.0096115 |
| 1065 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6075444 |    0.6120841 |     4.6511073 |         4.6511073 |       -4.6511073 |         0.0000000 |        5.4970312 |         5.6500993 |        -0.1936603 |          0.3241307 |
| 1066 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9376013 |    0.0848255 |     3.7502613 |         3.7502613 |        0.0000000 |         0.0000000 |        5.4970312 |         5.6500993 |         0.3335982 |          0.3241307 |
| 1067 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9666383 |    0.0949641 |     3.3797457 |         3.3797457 |       -0.9995837 |        -0.9984725 |        6.3509507 |         6.3510580 |         0.0785003 |          0.0804006 |
| 1068 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6215435 |    0.6914727 |     6.1288447 |         6.1288447 |       -6.1288447 |         0.0000000 |        5.9455128 |         5.8720565 |        -0.0864501 |          0.5200948 |
| 1069 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9722185 |    0.0853991 |     3.8440924 |         3.8440924 |        0.0000000 |         0.0000000 |        5.9455128 |         5.8720565 |         0.5196235 |          0.5200948 |
| 1070 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9736995 |    0.0993126 |     4.2240105 |         4.2240105 |       -0.8731638 |        -0.8353546 |        5.6917295 |         5.7139530 |         0.0260171 |          0.0258089 |
| 1071 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6183563 |    0.8959816 |     5.6197615 |         5.6197615 |       -5.6197615 |         0.0000000 |        8.3933687 |         8.3663769 |        -0.3380993 |          0.4829714 |
| 1072 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9837785 |    0.0714520 |     3.3833942 |         3.3833942 |        0.0000000 |         0.0000000 |        8.3933687 |         8.3663769 |         0.4864303 |          0.4829714 |
| 1073 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9781522 |    0.0792071 |     4.3414702 |         4.3414702 |       -0.8255141 |        -0.8292597 |        7.4406261 |         7.4071722 |         0.0266454 |          0.0264052 |
| 1074 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9121499 |    0.2304696 |     1.8766313 |         1.8766313 |       -0.5995365 |         0.2015961 |        2.6148112 |         2.6692185 |         0.7532922 |          0.9341257 |
| 1075 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3404556 |    0.4861411 |     2.7607186 |         2.7607186 |       -2.7607186 |         0.0000000 |        1.4636744 |         1.5128576 |        -0.3157518 |          0.1270484 |
| 1076 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9286332 |    0.0335821 |     1.4170965 |         1.4170965 |        0.0000000 |         0.0000000 |        1.4636744 |         1.5128576 |         0.1368071 |          0.1270484 |
| 1077 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9286761 |    0.1282339 |     4.2883515 |         4.2883515 |       -0.7265327 |        -0.6192866 |        3.8414519 |         3.8803048 |         0.0214971 |          0.0219846 |
| 1078 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.5264789 |    0.2987962 |     2.0757225 |         2.0757225 |       -1.7600044 |         0.0000000 |        1.6219656 |         1.1421516 |        -0.0158370 |          0.2069093 |
| 1079 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8736289 |    0.0699208 |     1.4602526 |         1.4602526 |        0.0000000 |         0.0000000 |        1.6219656 |         1.1421516 |         0.2130383 |          0.2069093 |
| 1080 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8720555 |    0.2374626 |     3.7753770 |         3.7753770 |       -0.9080749 |        -0.9226026 |        3.2465158 |         3.2043920 |         0.0077279 |          0.0093381 |
| 1081 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.5986738 |    0.4833945 |     2.8907304 |         2.8907304 |       -2.1231875 |         0.0000000 |        2.2711363 |         1.7735702 |        -0.0411019 |          0.3082412 |
| 1082 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8915465 |    0.1198917 |     2.1620481 |         2.1620481 |        0.0000000 |         0.0000000 |        2.2711363 |         1.7735702 |         0.3224009 |          0.3082412 |
| 1083 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8595085 |    0.2523430 |     4.2532191 |         4.2532191 |       -0.7878673 |        -0.7553185 |        3.5841453 |         3.4334927 |         0.0118955 |          0.0129496 |
| 1084 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.6731855 |    0.4592580 |     3.8558593 |         3.8558593 |       -2.6814220 |         0.0000000 |        2.9946098 |         2.9938774 |         0.1377642 |          0.4533899 |
| 1085 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8953225 |    0.1700242 |     2.7618306 |         2.7618306 |        0.0000000 |         0.0000000 |        2.9946098 |         2.9938774 |         0.4269981 |          0.4533899 |
| 1086 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7476487 |    0.3460889 |     4.7232909 |         4.7232909 |       -1.0833137 |        -1.1504105 |        4.0238962 |         3.9463553 |         0.0279144 |          0.0347925 |
| 1087 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 2])          | qint16        | 1.0000000 |  0.9841852 |    0.0635889 |     0.5129867 |         0.5129867 |       -1.5529903 |        -1.1300989 |        0.1597417 |         0.1052414 |        -0.5416470 |         -0.5145456 |
| 1088 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7922776 |    0.1627172 |     1.3925474 |         1.3925474 |       -1.3925474 |         0.0000000 |        1.4680992 |         1.2434340 |         0.1384308 |          0.2598860 |
| 1089 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9840542 |    0.0260666 |     0.4556990 |         0.4556990 |        0.0000000 |         0.0000000 |        1.4680992 |         1.2434340 |         0.2750813 |          0.2598860 |
| 1090 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9837863 |    0.0615951 |     1.0123425 |         1.0123425 |       -1.2217413 |        -1.2274448 |        2.9764147 |         2.9847472 |        -0.0000963 |         -0.0004633 |
| 1091 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.4480241 |    0.4533151 |     3.2077050 |         3.2077050 |       -3.2077050 |         0.0000000 |        1.8326147 |         1.8422877 |        -0.1643472 |          0.2130129 |
| 1092 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8922486 |    0.0627633 |     1.0101273 |         1.0101273 |        0.0000000 |         0.0000000 |        1.8326147 |         1.8422877 |         0.2262046 |          0.2130129 |
| 1093 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8993243 |    0.1762754 |     3.0241053 |         3.0241053 |       -0.9141520 |        -0.8692596 |        4.0547457 |         4.0546160 |         0.0089406 |          0.0056001 |
| 1094 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3531473 |    0.6346055 |     3.7683010 |         3.7683010 |       -3.7683010 |         0.0000000 |        2.5160465 |         1.7529428 |        -0.2453293 |          0.2607684 |
| 1095 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8005638 |    0.1059557 |     2.3519604 |         2.3519604 |        0.0000000 |         0.0000000 |        2.5160465 |         1.7529428 |         0.2833205 |          0.2607684 |
| 1096 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8266763 |    0.2441985 |     4.2326164 |         4.2326164 |       -0.9105286 |        -0.9061244 |        3.5488267 |         3.3240004 |         0.0131545 |          0.0101471 |
| 1097 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3382944 |    0.8972611 |     5.1873331 |         5.1873331 |       -5.1873331 |         0.0000000 |        2.4999609 |         2.7039852 |        -0.5182370 |          0.2374867 |
| 1098 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8783934 |    0.1174368 |     1.7469354 |         1.7469354 |        0.0000000 |         0.0000000 |        2.4999609 |         2.7039852 |         0.2615872 |          0.2374867 |
| 1099 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7739764 |    0.2653474 |     4.0999503 |         4.0999503 |       -0.9668822 |        -0.7981352 |        4.9163489 |         4.8672829 |         0.0718481 |          0.0616645 |
| 1100 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9285216 |    1.5594510 |    55.5577774 |        55.5577774 |      -47.7339172 |       -44.6797676 |       14.7491932 |        11.6762524 |        -5.0359483 |         -4.7415757 |
| 1101 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.6464213 |    3.2801535 |    30.6491966 |        30.6491966 |      -28.3978615 |         0.0000000 |       25.7832451 |        26.3660030 |        -0.2851496 |          2.5115094 |
| 1102 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9435467 |    0.4877439 |    25.9040737 |        25.9040737 |        0.0000000 |         0.0000000 |       25.7832451 |        26.3660030 |         2.5072598 |          2.5115094 |
| 1103 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8989974 |    0.1699897 |     4.2187529 |         4.2187529 |       -0.9009796 |        -0.8857724 |        3.5534286 |         3.5586908 |         0.0198846 |          0.0183247 |
| 1104 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.4770809 |    0.4375359 |     3.4000940 |         3.4000940 |       -2.6861732 |         0.0000000 |        3.1581845 |         2.5523324 |        -0.1398958 |          0.2117510 |
| 1105 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8728124 |    0.0854282 |     2.9633715 |         2.9633715 |        0.0000000 |         0.0000000 |        3.1581845 |         2.5523324 |         0.2122120 |          0.2117510 |
| 1106 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8622223 |    0.2338545 |     4.8872118 |         4.8872118 |       -0.9132700 |        -0.8793554 |        4.7235742 |         4.1948218 |         0.0365540 |          0.0340045 |
| 1107 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.5604171 |    0.7081639 |     5.1507750 |         5.1507750 |       -4.7864795 |         0.0000000 |        3.5961025 |         3.6148252 |        -0.1675116 |          0.3800448 |
| 1108 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8850489 |    0.1499331 |     3.6148252 |         3.6148252 |        0.0000000 |         0.0000000 |        3.5961025 |         3.6148252 |         0.3907193 |          0.3800448 |
| 1109 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8582733 |    0.2335953 |     5.7015772 |         5.7015772 |       -0.8898288 |        -0.8566211 |        5.2060061 |         4.9951515 |         0.0211219 |          0.0201834 |
| 1110 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.5942865 |    0.7002361 |     5.8825760 |         5.8825760 |       -4.6135840 |         0.0000000 |        4.4535503 |         4.8428516 |        -0.2081536 |          0.3408910 |
| 1111 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8992817 |    0.1413009 |     4.3922162 |         4.3922162 |        0.0000000 |         0.0000000 |        4.4535503 |         4.8428516 |         0.3507816 |          0.3408910 |
| 1112 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8647416 |    0.2171430 |     5.8000097 |         5.8000097 |       -0.7922733 |        -0.7171837 |        5.3857894 |         5.6940074 |         0.0251390 |          0.0264938 |
| 1113 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8949743 |    0.1703188 |     5.8000097 |         5.8000097 |       -1.0833137 |        -1.1504105 |        7.4406261 |         7.4071722 |         0.0320778 |          0.0318832 |
| 1114 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 256, 512])        | qint16        | 0.0001526 |  0.4880881 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
| 1115 | head.layers.28.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.8795890 |    0.2758416 |     5.8000097 |     12670.1212526 |       -5.0196371 |        -3.9441435 |        7.4406261 |         7.4071722 |         0.0147791 |          0.0154526 |
| 1116 | head.layers.28.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 512])        | qint16        | 0.0001786 |  0.9560770 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
| 1117 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8796095 |    0.2758416 |     5.8000097 |     12670.1212526 |       -5.0196371 |        -3.9441435 |        7.4406261 |         7.4071722 |         0.0147791 |          0.0154526 |
| 1118 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.9560735 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
| 1119 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.4880882 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
| 1120 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8796095 |    0.2758416 |     5.8000097 |     12670.1212526 |       -5.0196371 |        -3.9441435 |        7.4406261 |         7.4071722 |         0.0147791 |          0.0154526 |
| 1121 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.9560735 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
| 1122 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.4880882 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
| 1123 | head.layers.28.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9354614 |    0.6892232 |     8.7405682 |         8.7405682 |      -11.8903866 |       -11.5306520 |       11.9750786 |        12.6898470 |         0.0439699 |          0.0499871 |
| 1124 | head.layers.28.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.9590322 |    0.1633165 |    11.5773058 |        11.5773058 |      -11.8387251 |       -11.2035933 |        8.7867546 |         8.4608831 |        -0.1423275 |         -0.1454205 |
| 1125 | head.layers.28.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.5333869 |    0.0201712 |     1.1934315 |         1.1934315 |       -1.3774500 |        -0.9835217 |        1.2707971 |         0.7805259 |         0.0010142 |          0.0001221 |
| 1126 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.9354614 |    0.6892232 |     8.7405682 |         8.7405682 |      -11.8903866 |       -11.5306520 |       11.9750786 |        12.6898470 |         0.0439699 |          0.0499871 |
| 1127 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.9354762 |    0.6892232 |     8.7405682 |         8.7405682 |      -11.8903866 |       -11.5306520 |       11.9750786 |        12.6898470 |         0.0439699 |          0.0499871 |
| 1128 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1129 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1130 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 208, 64])        | qint8         | 1.0000000 |  0.0302142 |    1.5705390 |    11.3180323 |        11.3180323 |       -1.3774500 |       -11.2035933 |        1.2707971 |         8.4608831 |         0.0010142 |         -0.1454205 |
| 1131 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 256, 64])        | qint8         | 1.0000000 |  0.0302157 |    1.5705390 |    11.3180323 |        11.3180323 |       -1.3774500 |       -11.2035933 |        1.2707971 |         8.4608831 |         0.0010142 |         -0.1454205 |
| 1132 | head.layers.28.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.9354762 |    0.0861529 |     1.0925710 |         8.7405682 |       -1.4862983 |        -1.4413315 |        1.4968848 |         1.5862309 |         0.0054962 |          0.0062484 |
| 1133 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1134 | head.layers.28.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.9297415 |    4.8775272 |   147.1398621 |       147.1398621 |     -119.4366989 |      -122.6513672 |       89.6204147 |        65.8594894 |         4.1627860 |          4.3950448 |
| 1135 | head.layers.28.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.2429842 |    0.0007142 |     0.9996335 |         0.9996335 |        0.0000000 |         0.0000000 |        0.9998909 |         0.0312500 |         0.0039062 |          0.0034997 |
| 1136 | head.layers.28.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.2429842 |    0.0007142 |     0.9996335 |         0.9996335 |        0.0000000 |         0.0000000 |        0.9998909 |         0.0312500 |         0.0039062 |          0.0034997 |
| 1137 | head.layers.28.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.4493650 |    0.0180180 |     1.1425502 |         1.1425502 |       -1.2534136 |        -0.7969149 |        1.1593416 |         0.4609065 |         0.0009520 |         -0.0002085 |
| 1138 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1139 | head.layers.28.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1140 | head.layers.28.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.7302136 |    0.0281599 |     1.2201226 |         1.2201226 |       -1.3280975 |        -0.6721978 |        1.3295521 |         0.7887988 |         0.0092964 |          0.0092032 |
| 1141 | head.layers.28.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 256])     | qint8         | 1.0000000 |  0.2429842 |    0.0007142 |     0.9996335 |         0.9996335 |        0.0000000 |         0.0000000 |        0.9998909 |         0.0312500 |         0.0039062 |          0.0034997 |
| 1142 | head.layers.28.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4215011 |    0.0006107 |     0.5894068 |         0.5894068 |        0.0000000 |         0.0000000 |        0.5962552 |         0.0181298 |         0.0039062 |          0.0034997 |
| 1143 | head.layers.28.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1144 | head.layers.28.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7302375 |    0.0281599 |     1.2201226 |         1.2201226 |       -1.3280975 |        -0.6721978 |        1.3295521 |         0.7887988 |         0.0092964 |          0.0092032 |
| 1145 | head.layers.28.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.8749071 |    0.2893544 |     5.7936087 |         5.7936087 |       -5.0219288 |        -3.9729514 |        7.6950755 |         7.5337286 |         0.0240755 |          0.0246558 |
| 1146 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.8593548 |    0.3514434 |     5.4361796 |      3562.6003265 |      -12.7860050 |        -9.5363960 |       12.2122726 |         8.9881916 |         0.0427855 |          0.0346722 |
| 1147 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 512])        | qint16        | 0.0001526 |  0.7701039 |    0.1130409 |     2.1222920 |     13908.4408990 |       -4.3227072 |        -3.8602495 |        3.5512011 |         3.3961532 |        -0.0019531 |         -0.0021315 |
| 1148 | head.layers.29.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.8775311 |    0.2608811 |     5.8000097 |     12670.1212526 |      -12.7860050 |        -9.5363960 |       12.2122726 |         8.9881916 |         0.0374316 |          0.0332777 |
| 1149 | head.layers.29.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0002277 |  0.8775311 |    0.2608811 |     5.8000097 |     25467.1566662 |      -12.7860050 |        -9.5363960 |       12.2122726 |         8.9881916 |         0.0374316 |          0.0332777 |
| 1150 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8776140 |    0.2608811 |     5.8000097 |     12670.1212526 |      -12.7860050 |        -9.5363960 |       12.2122726 |         8.9881916 |         0.0374316 |          0.0332777 |
| 1151 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0002277 |  0.8776140 |    0.2608811 |     5.8000097 |     25467.1566662 |      -12.7860050 |        -9.5363960 |       12.2122726 |         8.9881916 |         0.0374316 |          0.0332777 |
| 1152 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.7701675 |    0.1130409 |     2.1222920 |     13908.4408990 |       -4.3227072 |        -3.8602495 |        3.5512011 |         3.3961532 |        -0.0019531 |         -0.0021315 |
| 1153 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8776140 |    0.2608811 |     5.8000097 |     12670.1212526 |      -12.7860050 |        -9.5363960 |       12.2122726 |         8.9881916 |         0.0374316 |          0.0332777 |
| 1154 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0002277 |  0.8776140 |    0.2608811 |     5.8000097 |     25467.1566662 |      -12.7860050 |        -9.5363960 |       12.2122726 |         8.9881916 |         0.0374316 |          0.0332777 |
| 1155 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.7701675 |    0.1130409 |     2.1222920 |     13908.4408990 |       -4.3227072 |        -3.8602495 |        3.5512011 |         3.3961532 |        -0.0019531 |         -0.0021315 |
| 1156 | head.layers.29.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9154109 |    0.4875895 |     6.6250749 |         6.6250749 |       -7.7293367 |        -7.2219777 |        8.6010065 |         8.1535721 |         0.0315424 |          0.0242651 |
| 1157 | head.layers.29.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9451057 |    0.5022659 |     7.8168211 |         7.8168211 |      -14.7972946 |       -14.0133867 |       13.6866961 |        13.9686146 |         0.0261277 |          0.0368283 |
| 1158 | head.layers.29.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.7639374 |    0.1395872 |     2.9076672 |         2.9076672 |       -2.7349772 |        -2.2079439 |        3.3299572 |         2.4865084 |         0.0009756 |         -0.0002581 |
| 1159 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.9154109 |    0.4875895 |     6.6250749 |         6.6250749 |       -7.7293367 |        -7.2219777 |        8.6010065 |         8.1535721 |         0.0315424 |          0.0242651 |
| 1160 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.9154490 |    0.4875895 |     6.6250749 |         6.6250749 |       -7.7293367 |        -7.2219777 |        8.6010065 |         8.1535721 |         0.0315424 |          0.0242651 |
| 1161 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.3455563 |    1.8201165 |    12.1359444 |        12.1359444 |      -14.7972946 |        -7.2219777 |       13.6866961 |         8.1535721 |         0.0261277 |          0.0242651 |
| 1162 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.3455682 |    1.8201165 |    12.1359444 |        12.1359444 |      -14.7972946 |        -7.2219777 |       13.6866961 |         8.1535721 |         0.0261277 |          0.0242651 |
| 1163 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0006282 |    1.6376082 |    14.1499529 |        14.1499529 |       -2.7349772 |       -14.0133867 |        3.3299572 |        13.9686146 |         0.0009756 |          0.0368283 |
| 1164 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0006282 |    1.6376082 |    14.1499529 |        14.1499529 |       -2.7349772 |       -14.0133867 |        3.3299572 |        13.9686146 |         0.0009756 |          0.0368283 |
| 1165 | head.layers.29.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.9154490 |    0.0609487 |     0.8281344 |         6.6250749 |       -0.9661671 |        -0.9027472 |        1.0751258 |         1.0191965 |         0.0039428 |          0.0030331 |
| 1166 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1167 | head.layers.29.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.9635173 |    2.5571971 |    51.0177422 |        51.0177422 |      -69.0314484 |       -66.5390472 |       94.2592926 |        81.9041290 |         0.6253366 |          0.5512771 |
| 1168 | head.layers.29.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.3844037 |    0.0016810 |     0.9969095 |         0.9969095 |        0.0000000 |         0.0000000 |        0.9987237 |         0.0312500 |         0.0019531 |          0.0011353 |
| 1169 | head.layers.29.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.3844037 |    0.0016810 |     0.9969095 |         0.9969095 |        0.0000000 |         0.0000000 |        0.9987237 |         0.0312500 |         0.0019531 |          0.0011353 |
| 1170 | head.layers.29.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.5701573 |    0.1420102 |     1.6578809 |         1.6578809 |       -1.8340108 |        -1.6519204 |        1.7175989 |         1.4503721 |         0.0029553 |         -0.0001422 |
| 1171 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1172 | head.layers.29.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1173 | head.layers.29.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.6126519 |    0.2497943 |     1.7371175 |         1.7371175 |       -1.9602994 |        -1.6853302 |        2.2296731 |         1.7646734 |        -0.0062624 |         -0.0034911 |
| 1174 | head.layers.29.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 512])     | qint8         | 1.0000000 |  0.3844037 |    0.0016810 |     0.9969095 |         0.9969095 |        0.0000000 |         0.0000000 |        0.9987237 |         0.0312500 |         0.0019531 |          0.0011353 |
| 1175 | head.layers.29.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.5102069 |    0.0014290 |     0.3950592 |         0.3950592 |        0.0000000 |         0.0000001 |        0.4033993 |         0.0175031 |         0.0019531 |          0.0011353 |
| 1176 | head.layers.29.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1177 | head.layers.29.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.6125653 |    0.2497943 |     1.7371175 |         1.7371175 |       -1.9602994 |        -1.6853302 |        2.2296731 |         1.7646734 |        -0.0062624 |         -0.0034911 |
| 1178 | head.layers.29.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.8455381 |    0.3964874 |     5.9536209 |         5.9536209 |      -11.7107601 |        -8.9126215 |       12.1235027 |         8.9401417 |         0.0311692 |          0.0297866 |
| 1179 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.9248614 |    0.5693119 |    12.2581167 |      8033.3567957 |      -29.6981831 |       -24.2751789 |       23.4228687 |        21.4588795 |        -0.0291742 |         -0.0285584 |
| 1180 | head.layers.30                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9246183 |    0.2438658 |     4.5258808 |         4.5258808 |       -6.8041563 |        -6.6641359 |        6.0889411 |         6.1395116 |         0.0020975 |          0.0021090 |
| 1181 | head.layers.31.kps_generator.offset            | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 24])         | qint16        | 1.0000000 |  0.9494529 |    0.2679785 |     7.0239868 |         7.0239868 |       -6.1963248 |        -6.9800181 |        4.7640347 |         4.7019553 |        -0.5936720 |         -0.6612890 |
| 1182 | head.layers.31.kps_generator                   | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.9494529 |    0.2679785 |     7.0239868 |         7.0239868 |       -6.1963248 |        -6.9800181 |        4.7640347 |         4.7019553 |        -0.5936720 |         -0.6612890 |
| 1183 | head.layers.31.kps_generator                   | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 3])       | qint16        | 1.0000000 |  0.9456981 |    1.6420337 |   120.2425308 |       120.2425308 |      -61.3783417 |       -61.3296623 |       64.8984909 |        63.4776192 |         3.3694890 |          3.8756702 |
| 1184 | head.layers.31.kps_generator.keypoints_add     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.9439137 |    1.7079200 |   126.7845764 |       126.7845764 |      -64.9129333 |       -66.2718811 |       66.4953232 |        66.2078934 |         2.7758167 |          3.2143812 |
| 1185 | head.layers.31.weight_add                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9045485 |    0.3392259 |     6.7772555 |         6.7772555 |       -7.2321424 |        -7.1484184 |        7.5422053 |         7.5804467 |         0.0341753 |          0.0339922 |
| 1186 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 3, 4])         | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3522723 |          0.3522723 |
| 1187 | head.layers.31                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 6, 12])           | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3522723 |          0.3522723 |
| 1188 | head.layers.31.camera_encoder.0                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.6448321 |    0.4029789 |     6.3953948 |         6.3953948 |       -6.3953948 |         0.0000000 |        6.0321255 |         6.0321255 |        -0.1030675 |          0.2999114 |
| 1189 | head.layers.31.camera_encoder.1                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.9999994 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0000000 |         0.0000000 |        6.0321255 |         6.0321255 |         0.2999114 |          0.2999114 |
| 1190 | head.layers.31.camera_encoder.2                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.9999993 |    0.0000000 |     0.0000005 |         0.0000005 |       -0.8076455 |        -0.8076455 |        4.9130092 |         4.9130092 |         0.0247915 |          0.0247915 |
| 1191 | head.layers.31.camera_encoder.3                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.7944613 |    1.3779744 |    19.5911980 |        19.5911980 |      -19.5911980 |         0.0000000 |       23.3326073 |        23.3326073 |        -0.4829063 |          0.8950680 |
| 1192 | head.layers.31.camera_encoder.4                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000038 |         0.0000038 |        0.0000000 |         0.0000000 |       23.3326073 |        23.3326073 |         0.8950680 |          0.8950680 |
| 1193 | head.layers.31.camera_encoder.5                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000004 |    0.0000000 |     0.0000010 |         0.0000010 |       -1.1679387 |        -1.1679387 |        7.1566195 |         7.1566195 |         0.0284684 |          0.0284684 |
| 1194 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 256])     | qint8         | 1.0000000 |  0.9045485 |    0.3392259 |     6.7772555 |         6.7772555 |       -7.2321424 |        -7.1484184 |        7.5422053 |         7.5804467 |         0.0341753 |          0.0339922 |
| 1195 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 6, 256])       | qint8         | 1.0000000 |  1.0000004 |    0.0000000 |     0.0000010 |         0.0000010 |       -1.1679387 |        -1.1679387 |        7.1566195 |         7.1566195 |         0.0284684 |          0.0284684 |
| 1196 | head.layers.31.cam_add                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 6, 256])     | qint8         | 1.0000000 |  0.9183319 |    0.3392259 |     6.7772560 |         6.7772560 |       -6.5941091 |        -6.1276388 |        9.9929256 |        10.0614605 |         0.0626437 |          0.0624606 |
| 1197 | head.layers.31.weights_fc                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 6, 64])      | qint8         | 1.0000000 |  0.9720936 |    0.3416498 |     4.9318056 |         4.9318056 |      -11.6583138 |       -11.7363596 |       12.3555145 |        12.3802261 |         0.0358706 |          0.0217842 |
| 1198 | head.layers.31                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.9720936 |    0.3416498 |     4.9318056 |         4.9318056 |      -11.6583138 |       -11.7363596 |       12.3555145 |        12.3802261 |         0.0358706 |          0.0217842 |
| 1199 | head.layers.31.weight_softmax                  | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.8875404 |    0.0182506 |     0.8634247 |         0.8634247 |        0.0000000 |         0.0000000 |        0.8946747 |         0.0312500 |         0.0208333 |          0.0026332 |
| 1200 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  0.9358483 |    3.2360468 |   126.7845764 |       126.7845764 |      -64.9129333 |       -66.2718811 |       66.4953232 |        66.2078934 |         5.0302644 |          6.1425576 |
| 1201 | head.layers.31                                 | torch.ones_like                                                               | torch.ones_like                                                         | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
| 1202 | head.layers.31.point_quant_stub                | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
| 1203 | head.layers.31.point_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 8, 4])       | qint16        | 1.0000000 |  0.9439144 |    1.2809399 |   126.7845764 |       126.7845764 |      -64.9129333 |       -66.2718811 |       66.4953232 |        66.2078934 |         2.3318627 |          2.6607862 |
| 1204 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 1, 1, 4, 4])   | qint16        | 1.0000000 |  0.9999996 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
| 1205 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 512, 8, 1, 4]) | qint16        | 1.0000000 |  0.9439144 |    1.2809399 |   126.7845764 |       126.7845764 |      -64.9129333 |       -66.2718811 |       66.4953232 |        66.2078934 |         2.3318627 |          2.6607862 |
| 1206 | head.layers.31.point_matmul                    | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 4, 4]) | qint16        | 1.0000000 |  0.9649912 |    0.5701727 |   235.8503571 |       235.8503571 |     -174.1302643 |      -173.2463684 |      581.1029053 |       556.2365723 |         1.1349167 |          1.1445824 |
| 1207 | head.layers.31.point_sum                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 6, 512, 8, 4])    | qint16        | 0.0244144 |  0.9657597 |    1.8814048 |   251.3648071 |     10295.7453994 |     -176.3751984 |      -175.5257263 |      628.9711914 |       625.2307129 |         4.5396676 |          4.5783291 |
| 1208 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.9381520 |    2.2922139 |   126.5895767 |      5185.0299452 |      -71.0911560 |       -71.5697021 |       77.2982941 |        75.3324661 |         1.2446004 |          1.2555205 |
| 1209 | head.layers.31                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.9601960 |    1.2405125 |    75.8337173 |      3106.1016671 |        0.0100000 |         0.0100000 |       77.2982941 |        75.3324661 |        11.5794268 |         11.6324940 |
| 1210 | head.layers.31.reciprocal_op                   | horizon_plugin_pytorch.nn.reciprocal.Reciprocal                               | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.9533321 |   40.7677460 |    99.9832764 |     32762.0200912 |        0.0129369 |         0.0132745 |      100.0000000 |         1.2799804 |        41.3908577 |          0.6410713 |
| 1211 | head.layers.31                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.9533321 |   40.7677460 |    99.9832764 |     32762.0200912 |        0.0129369 |         0.0132745 |      100.0000000 |         1.2799804 |        41.3908577 |          0.6410713 |
| 1212 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0244144 |  0.9707868 |    2.6167023 |   251.3648071 |     10295.7453994 |     -176.3751984 |      -175.5257263 |      628.9711914 |       625.2307129 |         7.9570336 |          8.0288982 |
| 1213 | head.layers.31.point_mul                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.9008031 | 1488.7185059 | 62244.2187500 | 203958743.8365504 |   -17637.5195312 |      -224.6694946 |    62897.1171875 |       800.2830811 |      1075.4031982 |         13.3636475 |
| 1214 | head.layers.31                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.5810773 |  174.4606018 |   602.1967773 |   1973248.2906206 |     -500.0000000 |      -224.6694946 |      500.0000000 |       500.0000000 |       106.0179062 |         13.0019083 |
| 1215 | head.layers.31                                 | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([156, 512, 8, 2])      | qint16        | 0.0003052 |  0.5810773 |  174.4606018 |   602.1967773 |   1973248.2906206 |     -500.0000000 |      -224.6694946 |      500.0000000 |       500.0000000 |       106.0179062 |         13.0019083 |
| 1216 | head.layers.31                                 | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer            | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer      | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.7161584 |    0.4766045 |    34.4446487 |        34.4446487 |      -59.7485809 |       -57.2925186 |       50.8197594 |        53.4819145 |         0.0126257 |          0.0260797 |
| 1217 | head.layers.31.feat_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.7161584 |    0.4766045 |    34.4446487 |        34.4446487 |      -59.7485809 |       -57.2925186 |       50.8197594 |        53.4819145 |         0.0126257 |          0.0260797 |
| 1218 | head.layers.31                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 256, 512, 8])  | qint8         | 1.0000000 |  0.7161584 |    0.4766045 |    34.4446487 |        34.4446487 |      -59.7485809 |       -57.2925186 |       50.8197594 |        53.4819145 |         0.0126257 |          0.0260797 |
| 1219 | head.layers.31                                 | torch.Tensor.permute                                                          | torch.Tensor.permute                                                    | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.7161025 |    0.4766045 |    34.4446487 |        34.4446487 |      -59.7485809 |       -57.2925186 |       50.8197594 |        53.4819145 |         0.0126257 |          0.0260797 |
| 1220 | head.layers.31                                 | torch.Tensor.contiguous                                                       | torch.Tensor.contiguous                                                 | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.7161025 |    0.4766045 |    34.4446487 |        34.4446487 |      -59.7485809 |       -57.2925186 |       50.8197594 |        53.4819145 |         0.0126257 |          0.0260797 |
| 1221 | head.layers.31                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.7161025 |    0.4766045 |    34.4446487 |        34.4446487 |      -59.7485809 |       -57.2925186 |       50.8197594 |        53.4819145 |         0.0126257 |          0.0260797 |
| 1222 | head.layers.31                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 48, 8, 1])   | qint8         | 1.0000000 |  0.8875404 |    0.0182506 |     0.8634247 |         0.8634247 |        0.0000000 |         0.0000000 |        0.8946747 |         0.0312500 |         0.0208333 |          0.0026332 |
| 1223 | head.layers.31                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.7161025 |    0.4766045 |    34.4446487 |        34.4446487 |      -59.7485809 |       -57.2925186 |       50.8197594 |        53.4819145 |         0.0126257 |          0.0260797 |
| 1224 | head.layers.31.feat_mul                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.4529083 |    0.0082020 |     9.1913042 |         9.1913042 |       -9.4706335 |        -0.5488678 |        8.5958147 |         0.5121123 |        -0.0000901 |          0.0000209 |
| 1225 | head.layers.31                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.4529083 |    0.0082020 |     9.1913042 |         9.1913042 |       -9.4706335 |        -0.5488678 |        8.5958147 |         0.5121123 |        -0.0000901 |          0.0000209 |
| 1226 | head.layers.31.feat_sum                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6059691 |    0.2956160 |     9.8958626 |         9.8958626 |      -10.2529316 |        -2.0867383 |       10.0522709 |         2.0189030 |        -0.0043251 |          0.0010028 |
| 1227 | head.layers.31.output_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6108598 |    0.4301839 |    10.2398720 |        10.2398720 |       -9.4126825 |        -1.2229491 |       10.7986374 |         1.2143043 |         0.0070755 |         -0.0020512 |
| 1228 | head.layers.31.proj_drop                       | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6108598 |    0.4301839 |    10.2398720 |        10.2398720 |       -9.4126825 |        -1.2229491 |       10.7986374 |         1.2143043 |         0.0070755 |         -0.0020512 |
| 1229 | head.layers.31.residual_op                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7474298 |    0.3370249 |    10.2398720 |        10.2398720 |       -9.4126825 |        -6.6641359 |       10.7986374 |         6.1395116 |         0.0045865 |          0.0000289 |
| 1230 | head.layers.32.pre_norm                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7943658 |    0.3834853 |     5.8903646 |         5.8903646 |       -7.5560036 |        -7.8976064 |        7.2351823 |         7.2770000 |        -0.0030164 |         -0.0040063 |
| 1231 | head.layers.32.layers.0.0                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.1860921 |    2.3606265 |    12.6427155 |        12.6427155 |      -12.6427155 |         0.0000000 |       12.9579401 |        11.9910946 |        -1.6987690 |          0.2284944 |
| 1232 | head.layers.32.layers.0.2                      | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.5767105 |    0.3181150 |    11.6938524 |        11.6938524 |        0.0000000 |         0.0000000 |       12.9579401 |        11.9910946 |         0.3437426 |          0.2284944 |
| 1233 | head.layers.32.layers.1                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6298847 |    3.0208755 |    34.4393311 |        34.4393311 |      -41.4778481 |       -54.7931252 |       39.3339081 |        39.9028778 |         0.0380710 |          0.0470976 |
| 1234 | head.layers.32.layers.2                        | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6298847 |    3.0208755 |    34.4393311 |        34.4393311 |      -41.4778481 |       -54.7931252 |       39.3339081 |        39.9028778 |         0.0380710 |          0.0470976 |
| 1235 | head.layers.32.identity_fc                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1236 | head.layers.32.short_add                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.linear.LinearAdd                          | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6746240 |    3.7201555 |    37.8367081 |        37.8367081 |      -44.1331406 |       -61.5537987 |       38.4727020 |        43.4626465 |         0.0332757 |          0.0194917 |
| 1237 | head.layers.33                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8360167 |    0.4276628 |     4.1718831 |         4.1718831 |       -4.4647574 |        -4.0695052 |        3.9052179 |         3.5719321 |        -0.0033510 |         -0.0028656 |
| 1238 | head.layers.34.add1                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8588096 |    0.5036563 |     5.3454576 |         5.3454576 |       -4.1913166 |        -3.7562432 |        8.5232067 |         8.4225254 |         0.0287268 |          0.0290176 |
| 1239 | head.layers.34.layers.0                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.3777705 |    1.1485881 |     8.8063450 |         8.8063450 |       -8.8063450 |         0.0000000 |        7.6191745 |         6.9605794 |        -0.4384140 |          0.3895442 |
| 1240 | head.layers.34.layers.1                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7678322 |    0.2899508 |     5.0490122 |         5.0490122 |        0.0000000 |         0.0000000 |        7.6191745 |         6.9605794 |         0.4202233 |          0.3895442 |
| 1241 | head.layers.34.layers.2                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.2606307 |    0.9486557 |     8.3206234 |         8.3206234 |       -8.3206234 |         0.0000000 |        5.6672363 |         4.7635627 |        -0.5168596 |          0.2181126 |
| 1242 | head.layers.34.layers.3                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6764575 |    0.1949168 |     5.5700107 |         5.5700107 |        0.0000000 |         0.0000000 |        5.6672363 |         4.7635627 |         0.2368794 |          0.2181126 |
| 1243 | head.layers.34.layers.4                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6369621 |    0.4262832 |     7.4109907 |         7.4109907 |       -0.6921440 |        -0.6719872 |        7.1927032 |         8.4970274 |         0.0367524 |          0.0368642 |
| 1244 | head.layers.34.layers.5                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.2432882 |    1.2215235 |     8.3003111 |         8.3003111 |       -5.9448891 |         0.0000000 |        7.9089246 |         5.7951784 |        -0.5687200 |          0.3189561 |
| 1245 | head.layers.34.layers.6                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6166263 |    0.3209374 |     6.4589634 |         6.4589634 |        0.0000000 |         0.0000000 |        7.9089246 |         5.7951784 |         0.3318662 |          0.3189561 |
| 1246 | head.layers.34.layers.7                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.3331786 |    0.9957645 |     9.3256531 |         9.3256531 |       -4.0794268 |         0.0000000 |       14.0659227 |        13.9588442 |        -0.5943154 |          0.2031097 |
| 1247 | head.layers.34.layers.8                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7340397 |    0.1913822 |     9.3256531 |         9.3256531 |        0.0000000 |         0.0000000 |       14.0659227 |        13.9588442 |         0.2100669 |          0.2031097 |
| 1248 | head.layers.34.layers.9                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7054970 |    0.3431077 |    12.5440607 |        12.5440607 |       -0.7410388 |        -0.7414351 |       12.6161442 |        12.5043154 |         0.0229114 |          0.0222692 |
| 1249 | head.layers.34.layers.10                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.4153166 |    1.0624678 |     8.1852465 |         8.1852465 |       -6.3212309 |        -5.8380547 |        6.6756659 |         6.8236871 |         0.1895968 |          0.1792932 |
| 1250 | head.layers.34.layers.11.scale_quant_stub      | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([11])                  | qint16        | 1.0000000 |  1.0000000 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0028612 |         0.0028612 |        0.8379965 |         0.8379965 |         0.1018584 |          0.1018584 |
| 1251 | head.layers.34.layers.11.mul                   | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.6581582 |    0.0346849 |     1.2050216 |         1.2050216 |       -1.3175868 |        -0.8853523 |        0.7533941 |         0.6296943 |         0.0017164 |         -0.0011229 |
| 1252 | head.layers.34.add2                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.9411867 |    0.9500634 |   120.3307877 |       120.3307877 |      -61.4154358 |       -61.3173294 |       65.0450211 |        63.5444984 |        -0.3458104 |         -0.0760710 |
| 1253 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.9411867 |    0.9500634 |   120.3307877 |                   |      -61.4154358 |       -61.3173294 |       65.0450211 |        63.5444984 |        -0.3458104 |         -0.0760710 |
| 1254 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9456390 |    1.6525263 |   120.3307877 |       120.3307877 |      -61.4154358 |       -61.3173294 |       65.0450211 |        63.5444984 |         3.3838532 |          3.8786988 |
| 1255 | head.anchor_encoder.pos_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6661137 |    5.4503350 |    81.1876984 |        81.1876984 |      -81.1876984 |         0.0000000 |       90.2443085 |        77.4092941 |        -0.0714875 |          4.8209300 |
| 1256 | head.anchor_encoder.pos_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9558508 |    0.5387409 |    74.5990677 |        74.5990677 |        0.0000000 |         0.0000000 |       90.2443085 |        77.4092941 |         4.8401065 |          4.8209300 |
| 1257 | head.anchor_encoder.pos_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9210201 |    0.1270192 |     3.9856343 |         3.9856343 |       -0.9515033 |        -0.9630887 |        3.8073273 |         3.4856253 |         0.0100821 |          0.0096918 |
| 1258 | head.anchor_encoder.pos_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6087747 |    0.6125072 |     4.6518054 |         4.6518054 |       -4.6518054 |         0.0000000 |        5.4980774 |         5.6508842 |        -0.1936710 |          0.3262698 |
| 1259 | head.anchor_encoder.pos_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9392053 |    0.0841895 |     3.7528269 |         3.7528269 |        0.0000000 |         0.0000000 |        5.4980774 |         5.6508842 |         0.3346467 |          0.3262698 |
| 1260 | head.anchor_encoder.pos_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9673597 |    0.0943672 |     3.3842466 |         3.3842466 |       -0.9995878 |        -0.9984738 |        6.3511982 |         6.3510261 |         0.0785111 |          0.0803603 |
| 1261 | head.anchor_encoder.pos_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6221521 |    0.6907353 |     6.1290221 |         6.1290221 |       -6.1290221 |         0.0000000 |        5.9456940 |         5.8726363 |        -0.0853515 |          0.5192075 |
| 1262 | head.anchor_encoder.pos_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9724049 |    0.0859654 |     3.8484268 |         3.8484268 |        0.0000000 |         0.0000000 |        5.9456940 |         5.8726363 |         0.5194185 |          0.5192075 |
| 1263 | head.anchor_encoder.pos_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9733542 |    0.1009139 |     4.2296557 |         4.2296557 |       -0.8731578 |        -0.8363924 |        5.6994514 |         5.7135196 |         0.0259932 |          0.0257641 |
| 1264 | head.anchor_encoder.pos_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.6188845 |    0.8955359 |     5.6300545 |         5.6300545 |       -5.6300545 |         0.0000000 |        8.3998594 |         8.3653193 |        -0.3353316 |          0.4817244 |
| 1265 | head.anchor_encoder.pos_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9831192 |    0.0737699 |     3.3553703 |         3.3553703 |        0.0000000 |         0.0000000 |        8.3998594 |         8.3653193 |         0.4864346 |          0.4817244 |
| 1266 | head.anchor_encoder.pos_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 128])        | qint8         | 1.0000000 |  0.9769847 |    0.0818458 |     4.2986636 |         4.2986636 |       -0.8255135 |        -0.8293330 |        7.4431357 |         7.4120479 |         0.0267675 |          0.0265781 |
| 1267 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9040953 |    0.2395686 |     1.8455094 |         1.8455094 |       -0.5729252 |         0.2937635 |        2.6009233 |         2.6599708 |         0.7520581 |          0.9386057 |
| 1268 | head.anchor_encoder.size_fc.0                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3357438 |    0.4850038 |     2.7640905 |         2.7640905 |       -2.7640905 |         0.0000000 |        1.4709975 |         1.5056030 |        -0.3149296 |          0.1251416 |
| 1269 | head.anchor_encoder.size_fc.1                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9207641 |    0.0346366 |     1.4067181 |         1.4067181 |        0.0000000 |         0.0000000 |        1.4709975 |         1.5056030 |         0.1354377 |          0.1251416 |
| 1270 | head.anchor_encoder.size_fc.2                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9196578 |    0.1320714 |     4.2910509 |         4.2910509 |       -0.7263469 |        -0.6176413 |        3.8696783 |         3.9134855 |         0.0214128 |          0.0219545 |
| 1271 | head.anchor_encoder.size_fc.3                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.5173225 |    0.2995791 |     2.0831931 |         2.0831931 |       -1.7738487 |         0.0000000 |        1.6336277 |         1.1397977 |        -0.0154793 |          0.2047651 |
| 1272 | head.anchor_encoder.size_fc.4                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8625417 |    0.0705387 |     1.4603913 |         1.4603913 |        0.0000000 |         0.0000000 |        1.6336277 |         1.1397977 |         0.2135611 |          0.2047651 |
| 1273 | head.anchor_encoder.size_fc.5                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8604508 |    0.2360840 |     3.8834596 |         3.8834596 |       -0.9037968 |        -0.9209311 |        3.2461894 |         3.2248540 |         0.0078092 |          0.0094099 |
| 1274 | head.anchor_encoder.size_fc.6                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.5872797 |    0.4877573 |     2.9081054 |         2.9081054 |       -2.1175549 |         0.0000000 |        2.2769725 |         1.7680835 |        -0.0432350 |          0.3100460 |
| 1275 | head.anchor_encoder.size_fc.7                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8796253 |    0.1225712 |     2.1715703 |         2.1715703 |        0.0000000 |         0.0000000 |        2.2769725 |         1.7680835 |         0.3219511 |          0.3100460 |
| 1276 | head.anchor_encoder.size_fc.8                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8442664 |    0.2552055 |     4.2576919 |         4.2576919 |       -0.7862651 |        -0.7537680 |        3.5824080 |         3.4172575 |         0.0115312 |          0.0123424 |
| 1277 | head.anchor_encoder.size_fc.9                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.6583465 |    0.4695072 |     3.7878754 |         3.7878754 |       -2.6061873 |         0.0000000 |        2.9555063 |         2.9653122 |         0.1401060 |          0.4599210 |
| 1278 | head.anchor_encoder.size_fc.10                 | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8806243 |    0.1794328 |     2.7863047 |         2.7863047 |        0.0000000 |         0.0000000 |        2.9555063 |         2.9653122 |         0.4301805 |          0.4599210 |
| 1279 | head.anchor_encoder.size_fc.11                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7272527 |    0.3551647 |     4.8153486 |         4.8153486 |       -1.0995392 |        -1.1195819 |        4.0583930 |         3.9060359 |         0.0265384 |          0.0322649 |
| 1280 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 2])          | qint16        | 1.0000000 |  0.9855565 |    0.0585194 |     0.4920501 |         0.4920501 |       -1.5263916 |        -1.1040827 |        0.1446715 |         0.0950820 |        -0.5391757 |         -0.5137799 |
| 1281 | head.anchor_encoder.yaw_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7932105 |    0.1601305 |     1.3800282 |         1.3800282 |       -1.3800282 |         0.0000000 |        1.4532351 |         1.2141361 |         0.1380680 |          0.2597174 |
| 1282 | head.anchor_encoder.yaw_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9854839 |    0.0239204 |     0.4362160 |         0.4362160 |        0.0000000 |         0.0000000 |        1.4532351 |         1.2141361 |         0.2742781 |          0.2597174 |
| 1283 | head.anchor_encoder.yaw_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9851080 |    0.0557147 |     1.0061992 |         1.0061992 |       -1.2215934 |        -1.2326161 |        2.9579587 |         2.9735940 |        -0.0000934 |         -0.0004657 |
| 1284 | head.anchor_encoder.yaw_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.4483400 |    0.4469489 |     3.1797924 |         3.1797924 |       -3.1797924 |         0.0000000 |        1.8205017 |         1.8155472 |        -0.1639526 |          0.2122723 |
| 1285 | head.anchor_encoder.yaw_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8972189 |    0.0576497 |     0.9939376 |         0.9939376 |        0.0000000 |         0.0000000 |        1.8205017 |         1.8155472 |         0.2253465 |          0.2122723 |
| 1286 | head.anchor_encoder.yaw_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.9024049 |    0.1609419 |     2.9286613 |         2.9286613 |       -0.9034926 |        -0.8409972 |        4.0241303 |         4.0268130 |         0.0091453 |          0.0053506 |
| 1287 | head.anchor_encoder.yaw_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3508224 |    0.6243656 |     3.7609212 |         3.7609212 |       -3.7609212 |         0.0000000 |        2.5074804 |         1.6283798 |        -0.2490562 |          0.2600846 |
| 1288 | head.anchor_encoder.yaw_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8036476 |    0.0953480 |     2.3794396 |         2.3794396 |        0.0000000 |         0.0000000 |        2.5074804 |         1.6283798 |         0.2799616 |          0.2600846 |
| 1289 | head.anchor_encoder.yaw_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8352005 |    0.2125576 |     4.1412377 |         4.1412377 |       -0.9106075 |        -0.9060069 |        3.5720036 |         3.0450163 |         0.0132037 |          0.0099102 |
| 1290 | head.anchor_encoder.yaw_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.3385556 |    0.8886303 |     5.1872282 |         5.1872282 |       -5.1872282 |         0.0000000 |        2.4994709 |         2.6543946 |        -0.5310659 |          0.2340356 |
| 1291 | head.anchor_encoder.yaw_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.8868967 |    0.1022658 |     1.7448064 |         1.7448064 |        0.0000000 |         0.0000000 |        2.4994709 |         2.6543946 |         0.2552986 |          0.2340356 |
| 1292 | head.anchor_encoder.yaw_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 32])         | qint8         | 1.0000000 |  0.7888838 |    0.2287152 |     4.1820979 |         4.1820979 |       -0.9535290 |        -0.7893574 |        4.9159632 |         4.8750744 |         0.0692666 |          0.0587316 |
| 1293 | head.anchor_encoder                            | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 3])          | qint16        | 1.0000000 |  0.9285017 |    1.5524579 |    55.5478058 |        55.5478058 |      -47.7422600 |       -44.6803055 |       14.6941509 |        11.6387129 |        -5.0444326 |         -4.7537117 |
| 1294 | head.anchor_encoder.vel_fc.0                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.6465322 |    3.2768621 |    30.4683266 |        30.4683266 |      -28.4623699 |         0.0000000 |       25.8396416 |        26.4046764 |        -0.2859682 |          2.5075645 |
| 1295 | head.anchor_encoder.vel_fc.1                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9435682 |    0.4861810 |    25.9380112 |        25.9380112 |        0.0000000 |         0.0000000 |       25.8396416 |        26.4046764 |         2.5047131 |          2.5075645 |
| 1296 | head.anchor_encoder.vel_fc.2                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9004208 |    0.1680646 |     4.2221370 |         4.2221370 |       -0.9045011 |        -0.8907852 |        3.5515747 |         3.5641067 |         0.0199599 |          0.0183843 |
| 1297 | head.anchor_encoder.vel_fc.3                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.4819177 |    0.4348050 |     3.4213495 |         3.4213495 |       -2.5141940 |         0.0000000 |        3.1400752 |         2.5125215 |        -0.1404278 |          0.2119002 |
| 1298 | head.anchor_encoder.vel_fc.4                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8805008 |    0.0824327 |     2.9357975 |         2.9357975 |        0.0000000 |         0.0000000 |        3.1400752 |         2.5125215 |         0.2119445 |          0.2119002 |
| 1299 | head.anchor_encoder.vel_fc.5                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8674863 |    0.2266362 |     4.8789644 |         4.8789644 |       -0.9133517 |        -0.8796721 |        4.7071753 |         4.1728287 |         0.0365095 |          0.0341366 |
| 1300 | head.anchor_encoder.vel_fc.6                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.5671916 |    0.7014434 |     5.1053767 |         5.1053767 |       -4.7890606 |         0.0000000 |        3.6513975 |         3.6084099 |        -0.1671992 |          0.3802987 |
| 1301 | head.anchor_encoder.vel_fc.7                   | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8927708 |    0.1437963 |     3.6084099 |         3.6084099 |        0.0000000 |         0.0000000 |        3.6513975 |         3.6084099 |         0.3904482 |          0.3802987 |
| 1302 | head.anchor_encoder.vel_fc.8                   | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8676253 |    0.2262441 |     5.6452675 |         5.6452675 |       -0.8886357 |        -0.8634924 |        5.1923189 |         4.9472480 |         0.0210529 |          0.0201787 |
| 1303 | head.anchor_encoder.vel_fc.9                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.6003493 |    0.6950305 |     5.8914766 |         5.8914766 |       -4.5669951 |         0.0000000 |        4.4848666 |         4.7178097 |        -0.2073602 |          0.3416508 |
| 1304 | head.anchor_encoder.vel_fc.10                  | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.9049423 |    0.1357431 |     4.3958683 |         4.3958683 |        0.0000000 |         0.0000000 |        4.4848666 |         4.7178097 |         0.3519274 |          0.3416508 |
| 1305 | head.anchor_encoder.vel_fc.11                  | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 64])         | qint8         | 1.0000000 |  0.8724720 |    0.2105637 |     5.8048687 |         5.8048687 |       -0.7895043 |        -0.7134889 |        5.4814744 |         5.5435214 |         0.0251213 |          0.0260748 |
| 1306 | head.anchor_encoder.cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8959370 |    0.1665488 |     5.8048687 |         5.8048687 |       -1.0995392 |        -1.1195819 |        7.4431357 |         7.4120479 |         0.0316397 |          0.0311823 |
| 1307 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 256, 512])        | qint16        | 0.0001526 |  0.4880881 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
| 1308 | head.layers.35.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.8660116 |    0.2971058 |     5.8048687 |     12680.7356740 |       -4.4647574 |        -4.0695052 |        7.4431357 |         7.4120479 |         0.0141443 |          0.0141584 |
| 1309 | head.layers.35.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 256, 512])        | qint16        | 0.0001786 |  0.9560770 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
| 1310 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8660588 |    0.2971058 |     5.8048687 |     12680.7356740 |       -4.4647574 |        -4.0695052 |        7.4431357 |         7.4120479 |         0.0141443 |          0.0141584 |
| 1311 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.9560735 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
| 1312 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.4880882 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
| 1313 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8660588 |    0.2971058 |     5.8048687 |     12680.7356740 |       -4.4647574 |        -4.0695052 |        7.4431357 |         7.4120479 |         0.0141443 |          0.0141584 |
| 1314 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001786 |  0.9560735 |    0.0497444 |     5.8518438 |     32764.2596189 |       -4.7621317 |        -3.6930170 |        5.2373042 |         5.2219696 |         0.0160664 |          0.0160799 |
| 1315 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 26, 512])        | qint16        | 0.0001526 |  0.4880882 |    0.0165006 |     1.5584624 |     10213.3832186 |       -1.4418186 |        -1.1844705 |        1.1682864 |         0.7369443 |        -0.0007690 |         -0.0008103 |
| 1316 | head.layers.35.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9230598 |    0.8009691 |     8.8224993 |         8.8224993 |      -12.6036081 |       -12.7502766 |       12.6302118 |        12.4478397 |         0.0226275 |          0.0070341 |
| 1317 | head.layers.35.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.9668116 |    0.1668052 |    10.5402470 |        10.5402470 |       -9.2692156 |        -9.3197546 |       11.0505905 |        11.0505896 |        -0.0374636 |         -0.0370824 |
| 1318 | head.layers.35.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([256, 26, 512])        | qint8         | 1.0000000 |  0.5472631 |    0.0216601 |     1.5019729 |         1.5019729 |       -1.1970694 |        -0.8642538 |        1.3380075 |         1.1766198 |        -0.0001742 |         -0.0008964 |
| 1319 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.9230598 |    0.8009691 |     8.8224993 |         8.8224993 |      -12.6036081 |       -12.7502766 |       12.6302118 |        12.4478397 |         0.0226275 |          0.0070341 |
| 1320 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.9230867 |    0.8009691 |     8.8224993 |         8.8224993 |      -12.6036081 |       -12.7502766 |       12.6302118 |        12.4478397 |         0.0226275 |          0.0070341 |
| 1321 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1322 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1323 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([256, 208, 64])        | qint8         | 1.0000000 | -0.0119769 |    1.7554133 |    11.0397434 |        11.0397434 |       -1.1970694 |        -9.3197546 |        1.3380075 |        11.0505896 |        -0.0001742 |         -0.0370824 |
| 1324 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 256, 64])        | qint8         | 1.0000000 | -0.0119769 |    1.7554133 |    11.0397434 |        11.0397434 |       -1.1970694 |        -9.3197546 |        1.3380075 |        11.0505896 |        -0.0001742 |         -0.0370824 |
| 1325 | head.layers.35.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.9230867 |    0.1001211 |     1.1028124 |         8.8224993 |       -1.5754510 |        -1.5937846 |        1.5787765 |         1.5559800 |         0.0028284 |          0.0008793 |
| 1326 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1327 | head.layers.35.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.8915691 |    6.5603514 |   162.1132965 |       162.1132965 |     -131.9800262 |       -96.1774750 |      152.2860718 |       128.0946960 |         4.3385024 |          3.4990182 |
| 1328 | head.layers.35.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.2538313 |    0.0006898 |     1.0000000 |         1.0000000 |        0.0000000 |         0.0000000 |        1.0000000 |         0.0312500 |         0.0039062 |          0.0034716 |
| 1329 | head.layers.35.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 256])       | qint8         | 1.0000000 |  0.2538313 |    0.0006898 |     1.0000000 |         1.0000000 |        0.0000000 |         0.0000000 |        1.0000000 |         0.0312500 |         0.0039062 |          0.0034716 |
| 1330 | head.layers.35.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.3582836 |    0.0207071 |     1.4554229 |         1.4554229 |       -1.0324014 |        -0.6341072 |        1.1311046 |         0.9279677 |         0.0000492 |         -0.0003411 |
| 1331 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1332 | head.layers.35.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1333 | head.layers.35.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.6719071 |    0.0370298 |     1.5193603 |         1.5193603 |       -1.5777169 |        -0.8223813 |        1.4785497 |         0.8945708 |         0.0102922 |          0.0101157 |
| 1334 | head.layers.35.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 256])     | qint8         | 1.0000000 |  0.2538313 |    0.0006898 |     1.0000000 |         1.0000000 |        0.0000000 |         0.0000000 |        1.0000000 |         0.0312500 |         0.0039062 |          0.0034716 |
| 1335 | head.layers.35.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4337797 |    0.0006058 |     0.5227761 |         0.5227761 |        0.0000000 |         0.0000000 |        0.5295901 |         0.0160163 |         0.0039062 |          0.0034716 |
| 1336 | head.layers.35.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1337 | head.layers.35.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.6719153 |    0.0370298 |     1.5193603 |         1.5193603 |       -1.5777169 |        -0.8223813 |        1.4785497 |         0.8945708 |         0.0102922 |          0.0101157 |
| 1338 | head.layers.35.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.8514432 |    0.3173919 |     6.1073394 |         6.1073394 |       -4.4879236 |        -4.0405946 |        7.1594553 |         7.3053994 |         0.0244366 |          0.0242741 |
| 1339 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.8360500 |    0.3734674 |     5.8526444 |      3835.5305369 |      -11.0516729 |        -9.9905806 |        9.8683167 |        10.0127926 |         0.0375619 |          0.0269776 |
| 1340 | head.fc_before                                 | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 512])        | qint16        | 0.0001526 |  0.7576840 |    0.1206251 |     2.2858949 |     14980.6120390 |       -4.3410230 |        -3.7042270 |        3.1071923 |         2.7358105 |        -0.0021305 |         -0.0013931 |
| 1341 | head.layers.36.query_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0004578 |  0.8685206 |    0.2700081 |     5.8526444 |     12785.1017898 |      -11.0516729 |        -9.9905806 |        9.8683167 |        10.0127926 |         0.0346008 |          0.0290800 |
| 1342 | head.layers.36.key_cat                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint16        | 0.0002277 |  0.8685206 |    0.2700081 |     5.8526444 |     25698.1331924 |      -11.0516729 |        -9.9905806 |        9.8683167 |        10.0127926 |         0.0346008 |          0.0290800 |
| 1343 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8686054 |    0.2700081 |     5.8526444 |     12785.1017898 |      -11.0516729 |        -9.9905806 |        9.8683167 |        10.0127926 |         0.0346008 |          0.0290800 |
| 1344 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0002277 |  0.8686054 |    0.2700081 |     5.8526444 |     25698.1331924 |      -11.0516729 |        -9.9905806 |        9.8683167 |        10.0127926 |         0.0346008 |          0.0290800 |
| 1345 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.7577472 |    0.1206251 |     2.2858949 |     14980.6120390 |       -4.3410230 |        -3.7042270 |        3.1071923 |         2.7358105 |        -0.0021305 |         -0.0013931 |
| 1346 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0004578 |  0.8686054 |    0.2700081 |     5.8526444 |     12785.1017898 |      -11.0516729 |        -9.9905806 |        9.8683167 |        10.0127926 |         0.0346008 |          0.0290800 |
| 1347 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0002277 |  0.8686054 |    0.2700081 |     5.8526444 |     25698.1331924 |      -11.0516729 |        -9.9905806 |        9.8683167 |        10.0127926 |         0.0346008 |          0.0290800 |
| 1348 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 26, 512])        | qint16        | 0.0001526 |  0.7577472 |    0.1206251 |     2.2858949 |     14980.6120390 |       -4.3410230 |        -3.7042270 |        3.1071923 |         2.7358105 |        -0.0021305 |         -0.0013931 |
| 1349 | head.layers.36.attn.q_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9093869 |    0.5088877 |     6.5177851 |         6.5177851 |       -7.5512915 |        -7.4918318 |        9.1193743 |         8.4129791 |         0.0259904 |          0.0284190 |
| 1350 | head.layers.36.attn.k_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.9124265 |    0.5543840 |     7.1546788 |         7.1546788 |      -12.3356323 |       -11.9347086 |       13.6249218 |        14.0048389 |        -0.0220785 |         -0.0258790 |
| 1351 | head.layers.36.attn.v_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.7511192 |    0.1378151 |     5.7132950 |         5.7132950 |       -4.1191769 |        -4.0550170 |        2.5582688 |         2.6521969 |         0.0052296 |          0.0076859 |
| 1352 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.9093869 |    0.5088877 |     6.5177851 |         6.5177851 |       -7.5512915 |        -7.4918318 |        9.1193743 |         8.4129791 |         0.0259904 |          0.0284190 |
| 1353 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.9093674 |    0.5088877 |     6.5177851 |         6.5177851 |       -7.5512915 |        -7.4918318 |        9.1193743 |         8.4129791 |         0.0259904 |          0.0284190 |
| 1354 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0668795 |    1.9417013 |    12.0875549 |        12.0875549 |      -12.3356323 |        -7.4918318 |       13.6249218 |         8.4129791 |        -0.0220785 |          0.0284190 |
| 1355 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0668776 |    1.9417013 |    12.0875549 |        12.0875549 |      -12.3356323 |        -7.4918318 |       13.6249218 |         8.4129791 |        -0.0220785 |          0.0284190 |
| 1356 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([512, 208, 64])        | qint8         | 1.0000000 |  0.0072982 |    1.4474037 |    13.9808674 |        13.9808674 |       -4.1191769 |       -11.9347086 |        2.5582688 |        14.0048389 |         0.0052296 |         -0.0258790 |
| 1357 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.0072977 |    1.4474037 |    13.9808674 |        13.9808674 |       -4.1191769 |       -11.9347086 |        2.5582688 |        14.0048389 |         0.0052296 |         -0.0258790 |
| 1358 | head.layers.36.attn                            | torch.Tensor.mul                                                              | torch.Tensor.mul                                                        | torch.Size([208, 512, 64])        | qint8         | 0.1250000 |  0.9093674 |    0.0636110 |     0.8147231 |         6.5177851 |       -0.9439114 |        -0.9364790 |        1.1399218 |         1.0516224 |         0.0032488 |          0.0035524 |
| 1359 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1360 | head.layers.36.attn.matmul                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.8591559 |    3.3855739 |    57.2143326 |        57.2143326 |      -62.1858215 |       -60.0077477 |       49.4206123 |        56.2250900 |        -0.9614466 |         -1.3919129 |
| 1361 | head.layers.36.attn.softmax                    | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.4135608 |    0.0015838 |     0.9980757 |         0.9980757 |        0.0000000 |         0.0000000 |        0.9980763 |         0.0312500 |         0.0019531 |          0.0010095 |
| 1362 | head.layers.36.attn.attention_drop             | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([208, 512, 512])       | qint8         | 1.0000000 |  0.4135608 |    0.0015838 |     0.9980757 |         0.9980757 |        0.0000000 |         0.0000000 |        0.9980763 |         0.0312500 |         0.0019531 |          0.0010095 |
| 1363 | head.layers.36.attn.attn_matmul                | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.matmul | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.matmul | torch.Size([208, 512, 64])        | qint8         | 1.0000000 |  0.5567281 |    0.1335669 |     2.4728429 |         2.4728429 |       -2.2800524 |        -3.0259974 |        1.9861423 |         1.9022200 |         0.0101084 |          0.0053849 |
| 1364 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1365 | head.layers.36.attn                            | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1366 | head.layers.36.attn.out_proj                   | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([512, 26, 512])        | qint8         | 1.0000000 |  0.5748390 |    0.2356471 |     1.6860160 |         1.6860160 |       -1.7198184 |        -1.6399841 |        1.6096017 |         1.4362413 |         0.0059001 |          0.0113067 |
| 1367 | head.layers.36.attn                            | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 8, 512, 512])     | qint8         | 1.0000000 |  0.4135608 |    0.0015838 |     0.9980757 |         0.9980757 |        0.0000000 |         0.0000000 |        0.9980763 |         0.0312500 |         0.0019531 |          0.0010095 |
| 1368 | head.layers.36.attn.attn_weights_mean          | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mean   | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mean   | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.5704473 |    0.0013164 |     0.2481593 |         0.2481593 |        0.0000000 |         0.0000000 |        0.2599523 |         0.0126358 |         0.0019531 |          0.0010095 |
| 1369 | head.layers.36.attn                            | torch.Tensor.transpose                                                        | torch.Tensor.transpose                                                  | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1370 | head.layers.36.dropout                         | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.5747402 |    0.2356471 |     1.6860160 |         1.6860160 |       -1.7198184 |        -1.6399841 |        1.6096017 |         1.4362413 |         0.0059001 |          0.0113067 |
| 1371 | head.layers.36.add                             | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint16        | 1.0000000 |  0.8311077 |    0.3955697 |     6.0422506 |         6.0422506 |      -10.3223095 |        -9.7033768 |       10.2433662 |         9.8951216 |         0.0405009 |          0.0403867 |
| 1372 | head.fc_after                                  | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint16        | 0.0015259 |  0.9079217 |    0.6048120 |    11.0861645 |      7265.3178900 |      -24.5930691 |       -21.3553696 |       19.2949429 |        18.8503304 |         0.0108344 |          0.0006219 |
| 1373 | head.layers.37                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9048146 |    0.2903774 |     4.1642418 |         4.1642418 |       -6.5821633 |        -6.1967411 |        5.4629703 |         5.3824949 |        -0.0005451 |         -0.0019017 |
| 1374 | head.layers.38.kps_generator.offset            | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 24])         | qint16        | 1.0000000 |  0.9335077 |    0.3349579 |     4.1931906 |         4.1931906 |       -6.1264839 |        -6.8023157 |        5.4765606 |         5.2952695 |        -0.3537993 |         -0.3920669 |
| 1375 | head.layers.38.kps_generator                   | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.9335077 |    0.3349579 |     4.1931906 |         4.1931906 |       -6.1264839 |        -6.8023157 |        5.4765606 |         5.2952695 |        -0.3537993 |         -0.3920669 |
| 1376 | head.layers.38.kps_generator                   | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 3])       | qint16        | 1.0000000 |  0.9456390 |    1.6525263 |   120.3307877 |       120.3307877 |      -61.4154358 |       -61.3173294 |       65.0450211 |        63.5444984 |         3.3838532 |          3.8786988 |
| 1377 | head.layers.38.kps_generator.keypoints_add     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 8, 3])       | qint16        | 1.0000000 |  0.9446465 |    1.7718551 |   123.8387756 |       123.8387756 |      -66.1307831 |       -66.8389587 |       68.9452591 |        67.6826859 |         3.0300541 |          3.4866319 |
| 1378 | head.layers.38.weight_add                      | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8940197 |    0.3849826 |     7.1306992 |         7.1306992 |       -7.0509000 |        -6.6705055 |        8.4642706 |         8.0826006 |         0.0310946 |          0.0292806 |
| 1379 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 3, 4])         | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3522723 |          0.3522723 |
| 1380 | head.layers.38                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 6, 12])           | qint16        | 1.0000000 |  0.9999999 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3522723 |          0.3522723 |
| 1381 | head.layers.38.camera_encoder.0                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.6843171 |    0.4217487 |     5.6705537 |         5.6705537 |       -5.6705537 |         0.0000000 |        6.1848178 |         6.1848178 |        -0.0660872 |          0.3556615 |
| 1382 | head.layers.38.camera_encoder.1                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000005 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0000000 |         0.0000000 |        6.1848178 |         6.1848178 |         0.3556615 |          0.3556615 |
| 1383 | head.layers.38.camera_encoder.2                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.9999989 |    0.0000000 |     0.0000007 |         0.0000007 |       -0.8415864 |        -0.8415864 |        5.2689471 |         5.2689471 |         0.0144101 |          0.0144101 |
| 1384 | head.layers.38.camera_encoder.3                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  0.8946725 |    1.1532757 |    11.8981352 |        11.8981352 |      -11.8981352 |         0.0000000 |       31.1488247 |        31.1488247 |         0.1222667 |          1.2755425 |
| 1385 | head.layers.38.camera_encoder.4                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000005 |    0.0000001 |     0.0000038 |         0.0000038 |        0.0000000 |         0.0000000 |       31.1488247 |        31.1488247 |         1.2755425 |          1.2755425 |
| 1386 | head.layers.38.camera_encoder.5                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 6, 256])          | qint8         | 1.0000000 |  1.0000017 |    0.0000000 |     0.0000010 |         0.0000010 |       -1.5707064 |        -1.5707064 |        7.8239322 |         7.8239322 |         0.0144880 |          0.0144880 |
| 1387 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 1, 256])     | qint8         | 1.0000000 |  0.8940197 |    0.3849826 |     7.1306992 |         7.1306992 |       -7.0509000 |        -6.6705055 |        8.4642706 |         8.0826006 |         0.0310946 |          0.0292806 |
| 1388 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 6, 256])       | qint8         | 1.0000000 |  1.0000017 |    0.0000000 |     0.0000010 |         0.0000010 |       -1.5707064 |        -1.5707064 |        7.8239322 |         7.8239322 |         0.0144880 |          0.0144880 |
| 1389 | head.layers.38.cam_add                         | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 6, 256])     | qint8         | 1.0000000 |  0.9048687 |    0.3849826 |     7.1306992 |         7.1306992 |       -6.6862745 |        -6.3695230 |       14.5647888 |        14.5636063 |         0.0455826 |          0.0437686 |
| 1390 | head.layers.38.weights_fc                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 6, 64])      | qint8         | 1.0000000 |  0.9661776 |    0.3688699 |     4.6803255 |         4.6803255 |       -6.8752646 |        -7.1708159 |       10.6873360 |        11.9221306 |         0.3255630 |          0.3439045 |
| 1391 | head.layers.38                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.9661776 |    0.3688699 |     4.6803255 |         4.6803255 |       -6.8752646 |        -7.1708159 |       10.6873360 |        11.9221306 |         0.3255630 |          0.3439045 |
| 1392 | head.layers.38.weight_softmax                  | torch.nn.modules.activation.Softmax                                           | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8])      | qint8         | 1.0000000 |  0.8697503 |    0.0182307 |     0.8476756 |         0.8476756 |        0.0000002 |         0.0000000 |        0.8789256 |         0.0312500 |         0.0208333 |          0.0026197 |
| 1393 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  0.9366653 |    3.2454629 |   123.8387756 |       123.8387756 |      -66.1307831 |       -66.8389587 |       68.9452591 |        67.6826859 |         5.0577798 |          6.1415877 |
| 1394 | head.layers.38                                 | torch.ones_like                                                               | torch.ones_like                                                         | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
| 1395 | head.layers.38.point_quant_stub                | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([26, 512, 8, 1])       | qint16        | 1.0000000 |  1.0000001 |    0.0000000 |     0.0000000 |         0.0000000 |        1.0000000 |         1.0000000 |        1.0000000 |         1.0000000 |         1.0000000 |          1.0000000 |
| 1396 | head.layers.38.point_cat                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 8, 4])       | qint16        | 1.0000000 |  0.9446542 |    1.3288913 |   123.8387756 |       123.8387756 |      -66.1307831 |       -66.8389587 |       68.9452591 |        67.6826859 |         2.5225406 |          2.8649740 |
| 1397 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 1, 1, 4, 4])   | qint16        | 1.0000000 |  0.9999996 |    0.0000000 |     0.0000000 |         0.0000000 |      -10.4531412 |       -10.4531412 |       27.1799469 |        27.1799469 |         0.3267042 |          0.3267042 |
| 1398 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 1, 512, 8, 1, 4]) | qint16        | 1.0000000 |  0.9446542 |    1.3288913 |   123.8387756 |       123.8387756 |      -66.1307831 |       -66.8389587 |       68.9452591 |        67.6826859 |         2.5225406 |          2.8649740 |
| 1399 | head.layers.38.point_matmul                    | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 4, 4]) | qint16        | 1.0000000 |  0.9652429 |    0.5970316 |   233.8100891 |       233.8100891 |     -173.2050323 |      -172.3297119 |      577.0599976 |       555.9821167 |         1.0636889 |          1.0640264 |
| 1400 | head.layers.38.point_sum                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 6, 512, 8, 4])    | qint16        | 0.0244144 |  0.9657287 |    1.9687290 |   257.4279785 |     10544.0891100 |     -175.4652100 |      -174.7689362 |      626.6557007 |       627.4861450 |         4.2547555 |          4.2561049 |
| 1401 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.9389012 |    2.3253419 |   123.6234589 |      5063.5396115 |      -71.3926392 |       -72.1058121 |       79.5332336 |        77.5108490 |         1.1365771 |          1.1333164 |
| 1402 | head.layers.38                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0244144 |  0.9605180 |    1.2635959 |    77.2583923 |      3164.4554642 |        0.0100000 |         0.0100000 |       79.5332336 |        77.5108490 |        11.5359221 |         11.5748405 |
| 1403 | head.layers.38.reciprocal_op                   | horizon_plugin_pytorch.nn.reciprocal.Reciprocal                               | horizon_plugin_pytorch.nn.qat.segment_lut.SegmentLUT                    | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.9525970 |   41.3113480 |    99.9838638 |     32762.2125883 |        0.0125734 |         0.0129014 |      100.0000000 |         1.2799804 |        41.9407501 |          0.6503441 |
| 1404 | head.layers.38                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 1])    | qint16        | 0.0030518 |  0.9525970 |   41.3113480 |    99.9838638 |     32762.2125883 |        0.0125734 |         0.0129014 |      100.0000000 |         1.2799804 |        41.9407501 |          0.6503441 |
| 1405 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0244144 |  0.9707067 |    2.7747874 |   257.4279785 |     10544.0891100 |     -175.4652100 |      -174.7689362 |      626.6557007 |       627.4861450 |         7.4412217 |          7.4455528 |
| 1406 | head.layers.38.point_mul                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.8997490 | 1475.5001221 | 62009.8632812 | 203190819.5541450 |   -17546.5214844 |      -223.7008209 |    62665.5703125 |       803.1699829 |      1054.1228027 |         12.9954033 |
| 1407 | head.layers.38                                 | torch.clamp                                                                   | torch.clamp                                                             | torch.Size([26, 6, 512, 8, 2])    | qint16        | 0.0003052 |  0.5727104 |  177.0679626 |   603.5178223 |   1977577.0245693 |     -500.0000000 |      -223.7008209 |      500.0000000 |       500.0000000 |       106.2985077 |         12.6178493 |
| 1408 | head.layers.38                                 | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([156, 512, 8, 2])      | qint16        | 0.0003052 |  0.5727104 |  177.0679626 |   603.5178223 |   1977577.0245693 |     -500.0000000 |      -223.7008209 |      500.0000000 |       500.0000000 |       106.2985077 |         12.6178493 |
| 1409 | head.layers.38                                 | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer            | horizon_plugin_pytorch.nn.grid_sample.autocasted_grid_sample_outer      | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.6607866 |    0.5469114 |    50.6698799 |        50.6698799 |      -56.7030945 |       -56.5155716 |       50.8610039 |        48.8257332 |         0.0161847 |          0.0362761 |
| 1410 | head.layers.38.feat_cat                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([156, 256, 512, 8])    | qint8         | 1.0000000 |  0.6607866 |    0.5469114 |    50.6698799 |        50.6698799 |      -56.7030945 |       -56.5155716 |       50.8610039 |        48.8257332 |         0.0161847 |          0.0362761 |
| 1411 | head.layers.38                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 6, 256, 512, 8])  | qint8         | 1.0000000 |  0.6607866 |    0.5469114 |    50.6698799 |        50.6698799 |      -56.7030945 |       -56.5155716 |       50.8610039 |        48.8257332 |         0.0161847 |          0.0362761 |
| 1412 | head.layers.38                                 | torch.Tensor.permute                                                          | torch.Tensor.permute                                                    | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.6606361 |    0.5469114 |    50.6698799 |        50.6698799 |      -56.7030945 |       -56.5155716 |       50.8610039 |        48.8257332 |         0.0161847 |          0.0362761 |
| 1413 | head.layers.38                                 | torch.Tensor.contiguous                                                       | torch.Tensor.contiguous                                                 | torch.Size([26, 512, 6, 8, 256])  | qint8         | 1.0000000 |  0.6606361 |    0.5469114 |    50.6698799 |        50.6698799 |      -56.7030945 |       -56.5155716 |       50.8610039 |        48.8257332 |         0.0161847 |          0.0362761 |
| 1414 | head.layers.38                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.6606361 |    0.5469114 |    50.6698799 |        50.6698799 |      -56.7030945 |       -56.5155716 |       50.8610039 |        48.8257332 |         0.0161847 |          0.0362761 |
| 1415 | head.layers.38                                 | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512, 48, 8, 1])   | qint8         | 1.0000000 |  0.8697503 |    0.0182307 |     0.8476756 |         0.8476756 |        0.0000002 |         0.0000000 |        0.8789256 |         0.0312500 |         0.0208333 |          0.0026197 |
| 1416 | head.layers.38                                 | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.6606361 |    0.5469114 |    50.6698799 |        50.6698799 |      -56.7030945 |       -56.5155716 |       50.8610039 |        48.8257332 |         0.0161847 |          0.0362761 |
| 1417 | head.layers.38.feat_mul                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 48, 8, 32])  | qint8         | 1.0000000 |  0.3468256 |    0.0089847 |     4.8245754 |         4.8245754 |       -4.6549964 |        -0.9142435 |        4.8212671 |         0.6509347 |        -0.0001267 |          0.0000638 |
| 1418 | head.layers.38                                 | torch.Tensor.view                                                             | torch.Tensor.view                                                       | torch.Size([26, 512, 48, 256])    | qint8         | 1.0000000 |  0.3468256 |    0.0089847 |     4.8245754 |         4.8245754 |       -4.6549964 |        -0.9142435 |        4.8212671 |         0.6509347 |        -0.0001267 |          0.0000638 |
| 1419 | head.layers.38.feat_sum                        | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.sum    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.sum    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5145950 |    0.3356423 |     8.2296953 |         8.2296953 |       -8.5389776 |        -1.5812856 |        6.9833956 |         1.4325324 |        -0.0060829 |          0.0030606 |
| 1420 | head.layers.38.output_proj                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5032809 |    0.4936974 |     6.5984097 |         6.5984097 |       -7.0878944 |        -1.5435343 |        7.0558782 |         1.2228863 |        -0.0036626 |          0.0001929 |
| 1421 | head.layers.38.proj_drop                       | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5032809 |    0.4936974 |     6.5984097 |         6.5984097 |       -7.0878944 |        -1.5435343 |        7.0558782 |         1.2228863 |        -0.0036626 |          0.0001929 |
| 1422 | head.layers.38.residual_op                     | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.cat    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.cat    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.6812826 |    0.3920374 |     6.5984097 |         6.5984097 |       -7.0878944 |        -6.1967411 |        7.0558782 |         5.3824949 |        -0.0021038 |         -0.0008544 |
| 1423 | head.layers.39.pre_norm                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 512])        | qint8         | 1.0000000 |  0.7592800 |    0.4338402 |     6.1273561 |         6.1273561 |       -8.2480631 |        -8.2102909 |        7.2772179 |         6.8450541 |        -0.0012029 |         -0.0031129 |
| 1424 | head.layers.39.layers.0.0                      | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.1997430 |    2.4006758 |    12.4989586 |        12.4989586 |      -11.9220438 |         0.0000000 |       13.1719160 |        10.4406681 |        -1.5450952 |          0.3176708 |
| 1425 | head.layers.39.layers.0.2                      | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 1024])       | qint8         | 1.0000000 |  0.5726343 |    0.4153463 |    10.0505543 |        10.0505543 |        0.0000000 |         0.0000000 |       13.1719160 |        10.4406681 |         0.4402343 |          0.3176708 |
| 1426 | head.layers.39.layers.1                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6810189 |    3.3233504 |    45.9535179 |        45.9535179 |      -41.9120522 |       -54.5162277 |       38.4383736 |        46.7409248 |         0.2112903 |          0.2116069 |
| 1427 | head.layers.39.layers.2                        | torch.nn.modules.dropout.Dropout                                              | torch.nn.modules.dropout.Dropout                                        | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6810189 |    3.3233504 |    45.9535179 |        45.9535179 |      -41.9120522 |       -54.5162277 |       38.4383736 |        46.7409248 |         0.2112903 |          0.2116069 |
| 1428 | head.layers.39.identity_fc                     | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.linear.Identity                               | mismatch shape                    | qint8         | 1.0000000 |            |              |               |                   |                  |                   |                  |                   |                   |                    |
| 1429 | head.layers.39.short_add                       | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.linear.LinearAdd                          | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7287286 |    4.3105397 |    49.2313232 |        49.2313232 |      -44.3080902 |       -59.4530258 |       38.4768066 |        54.5619888 |         0.2451589 |          0.2688501 |
| 1430 | head.layers.40                                 | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8099501 |    0.4459920 |     4.7385025 |         4.7385025 |       -4.7168055 |        -3.4163883 |        3.2431469 |         2.9459968 |        -0.0070837 |         -0.0055078 |
| 1431 | head.layers.41.add1                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8527493 |    0.5191544 |     6.8656974 |         6.8656974 |       -4.3422904 |        -3.7749181 |        7.7487783 |         7.8148460 |         0.0245560 |          0.0256746 |
| 1432 | head.layers.41.layers.0                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.3672154 |    1.1215774 |     7.6528955 |         7.6528955 |       -7.6528955 |         0.0000000 |        8.2894030 |         7.6746831 |        -0.4189046 |          0.3437986 |
| 1433 | head.layers.41.layers.1                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7320881 |    0.2917697 |     5.6635208 |         5.6635208 |        0.0000000 |         0.0000000 |        8.2894030 |         7.6746831 |         0.4109031 |          0.3437986 |
| 1434 | head.layers.41.layers.2                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.1894092 |    1.1020440 |     9.0484180 |         9.0484180 |       -9.0484180 |         0.0000000 |        7.9541507 |         5.6615219 |        -0.7294859 |          0.1596343 |
| 1435 | head.layers.41.layers.3                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5937434 |    0.1716640 |     6.1574292 |         6.1574292 |        0.0000000 |         0.0000000 |        7.9541507 |         5.6615219 |         0.2008942 |          0.1596343 |
| 1436 | head.layers.41.layers.4                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6398500 |    0.3940790 |     8.9356356 |         8.9356356 |       -0.6094676 |        -0.6084515 |        8.4958811 |         8.6124792 |         0.0315773 |          0.0326021 |
| 1437 | head.layers.41.layers.5                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.1963569 |    1.3321058 |     8.3704453 |         8.3704453 |       -6.7255950 |         0.0000000 |        6.1342793 |         5.7036715 |        -0.8591025 |          0.2352762 |
| 1438 | head.layers.41.layers.6                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.6262276 |    0.2287930 |     5.7036715 |         5.7036715 |        0.0000000 |         0.0000000 |        6.1342793 |         5.7036715 |         0.2442101 |          0.2352762 |
| 1439 | head.layers.41.layers.7                        | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.3647002 |    0.8320561 |     9.4536295 |         9.4536295 |       -4.3109875 |         0.0000000 |       17.1584873 |        14.2810211 |        -0.5319792 |          0.1498296 |
| 1440 | head.layers.41.layers.8                        | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7956219 |    0.1416968 |     9.4536295 |         9.4536295 |        0.0000000 |         0.0000000 |       17.1584873 |        14.2810211 |         0.1583801 |          0.1498296 |
| 1441 | head.layers.41.layers.9                        | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7203000 |    0.3330355 |    12.1896572 |        12.1896572 |       -0.9825494 |        -0.7651547 |       13.2662134 |        13.3164721 |         0.0210264 |          0.0204748 |
| 1442 | head.layers.41.layers.10                       | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 11])         | qint8         | 1.0000000 |  0.4903299 |    0.8226603 |    16.6280861 |        16.6280861 |      -10.4032784 |        -6.9201198 |        9.4736414 |         9.7409763 |         0.1117469 |          0.1191903 |
| 1443 | head.layers.41.layers.11.scale_quant_stub      | horizon_plugin_pytorch.quantization.stubs.QuantStub                           | horizon_plugin_pytorch.nn.qat.stubs.QuantStub                           | torch.Size([11])                  | qint16        | 1.0000000 |  1.0000000 |    0.0000000 |     0.0000000 |         0.0000000 |        0.0019809 |         0.0019809 |        0.9931466 |         0.9931466 |         0.1159605 |          0.1159605 |
| 1444 | head.layers.41.layers.11.mul                   | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.mul    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.mul    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.8261927 |    0.0340118 |     2.4167817 |         2.4167817 |       -1.5120472 |        -1.4328636 |        1.3769306 |         1.4157860 |        -0.0127853 |         -0.0160260 |
| 1445 | head.layers.41.add2                            | horizon_plugin_pytorch.nn.quantized.functional_modules.FloatFunctional.add    | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 11])         | qint16        | 1.0000000 |  0.9412391 |    0.9495513 |   120.3196182 |       120.3196182 |      -61.4028854 |       -61.3333740 |       65.1050568 |        63.5411987 |        -0.3585957 |         -0.0920971 |
| 1446 | head.layers.41.cls_layers.0                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5628733 |    1.7641009 |    12.6710701 |        12.6710701 |       -9.3639059 |         0.0000000 |       10.5767813 |        11.0957060 |        -0.4791520 |          0.8571646 |
| 1447 | head.layers.41.cls_layers.1                    | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8562639 |    0.4958507 |     9.3160896 |         9.3160896 |        0.0000000 |         0.0000000 |       10.5767813 |        11.0957060 |         0.7890982 |          0.8571646 |
| 1448 | head.layers.41.cls_layers.2                    | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8196029 |    0.3110953 |     6.7520800 |         6.7520800 |       -0.7336861 |        -0.7249477 |        6.7383223 |         6.5134048 |         0.0412451 |          0.0438390 |
| 1449 | head.layers.41.cls_layers.3                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4639918 |    1.6323586 |    12.1253080 |        12.1253080 |       -9.6074190 |         0.0000000 |       14.5188103 |        14.0975637 |        -0.8494029 |          0.3778844 |
| 1450 | head.layers.41.cls_layers.4                    | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8439459 |    0.2945450 |    11.2290630 |        11.2290630 |        0.0000000 |         0.0000000 |       14.5188103 |        14.0975637 |         0.4884108 |          0.3778844 |
| 1451 | head.layers.41.cls_layers.5                    | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8156661 |    0.2864792 |     7.8300552 |         7.8300552 |       -0.7931375 |        -0.5636569 |        9.4109316 |         9.3744783 |         0.0262400 |          0.0247908 |
| 1452 | head.layers.41.cls_layers.6                    | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 4])          | torch.float32 |           |  0.9730930 |    1.2423971 |     6.9754410 |                   |       -8.1858149 |        -7.5685663 |        3.6345825 |        -1.1262517 |        -5.8551311 |         -5.3033013 |
| 1453 | head.layers.41.quality_layers.0                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.4867732 |    1.7252045 |    12.4666853 |        12.4666853 |      -12.4666853 |         0.0000000 |       10.6710882 |         9.9156342 |        -0.5841814 |          0.7106777 |
| 1454 | head.layers.41.quality_layers.1                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8490651 |    0.4168126 |     8.4291277 |         8.4291277 |        0.0000000 |         0.0000000 |       10.6710882 |         9.9156342 |         0.7242106 |          0.7106777 |
| 1455 | head.layers.41.quality_layers.2                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.8004553 |    0.3304113 |     6.2745342 |         6.2745342 |       -0.9771113 |        -1.0013809 |        7.6363616 |         6.6215105 |         0.0157157 |          0.0154168 |
| 1456 | head.layers.41.quality_layers.3                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.LinearReLU                         | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.5037273 |    1.4639852 |    15.0159855 |        15.0159855 |      -11.6416178 |         0.0000000 |       29.3735619 |        22.4986343 |        -0.9723557 |          0.2712631 |
| 1457 | head.layers.41.quality_layers.4                | torch.nn.modules.activation.ReLU                                              | horizon_plugin_pytorch.nn.linear.Identity                               | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.9096013 |    0.1769004 |    11.7960062 |        11.7960062 |        0.0000000 |         0.0000000 |       29.3735619 |        22.4986343 |         0.3147294 |          0.2712631 |
| 1458 | head.layers.41.quality_layers.5                | torch.nn.modules.normalization.LayerNorm                                      | horizon_plugin_pytorch.nn.qat.functional_modules.FloatFunctional.add    | torch.Size([26, 512, 256])        | qint8         | 1.0000000 |  0.7697990 |    0.1804893 |     6.8674917 |         6.8674917 |       -1.1481067 |        -0.8703682 |        9.3048639 |         7.6022243 |         0.0488719 |          0.0476598 |
| 1459 | head.layers.41.quality_layers.6                | torch.nn.modules.linear.Linear                                                | horizon_plugin_pytorch.nn.qat.linear.Linear                             | torch.Size([26, 512, 2])          | torch.float32 |           |  0.9641055 |    0.8126020 |     7.5848341 |                   |       -4.1956558 |        -2.7318664 |       10.1384583 |         9.5270424 |         2.4316299 |          2.3218615 |
| 1460 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.9412391 |    0.9495513 |   120.3196182 |                   |      -61.4028854 |       -61.3333740 |       65.1050568 |        63.5411987 |        -0.3585957 |         -0.0920971 |
| 1461 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 4])          | torch.float32 |           |  0.9730930 |    1.2423971 |     6.9754410 |                   |       -8.1858149 |        -7.5685663 |        3.6345825 |        -1.1262517 |        -5.8551311 |         -5.3033013 |
| 1462 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 2])          | torch.float32 |           |  0.9641055 |    0.8126020 |     7.5848341 |                   |       -4.1956558 |        -2.7318664 |       10.1384583 |         9.5270424 |         2.4316299 |          2.3218615 |
| 1463 | head.dequant                                   | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 4])          | torch.float32 |           |  0.9730930 |    1.2423971 |     6.9754410 |                   |       -8.1858149 |        -7.5685663 |        3.6345825 |        -1.1262517 |        -5.8551311 |         -5.3033013 |
| 1464 | head.instance_bank.dequant                     | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 11])         | torch.float32 |           |  0.9412391 |    0.9495513 |   120.3196182 |                   |      -61.4028854 |       -61.3333740 |       65.1050568 |        63.5411987 |        -0.3585957 |         -0.0920971 |
| 1465 | head.instance_bank.dequant                     | torch.ao.quantization.stubs.DeQuantStub                                       | horizon_plugin_pytorch.nn.qat.stubs.DeQuantStub                         | torch.Size([26, 512, 256])        | torch.float32 |           |  0.8099501 |    0.4459920 |     4.7385025 |                   |       -4.7168055 |        -3.4163883 |        3.2431469 |         2.9459968 |        -0.0070837 |         -0.0055078 |
| 1466 | head                                           | torch.Tensor.detach                                                           | torch.Tensor.detach                                                     | torch.Size([26, 512, 256])        | torch.float32 |           |  0.8099501 |    0.4459920 |     4.7385025 |                   |       -4.7168055 |        -3.4163883 |        3.2431469 |         2.9459968 |        -0.0070837 |         -0.0055078 |
| 1467 | head                                           | torch.Tensor.detach                                                           | torch.Tensor.detach                                                     | torch.Size([26, 512, 11])         | torch.float32 |           |  0.9412391 |    0.9495513 |   120.3196182 |                   |      -61.4028854 |       -61.3333740 |       65.1050568 |        63.5411987 |        -0.3585957 |         -0.0920971 |
| 1468 | head                                           | torch.Tensor.detach                                                           | torch.Tensor.detach                                                     | torch.Size([26, 512, 4])          | torch.float32 |           |  0.9730930 |    1.2423971 |     6.9754410 |                   |       -8.1858149 |        -7.5685663 |        3.6345825 |        -1.1262517 |        -5.8551311 |         -5.3033013 |
| 1469 | head                                           | torch.Tensor.max                                                              | torch.Tensor.max                                                        | torch.Size([26, 512])             | torch.float32 |           |  0.9482369 |    1.5317476 |     6.9754410 |                   |       -7.3448629 |        -6.9203925 |        3.6345825 |        -1.1262517 |        -4.9900012 |         -4.1703539 |
| 1470 | head                                           | torch.Tensor.sigmoid                                                          | torch.Tensor.sigmoid                                                    | torch.Size([26, 512])             | torch.float32 |           |  0.6908469 |    0.0260463 |     0.9157676 |                   |        0.0006455 |         0.0009865 |        0.9742838 |         0.2448535 |         0.0307596 |          0.0246164 |
| 1471 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 128])             | torch.float32 |           |  0.7561914 |    0.0172814 |     0.0288735 |                   |        0.0009760 |         0.0016658 |        0.0099797 |         0.0302701 |         0.0016016 |          0.0188328 |
| 1472 | head                                           | torch.maximum                                                                 | torch.maximum                                                           | torch.Size([26, 128])             | torch.float32 |           |  0.8529068 |    0.0283118 |     0.5607021 |                   |        0.0156421 |         0.0064661 |        0.5884708 |         0.1697835 |         0.0615486 |          0.0349975 |
| 1473 | head                                           | torch.topk                                                                    | torch.topk                                                              | torch.Size([26, 128])             | torch.float32 |           |  0.8983343 |    0.0619711 |     0.8208282 |                   |        0.0243123 |         0.0115965 |        0.9742838 |         0.2448535 |         0.1203212 |          0.0584839 |
| 1474 | head                                           | torch.Tensor.add                                                              | torch.Tensor.add                                                        | torch.Size([26, 128])             | torch.int64   |           |  0.9995940 |  167.4173737 |   508.0000000 |                   |        0.0000000 |         0.0000000 |    13307.0000000 |     13308.0000000 |      6625.1962891 |       6635.2309570 |
| 1475 | head                                           | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([3328])                | torch.int64   |           |  0.9995940 |  167.4173737 |   508.0000000 |                   |        0.0000000 |         0.0000000 |    13307.0000000 |     13308.0000000 |      6625.1962891 |       6635.2309570 |
| 1476 | head                                           | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([13312, 256])          | torch.float32 |           |  0.8099501 |    0.4459920 |     4.7385025 |                   |       -4.7168055 |        -3.4163883 |        3.2431469 |         2.9459968 |        -0.0070837 |         -0.0055078 |
| 1477 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([3328, 256])           | torch.float32 |           |  0.6625192 |    0.5821832 |     5.9504137 |                   |       -4.7168055 |        -3.4163883 |        3.2431469 |         2.9459968 |        -0.0068622 |         -0.0050558 |
| 1478 | head                                           | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 128, 256])        | torch.float32 |           |  0.6625192 |    0.5821832 |     5.9504137 |                   |       -4.7168055 |        -3.4163883 |        3.2431469 |         2.9459968 |        -0.0068622 |         -0.0050558 |
| 1479 | head                                           | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([13312, 11])           | torch.float32 |           |  0.9412391 |    0.9495513 |   120.3196182 |                   |      -61.4028854 |       -61.3333740 |       65.1050568 |        63.5411987 |        -0.3585957 |         -0.0920971 |
| 1480 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([3328, 11])            | torch.float32 |           |  0.2552681 |    7.3500738 |   123.2854614 |                   |      -61.4028854 |       -60.4245796 |       64.0388107 |        63.5411987 |        -3.1804247 |         -1.1374385 |
| 1481 | head                                           | torch.Tensor.reshape                                                          | torch.Tensor.reshape                                                    | torch.Size([26, 128, 11])         | torch.float32 |           |  0.2552681 |    7.3500738 |   123.2854614 |                   |      -61.4028854 |       -60.4245796 |       64.0388107 |        63.5411987 |        -3.1804247 |         -1.1374385 |
| 1482 | head                                           | torch.cat                                                                     | torch.cat                                                               | torch.Size([26, 512])             | torch.float32 |           |  0.8576225 |    0.0477782 |     0.9345036 |                   |        0.0243123 |         0.0107497 |        0.9829622 |         0.2873255 |         0.0989149 |          0.0520301 |
| 1483 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 384])             | torch.float32 |           |  0.8685879 |    0.0510926 |     0.9345036 |                   |        0.0243123 |         0.0107497 |        0.9825479 |         0.2873255 |         0.1037568 |          0.0532589 |
| 1484 | head                                           | torch.cat                                                                     | torch.cat                                                               | torch.Size([26, 512, 256])        | torch.float32 |           |  0.7018991 |    0.2149934 |     5.9504137 |                   |       -4.7621317 |        -3.6930170 |        3.3936565 |         3.0826545 |        -0.0025481 |         -0.0020668 |
| 1485 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 384, 256])        | torch.float32 |           |  0.6909100 |    0.2636755 |     5.9504137 |                   |       -4.7621317 |        -3.6423066 |        3.3327384 |         3.0692720 |        -0.0031118 |         -0.0024552 |
| 1486 | head                                           | torch.cat                                                                     | torch.cat                                                               | torch.Size([26, 512, 11])         | torch.float32 |           |  0.5051459 |    3.1295767 |   198.2932281 |                   |     -147.1090393 |      -136.8962708 |       68.6145782 |        65.1100235 |        -2.8653555 |         -1.9119092 |
| 1487 | head                                           | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 384, 11])         | torch.float32 |           |  0.4574913 |    3.6965542 |   191.4260254 |                   |     -147.1090393 |      -136.8962708 |       68.6145782 |        64.9900055 |        -2.9144757 |         -1.8147252 |
| 1488 |                                                | torch.Tensor.sigmoid                                                          | torch.Tensor.sigmoid                                                    | torch.Size([26, 512, 4])          | torch.float32 |           |  0.6903323 |    0.0106708 |     0.9157676 |                   |        0.0002785 |         0.0005162 |        0.9742838 |         0.2448535 |         0.0122643 |          0.0097555 |
| 1489 |                                                | torch.Tensor.flatten                                                          | torch.Tensor.flatten                                                    | torch.Size([26, 2048])            | torch.float32 |           |  0.6903323 |    0.0106708 |     0.9157676 |                   |        0.0002785 |         0.0005162 |        0.9742838 |         0.2448535 |         0.0122643 |          0.0097555 |
| 1490 |                                                | torch.Tensor.topk                                                             | torch.Tensor.topk                                                       | torch.Size([26, 300])             | torch.float32 |           |  0.8820376 |    0.0260813 |     0.8208282 |                   |        0.0043375 |         0.0045361 |        0.9742838 |         0.2448535 |         0.0604636 |          0.0346655 |
| 1491 |                                                | torch.Tensor.remainder                                                        | torch.Tensor.remainder                                                  | torch.Size([26, 300])             | torch.int64   |           |  0.2919419 |    0.9843590 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9835897 |          0.2725641 |
| 1492 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([26, 512])             | torch.float32 |           |  0.6266054 |    0.6702462 |     2.7671461 |                   |       -2.5157180 |        -2.7318664 |        2.0793159 |         1.2760379 |        -0.2127234 |         -0.7017179 |
| 1493 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([26, 300])             | torch.int64   |           |  0.8251277 |  157.9738464 |   501.0000000 |                   |        0.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       320.3242188 |        264.2393494 |
| 1494 |                                                | torch.gather                                                                  | torch.gather                                                            | torch.Size([26, 300])             | torch.float32 |           |  0.2473955 |    0.6757410 |     3.0081410 |                   |       -2.4220190 |        -2.0808165 |        2.0793159 |         1.2760379 |        -0.2869598 |         -0.3891727 |
| 1495 |                                                | torch.Tensor.sigmoid                                                          | torch.Tensor.sigmoid                                                    | torch.Size([26, 300])             | torch.float32 |           |  0.9159131 |    0.1440181 |     0.6147016 |                   |        0.0815090 |         0.1109754 |        0.8888764 |         0.7817746 |         0.4328712 |          0.4233757 |
| 1496 |                                                | torch.Tensor.mul_                                                             | torch.Tensor.mul_                                                       | torch.Size([26, 300])             | torch.float32 |           |  0.7618514 |    0.0131701 |     0.7499396 |                   |        0.0005285 |         0.0006152 |        0.8314397 |         0.1580546 |         0.0279318 |          0.0158431 |
| 1497 |                                                | torch.sort                                                                    | torch.sort                                                              | torch.Size([26, 300])             | torch.float32 |           |  0.7786123 |    0.0123270 |     0.7472601 |                   |        0.0005285 |         0.0006152 |        0.8314397 |         0.1580546 |         0.0279318 |          0.0158431 |
| 1498 |                                                | torch.gather                                                                  | torch.gather                                                            | torch.Size([26, 300])             | torch.int64   |           |  0.3206320 |    0.9553846 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9835897 |          0.2725641 |
| 1499 |                                                | torch.gather                                                                  | torch.gather                                                            | torch.Size([26, 300])             | torch.int64   |           |  0.8201993 |  637.2433472 |  2008.0000000 |                   |        0.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1282.2805176 |       1057.2299805 |
| 1500 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1989508 |    0.9133334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9100000 |          0.1300000 |
| 1501 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8166119 |    0.0182597 |     0.6785259 |                   |        0.0037337 |         0.0030944 |        0.7783458 |         0.1455895 |         0.0369369 |          0.0186772 |
| 1502 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8033233 |  658.2466431 |  1946.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1281.8967285 |        998.2766724 |
| 1503 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8033670 |  164.4766693 |   486.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       320.2466736 |        249.5366669 |
| 1504 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1233341 |    8.1458378 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.6429973 |          0.0932821 |
| 1505 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.0843650 |    0.0349252 |     0.1600514 |                   |       -0.0090122 |        -0.1396332 |        0.0204182 |         0.0102752 |         0.0003658 |         -0.0341987 |
| 1506 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9999402 |    0.0068505 |     0.0639479 |                   |       -1.0029564 |        -1.0242544 |       -0.9603065 |        -0.9958194 |        -0.9998611 |         -1.0061709 |
| 1507 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           | -0.0141387 |    3.1851006 |     6.2830973 |                   |       -3.1415846 |        -3.1415923 |        3.1415882 |         3.1415823 |         0.5441739 |         -1.3491100 |
| 1508 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0709090 |   19.5555096 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.1030698 |          4.7088423 |
| 1509 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9577749 |    0.1719118 |     1.0565728 |                   |        0.4474396 |         0.4148016 |        2.5905423 |         2.6011131 |         1.0030488 |          0.9261072 |
| 1510 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8896973 |    0.7002750 |     8.7612753 |                   |        1.5643018 |         1.5140703 |       13.3370028 |        13.4787331 |         3.2388759 |          2.9300835 |
| 1511 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           | -0.0141387 |    3.1851006 |     6.2830973 |                   |       -3.1415846 |        -3.1415923 |        3.1415882 |         3.1415823 |         0.5441739 |         -1.3491100 |
| 1512 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2320547 |   10.1267281 |    55.4087448 |                   |      -44.4999542 |       -43.2559662 |       11.1263418 |        11.6330271 |        -8.2578030 |         -4.9461255 |
| 1513 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1335734 |    9.4332638 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.0821817 |          0.6729291 |
| 1514 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1335734 |    9.4332638 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.0821815 |          0.6729292 |
| 1515 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8166119 |    0.0182597 |     0.6785259 |                   |        0.0037337 |         0.0030944 |        0.7783458 |         0.1455895 |         0.0369369 |          0.0186772 |
| 1516 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1989508 |    0.9133334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9100000 |          0.1300000 |
| 1517 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6575191 |    0.7466667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         1.1033334 |          0.7766667 |
| 1518 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8000495 |    0.0086886 |     0.5172135 |                   |        0.0027326 |         0.0006495 |        0.5930055 |         0.0757921 |         0.0169545 |          0.0082659 |
| 1519 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8851986 |  536.6599731 |  1726.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2043.0000000 |      1287.5034180 |       1261.3366699 |
| 1520 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8850596 |  134.1600037 |   431.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       510.0000000 |       321.6000061 |        315.1400146 |
| 1521 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.2823218 |    7.6270146 |   121.5112457 |                   |      -60.2010574 |       -60.7261200 |       65.1050568 |        62.9007950 |        -1.9817518 |          0.0078062 |
| 1522 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.0330885 |    0.0014797 |     0.0255509 |                   |       -0.0068403 |        -0.0037464 |        0.0126832 |         0.0258955 |         0.0003315 |          0.0001071 |
| 1523 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9999979 |    0.0014900 |     0.0106896 |                   |       -1.0039302 |        -1.0055159 |       -0.9958704 |        -0.9902000 |        -1.0000366 |         -0.9992668 |
| 1524 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0400406 |    3.0155976 |     6.2831197 |                   |       -3.1415763 |        -3.1415820 |        3.1415880 |         3.1415920 |         0.9212026 |         -0.0419964 |
| 1525 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1321783 |   19.8885212 |   121.5112457 |                   |      -60.2010574 |       -60.7261200 |       65.1050568 |        62.9007950 |         1.4475124 |         10.2973108 |
| 1526 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9833621 |    0.0854303 |     0.9952265 |                   |        0.4686526 |         0.4564798 |        2.5396595 |         2.5477037 |         0.9243045 |          0.9546240 |
| 1527 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.9507249 |    0.3349642 |     8.0274029 |                   |        1.5978398 |         1.5785074 |       12.6753550 |        12.7777290 |         2.8473761 |          2.9864562 |
| 1528 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0400406 |    3.0155976 |     6.2831197 |                   |       -3.1415763 |        -3.1415820 |        3.1415880 |         3.1415920 |         0.9212026 |         -0.0419964 |
| 1529 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.5557066 |    7.9907804 |    54.9948311 |                   |      -44.2241440 |       -43.5119247 |       12.0543098 |        10.9011688 |        -9.3050041 |        -10.8902588 |
| 1530 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.2874721 |    8.7658396 |   121.5112457 |                   |      -60.2010574 |       -60.7261200 |       65.1050568 |        62.9007950 |        -1.4109147 |          0.7138527 |
| 1531 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.2874721 |    8.7658396 |   121.5112457 |                   |      -60.2010574 |       -60.7261200 |       65.1050568 |        62.9007950 |        -1.4109147 |          0.7138529 |
| 1532 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8000495 |    0.0086886 |     0.5172135 |                   |        0.0027326 |         0.0006495 |        0.5930055 |         0.0757921 |         0.0169545 |          0.0082659 |
| 1533 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.6575191 |    0.7466667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         1.1033334 |          0.7766666 |
| 1534 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1400280 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9533334 |          0.0933333 |
| 1535 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8046563 |    0.0138567 |     0.4653685 |                   |        0.0032096 |         0.0035264 |        0.5447856 |         0.0794171 |         0.0323536 |          0.0184993 |
| 1536 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7958323 |  669.5533447 |  1950.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1277.8066406 |        970.4533691 |
| 1537 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7959262 |  167.2700043 |   487.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       319.2133484 |        242.5900116 |
| 1538 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1959176 |    7.2240777 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.8474393 |         -0.0288818 |
| 1539 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0409827 |    0.0380040 |     0.1400797 |                   |       -0.0324268 |        -0.1329119 |        0.0738843 |         0.0098130 |         0.0002546 |         -0.0373697 |
| 1540 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9997812 |    0.0103644 |     0.2276714 |                   |       -1.0025852 |        -1.0286509 |       -0.8009794 |        -0.9985102 |        -0.9983708 |         -1.0082958 |
| 1541 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0006087 |    3.1389205 |     6.2831593 |                   |       -3.1415908 |        -3.1415570 |        3.1415787 |         3.1415911 |         0.3139017 |         -1.3671111 |
| 1542 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1687804 |   17.3970661 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -4.2222977 |          3.2136111 |
| 1543 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9597427 |    0.1705891 |     1.0546170 |                   |        0.4500201 |         0.3953664 |        2.5881257 |         2.5805008 |         1.0092543 |          0.9095006 |
| 1544 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8968781 |    0.6812067 |     8.6704025 |                   |        1.5683436 |         1.4849281 |       13.3048115 |        13.2037497 |         3.2565849 |          2.8726444 |
| 1545 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0006087 |    3.1389205 |     6.2831593 |                   |       -3.1415908 |        -3.1415570 |        3.1415787 |         3.1415911 |         0.3139017 |         -1.3671111 |
| 1546 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2492932 |    8.9045086 |    55.2734604 |                   |      -44.9658356 |       -42.8884430 |       14.6895380 |        11.6104450 |        -6.8948622 |         -3.8804569 |
| 1547 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.2060634 |    8.4087276 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.3267822 |          0.5250285 |
| 1548 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.2060634 |    8.4087276 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.3267822 |          0.5250285 |
| 1549 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8046563 |    0.0138567 |     0.4653685 |                   |        0.0032096 |         0.0035264 |        0.5447856 |         0.0794171 |         0.0323536 |          0.0184993 |
| 1550 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1400280 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9533333 |          0.0933333 |
| 1551 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1989508 |    0.9133334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9100000 |          0.1300000 |
| 1552 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8166119 |    0.0182597 |     0.6785259 |                   |        0.0037337 |         0.0030944 |        0.7783458 |         0.1455895 |         0.0369369 |          0.0186772 |
| 1553 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8033233 |  658.2466431 |  1946.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1281.8967285 |        998.2766724 |
| 1554 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8033670 |  164.4766693 |   486.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       320.2466736 |        249.5366669 |
| 1555 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1233341 |    8.1458378 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.6429973 |          0.0932821 |
| 1556 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.0843650 |    0.0349252 |     0.1600514 |                   |       -0.0090122 |        -0.1396332 |        0.0204182 |         0.0102752 |         0.0003658 |         -0.0341987 |
| 1557 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9999402 |    0.0068505 |     0.0639479 |                   |       -1.0029564 |        -1.0242544 |       -0.9603065 |        -0.9958194 |        -0.9998611 |         -1.0061709 |
| 1558 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           | -0.0141387 |    3.1851006 |     6.2830973 |                   |       -3.1415846 |        -3.1415923 |        3.1415882 |         3.1415823 |         0.5441739 |         -1.3491100 |
| 1559 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0709090 |   19.5555096 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.1030698 |          4.7088423 |
| 1560 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9577749 |    0.1719118 |     1.0565728 |                   |        0.4474396 |         0.4148016 |        2.5905423 |         2.6011131 |         1.0030488 |          0.9261072 |
| 1561 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8896973 |    0.7002750 |     8.7612753 |                   |        1.5643018 |         1.5140703 |       13.3370028 |        13.4787331 |         3.2388759 |          2.9300835 |
| 1562 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           | -0.0141387 |    3.1851006 |     6.2830973 |                   |       -3.1415846 |        -3.1415923 |        3.1415882 |         3.1415823 |         0.5441739 |         -1.3491100 |
| 1563 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2320547 |   10.1267281 |    55.4087448 |                   |      -44.4999542 |       -43.2559662 |       11.1263418 |        11.6330271 |        -8.2578030 |         -4.9461255 |
| 1564 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1335734 |    9.4332638 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.0821817 |          0.6729291 |
| 1565 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1335734 |    9.4332638 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.0821815 |          0.6729292 |
| 1566 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8166119 |    0.0182597 |     0.6785259 |                   |        0.0037337 |         0.0030944 |        0.7783458 |         0.1455895 |         0.0369369 |          0.0186772 |
| 1567 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1989508 |    0.9133334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9100000 |          0.1300000 |
| 1568 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0763763 |    0.9533333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9133334 |          0.0800000 |
| 1569 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8422263 |    0.0131390 |     0.5189071 |                   |        0.0029829 |         0.0036528 |        0.6769617 |         0.1580546 |         0.0320869 |          0.0209099 |
| 1570 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7795226 |  714.2466431 |  1918.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2040.0000000 |      1289.0333252 |        897.4266968 |
| 1571 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7796648 |  178.3999939 |   479.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       510.0000000 |       322.0299988 |        224.3366699 |
| 1572 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1725105 |    7.4012976 |   120.5957718 |                   |      -59.5478668 |       -59.3779068 |       64.9226151 |        62.4818993 |        -2.4317753 |          0.1917884 |
| 1573 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.1028324 |    0.0532062 |     0.2314897 |                   |       -0.0264124 |        -0.1440454 |        0.0874443 |         0.0112976 |         0.0004727 |         -0.0524095 |
| 1574 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9998807 |    0.0108123 |     0.1458083 |                   |       -1.0056723 |        -1.0265141 |       -0.8807058 |        -0.9984951 |        -0.9995952 |         -1.0098578 |
| 1575 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0774736 |    2.9014783 |     6.2830715 |                   |       -3.1415873 |        -3.1415880 |        3.1415830 |         3.1415739 |         0.8791361 |         -1.2687421 |
| 1576 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1114318 |   18.4970493 |   120.5957718 |                   |      -59.5478668 |       -59.3779068 |       64.9226151 |        62.4818993 |        -2.1675534 |          3.3689787 |
| 1577 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9600218 |    0.1682247 |     1.0691923 |                   |        0.4614190 |         0.4065219 |        2.5963767 |         2.5955029 |         1.0017194 |          0.8990809 |
| 1578 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8960829 |    0.6624244 |     8.8021564 |                   |        1.5863234 |         1.5015861 |       13.4150429 |        13.4033260 |         3.2233403 |          2.8412199 |
| 1579 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0774736 |    2.9014783 |     6.2830715 |                   |       -3.1415873 |        -3.1415880 |        3.1415830 |         3.1415739 |         0.8791361 |         -1.2687421 |
| 1580 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.3122393 |    8.4514790 |    55.3049431 |                   |      -43.9567947 |       -42.8560905 |       13.2100487 |        11.1066866 |        -7.4176359 |         -3.2107463 |
| 1581 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1827942 |    8.5734329 |   120.5957718 |                   |      -59.5478668 |       -59.3779068 |       64.9226151 |        62.4818993 |        -1.8206410 |          0.7729615 |
| 1582 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1827942 |    8.5734329 |   120.5957718 |                   |      -59.5478668 |       -59.3779068 |       64.9226151 |        62.4818993 |        -1.8206412 |          0.7729614 |
| 1583 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8422263 |    0.0131390 |     0.5189071 |                   |        0.0029829 |         0.0036528 |        0.6769617 |         0.1580546 |         0.0320869 |          0.0209099 |
| 1584 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0763763 |    0.9533333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9133334 |          0.0800000 |
| 1585 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1400280 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9533334 |          0.0933333 |
| 1586 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8046563 |    0.0138567 |     0.4653685 |                   |        0.0032096 |         0.0035264 |        0.5447856 |         0.0794171 |         0.0323536 |          0.0184993 |
| 1587 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7958323 |  669.5533447 |  1950.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1277.8066406 |        970.4533691 |
| 1588 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7959262 |  167.2700043 |   487.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       319.2133484 |        242.5900116 |
| 1589 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1959176 |    7.2240777 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.8474393 |         -0.0288818 |
| 1590 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0409827 |    0.0380040 |     0.1400797 |                   |       -0.0324268 |        -0.1329119 |        0.0738843 |         0.0098130 |         0.0002546 |         -0.0373697 |
| 1591 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9997812 |    0.0103644 |     0.2276714 |                   |       -1.0025852 |        -1.0286509 |       -0.8009794 |        -0.9985102 |        -0.9983708 |         -1.0082958 |
| 1592 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0006087 |    3.1389205 |     6.2831593 |                   |       -3.1415908 |        -3.1415570 |        3.1415787 |         3.1415911 |         0.3139017 |         -1.3671111 |
| 1593 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1687804 |   17.3970661 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -4.2222977 |          3.2136111 |
| 1594 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9597427 |    0.1705891 |     1.0546170 |                   |        0.4500201 |         0.3953664 |        2.5881257 |         2.5805008 |         1.0092543 |          0.9095006 |
| 1595 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8968781 |    0.6812067 |     8.6704025 |                   |        1.5683436 |         1.4849281 |       13.3048115 |        13.2037497 |         3.2565849 |          2.8726444 |
| 1596 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0006087 |    3.1389205 |     6.2831593 |                   |       -3.1415908 |        -3.1415570 |        3.1415787 |         3.1415911 |         0.3139017 |         -1.3671111 |
| 1597 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2492932 |    8.9045086 |    55.2734604 |                   |      -44.9658356 |       -42.8884430 |       14.6895380 |        11.6104450 |        -6.8948622 |         -3.8804569 |
| 1598 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.2060634 |    8.4087276 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.3267822 |          0.5250285 |
| 1599 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.2060634 |    8.4087276 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.3267822 |          0.5250285 |
| 1600 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8046563 |    0.0138567 |     0.4653685 |                   |        0.0032096 |         0.0035264 |        0.5447856 |         0.0794171 |         0.0323536 |          0.0184993 |
| 1601 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1400280 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9533333 |          0.0933333 |
| 1602 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.4901177 |    0.9433333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9266667 |          0.7833334 |
| 1603 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7783321 |    0.0089558 |     0.7194221 |                   |        0.0007084 |         0.0006152 |        0.7987369 |         0.0793148 |         0.0159316 |          0.0069758 |
| 1604 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8879494 |  530.2566528 |  1624.0000000 |                   |        8.0000000 |         0.0000000 |     2040.0000000 |      2040.0000000 |      1294.3266602 |       1249.3566895 |
| 1605 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8878511 |  132.5533295 |   406.0000000 |                   |        2.0000000 |         0.0000000 |      510.0000000 |       510.0000000 |       323.3500061 |        312.1433411 |
| 1606 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.4335753 |    6.8155861 |   119.0592651 |                   |      -60.2451172 |       -59.9069176 |       63.8917389 |        63.5410767 |        -1.8764471 |         -0.4647586 |
| 1607 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.1295341 |    0.0015730 |     0.0718954 |                   |       -0.0042052 |        -0.0028772 |        0.0709645 |         0.0356474 |         0.0005143 |          0.0002714 |
| 1608 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9999744 |    0.0017751 |     0.1168166 |                   |       -1.1136954 |        -1.0099347 |       -0.9731702 |        -0.9924585 |        -1.0000844 |         -0.9992657 |
| 1609 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0401421 |    3.0150831 |     6.2831125 |                   |       -3.1415923 |        -3.1415894 |        3.1415894 |         3.1415918 |         0.2927263 |         -0.3353740 |
| 1610 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2990348 |   17.6236458 |   119.0592651 |                   |      -60.2451172 |       -59.9069176 |       63.8917389 |        63.5410767 |         4.3095856 |          8.8333712 |
| 1611 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9967980 |    0.0424935 |     0.6354233 |                   |        0.4313343 |         0.4571938 |        1.6688511 |         2.1608827 |         0.9000384 |          0.9272118 |
| 1612 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.9943522 |    0.1297595 |     4.0178976 |                   |        1.5393102 |         1.5796349 |        5.3060684 |         8.6787958 |         2.7689166 |          2.8551965 |
| 1613 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0401421 |    3.0150831 |     6.2831125 |                   |       -3.1415923 |        -3.1415894 |        3.1415894 |         3.1415918 |         0.2927263 |         -0.3353740 |
| 1614 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.6458542 |    7.3232298 |    55.4474449 |                   |      -43.0340614 |       -43.6742020 |       12.3044767 |        10.7994738 |       -11.7567396 |        -11.1316996 |
| 1615 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.4367542 |    7.8244987 |   119.0592651 |                   |      -60.2451172 |       -59.9069176 |       63.8917389 |        63.5410767 |        -1.3741987 |          0.1335229 |
| 1616 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.4367542 |    7.8244987 |   119.0592651 |                   |      -60.2451172 |       -59.9069176 |       63.8917389 |        63.5410767 |        -1.3741986 |          0.1335229 |
| 1617 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7783321 |    0.0089558 |     0.7194221 |                   |        0.0007084 |         0.0006152 |        0.7987369 |         0.0793148 |         0.0159316 |          0.0069758 |
| 1618 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.4901177 |    0.9433333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9266667 |          0.7833334 |
| 1619 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1400280 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9533334 |          0.0933333 |
| 1620 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8046563 |    0.0138567 |     0.4653685 |                   |        0.0032096 |         0.0035264 |        0.5447856 |         0.0794171 |         0.0323536 |          0.0184993 |
| 1621 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7958323 |  669.5533447 |  1950.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1277.8066406 |        970.4533691 |
| 1622 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7959262 |  167.2700043 |   487.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       319.2133484 |        242.5900116 |
| 1623 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1959176 |    7.2240777 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.8474393 |         -0.0288818 |
| 1624 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0409827 |    0.0380040 |     0.1400797 |                   |       -0.0324268 |        -0.1329119 |        0.0738843 |         0.0098130 |         0.0002546 |         -0.0373697 |
| 1625 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9997812 |    0.0103644 |     0.2276714 |                   |       -1.0025852 |        -1.0286509 |       -0.8009794 |        -0.9985102 |        -0.9983708 |         -1.0082958 |
| 1626 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0006087 |    3.1389205 |     6.2831593 |                   |       -3.1415908 |        -3.1415570 |        3.1415787 |         3.1415911 |         0.3139017 |         -1.3671111 |
| 1627 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1687804 |   17.3970661 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -4.2222977 |          3.2136111 |
| 1628 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9597427 |    0.1705891 |     1.0546170 |                   |        0.4500201 |         0.3953664 |        2.5881257 |         2.5805008 |         1.0092543 |          0.9095006 |
| 1629 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8968781 |    0.6812067 |     8.6704025 |                   |        1.5683436 |         1.4849281 |       13.3048115 |        13.2037497 |         3.2565849 |          2.8726444 |
| 1630 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0006087 |    3.1389205 |     6.2831593 |                   |       -3.1415908 |        -3.1415570 |        3.1415787 |         3.1415911 |         0.3139017 |         -1.3671111 |
| 1631 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2492932 |    8.9045086 |    55.2734604 |                   |      -44.9658356 |       -42.8884430 |       14.6895380 |        11.6104450 |        -6.8948622 |         -3.8804569 |
| 1632 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.2060634 |    8.4087276 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.3267822 |          0.5250285 |
| 1633 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.2060634 |    8.4087276 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.3267822 |          0.5250285 |
| 1634 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8046563 |    0.0138567 |     0.4653685 |                   |        0.0032096 |         0.0035264 |        0.5447856 |         0.0794171 |         0.0323536 |          0.0184993 |
| 1635 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1400280 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9533333 |          0.0933333 |
| 1636 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0548580 |    1.0300000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9566667 |          0.1000000 |
| 1637 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7670916 |    0.0138573 |     0.7472601 |                   |        0.0038349 |         0.0032168 |        0.8314397 |         0.0841796 |         0.0334670 |          0.0196097 |
| 1638 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7871439 |  697.3033447 |  2006.0000000 |                   |      512.0000000 |         0.0000000 |     2031.0000000 |      2044.0000000 |      1285.8366699 |       1010.2866821 |
| 1639 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7871561 |  174.2466736 |   501.0000000 |                   |      128.0000000 |         0.0000000 |      507.0000000 |       511.0000000 |       321.2200012 |        252.5466766 |
| 1640 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1644198 |    7.7184267 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -2.7052436 |          0.0485694 |
| 1641 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.1689903 |    0.0322387 |     0.1506395 |                   |       -0.0024969 |        -0.1269483 |        0.0236912 |         0.0101548 |         0.0002993 |         -0.0315192 |
| 1642 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9998879 |    0.0084607 |     0.1394534 |                   |       -1.0034873 |        -1.0294310 |       -0.8899776 |        -0.9973001 |        -0.9996479 |         -1.0075480 |
| 1643 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.1535830 |    2.6633875 |     6.2830935 |                   |       -3.1415570 |        -3.1415877 |        3.1415756 |         3.1415925 |         0.3766811 |         -1.3308859 |
| 1644 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1268096 |   18.6543846 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -3.0642457 |          4.1176362 |
| 1645 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9560702 |    0.1801606 |     1.1138730 |                   |        0.4480730 |         0.3745154 |        2.6225846 |         2.5966232 |         1.0114361 |          0.9041463 |
| 1646 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8830637 |    0.7342767 |     9.2503681 |                   |        1.5652931 |         1.4542865 |       13.7712708 |        13.4183502 |         3.2674627 |          2.8456972 |
| 1647 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.1535830 |    2.6633875 |     6.2830935 |                   |       -3.1415570 |        -3.1415877 |        3.1415756 |         3.1415925 |         0.3766811 |         -1.3308859 |
| 1648 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2457193 |    9.4527864 |    55.4647331 |                   |      -43.1821747 |       -43.4529648 |       12.7721672 |        11.5965500 |        -7.5333009 |         -4.4973388 |
| 1649 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1745368 |    8.9187727 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -2.1613572 |          0.6067097 |
| 1650 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1745368 |    8.9187727 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -2.1613572 |          0.6067097 |
| 1651 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7670916 |    0.0138573 |     0.7472601 |                   |        0.0038349 |         0.0032168 |        0.8314397 |         0.0841796 |         0.0334670 |          0.0196097 |
| 1652 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0548580 |    1.0300000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9566666 |          0.1000000 |
| 1653 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.5034865 |    0.8333333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9766667 |          0.5500000 |
| 1654 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7347152 |    0.0029136 |     0.3920860 |                   |        0.0005285 |         0.0007033 |        0.4604718 |         0.0683858 |         0.0084343 |          0.0066495 |
| 1655 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8895273 |  514.8666382 |  2008.0000000 |                   |        0.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1286.0833740 |       1260.8433838 |
| 1656 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8893996 |  128.7233276 |   502.0000000 |                   |        0.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       321.2766724 |        315.0733337 |
| 1657 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1435999 |    8.3228502 |   120.3164978 |                   |      -60.6556091 |       -60.6662750 |       59.4821014 |        61.3528862 |        -2.7284048 |          0.8318331 |
| 1658 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0765441 |    0.0016580 |     0.0694729 |                   |       -0.0101520 |        -0.0027593 |        0.0692994 |         0.0300672 |         0.0003532 |          0.0001967 |
| 1659 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9999697 |    0.0024138 |     0.1062355 |                   |       -1.1064489 |        -1.0019816 |       -0.9512901 |        -0.9916406 |        -0.9997451 |         -0.9993611 |
| 1660 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0800000 |    2.8900244 |     6.2830887 |                   |       -3.1415780 |        -3.1415846 |        3.1415656 |         3.1415880 |        -0.2935408 |         -0.7122919 |
| 1661 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0547061 |   21.6534081 |   120.3164978 |                   |      -60.6556091 |       -60.6662750 |       59.4821014 |        61.3528862 |        -0.6368890 |         11.8634748 |
| 1662 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9918309 |    0.0590938 |     1.0092795 |                   |        0.4587190 |         0.4523272 |        1.9287689 |         2.5326161 |         0.9201108 |          0.9312112 |
| 1663 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.9781594 |    0.2044005 |     7.9988847 |                   |        1.5820462 |         1.5719663 |        6.8810339 |        12.5863914 |         2.8224940 |          2.8937001 |
| 1664 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0800000 |    2.8900244 |     6.2830887 |                   |       -3.1415780 |        -3.1415846 |        3.1415656 |         3.1415880 |        -0.2935408 |         -0.7122919 |
| 1665 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.4801274 |    8.8032627 |    56.9444809 |                   |      -47.7367287 |       -43.4737625 |       14.4296532 |        10.9435291 |        -9.9542427 |         -9.4115772 |
| 1666 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1507088 |    9.4873219 |   120.3164978 |                   |      -60.6556091 |       -60.6662750 |       59.4821014 |        61.3528862 |        -2.3599451 |          1.5324503 |
| 1667 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1507088 |    9.4873219 |   120.3164978 |                   |      -60.6556091 |       -60.6662750 |       59.4821014 |        61.3528862 |        -2.3599451 |          1.5324502 |
| 1668 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7347152 |    0.0029136 |     0.3920860 |                   |        0.0005285 |         0.0007033 |        0.4604718 |         0.0683858 |         0.0084343 |          0.0066495 |
| 1669 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.5034865 |    0.8333333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9766667 |          0.5500000 |
| 1670 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0548580 |    1.0300000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9566667 |          0.1000000 |
| 1671 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7670916 |    0.0138573 |     0.7472601 |                   |        0.0038349 |         0.0032168 |        0.8314397 |         0.0841796 |         0.0334670 |          0.0196097 |
| 1672 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7871439 |  697.3033447 |  2006.0000000 |                   |      512.0000000 |         0.0000000 |     2031.0000000 |      2044.0000000 |      1285.8366699 |       1010.2866821 |
| 1673 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7871561 |  174.2466736 |   501.0000000 |                   |      128.0000000 |         0.0000000 |      507.0000000 |       511.0000000 |       321.2200012 |        252.5466766 |
| 1674 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1644198 |    7.7184267 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -2.7052436 |          0.0485694 |
| 1675 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.1689903 |    0.0322387 |     0.1506395 |                   |       -0.0024969 |        -0.1269483 |        0.0236912 |         0.0101548 |         0.0002993 |         -0.0315192 |
| 1676 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9998879 |    0.0084607 |     0.1394534 |                   |       -1.0034873 |        -1.0294310 |       -0.8899776 |        -0.9973001 |        -0.9996479 |         -1.0075480 |
| 1677 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.1535830 |    2.6633875 |     6.2830935 |                   |       -3.1415570 |        -3.1415877 |        3.1415756 |         3.1415925 |         0.3766811 |         -1.3308859 |
| 1678 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1268096 |   18.6543846 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -3.0642457 |          4.1176362 |
| 1679 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9560702 |    0.1801606 |     1.1138730 |                   |        0.4480730 |         0.3745154 |        2.6225846 |         2.5966232 |         1.0114361 |          0.9041463 |
| 1680 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8830637 |    0.7342767 |     9.2503681 |                   |        1.5652931 |         1.4542865 |       13.7712708 |        13.4183502 |         3.2674627 |          2.8456972 |
| 1681 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.1535830 |    2.6633875 |     6.2830935 |                   |       -3.1415570 |        -3.1415877 |        3.1415756 |         3.1415925 |         0.3766811 |         -1.3308859 |
| 1682 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2457193 |    9.4527864 |    55.4647331 |                   |      -43.1821747 |       -43.4529648 |       12.7721672 |        11.5965500 |        -7.5333009 |         -4.4973388 |
| 1683 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1745368 |    8.9187727 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -2.1613572 |          0.6067097 |
| 1684 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1745368 |    8.9187727 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -2.1613572 |          0.6067097 |
| 1685 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7670916 |    0.0138573 |     0.7472601 |                   |        0.0038349 |         0.0032168 |        0.8314397 |         0.0841796 |         0.0334670 |          0.0196097 |
| 1686 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0548580 |    1.0300000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9566666 |          0.1000000 |
| 1687 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1400280 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9533334 |          0.0933333 |
| 1688 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8046563 |    0.0138567 |     0.4653685 |                   |        0.0032096 |         0.0035264 |        0.5447856 |         0.0794171 |         0.0323536 |          0.0184993 |
| 1689 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7958323 |  669.5533447 |  1950.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1277.8066406 |        970.4533691 |
| 1690 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7959262 |  167.2700043 |   487.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       319.2133484 |        242.5900116 |
| 1691 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1959176 |    7.2240777 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.8474393 |         -0.0288818 |
| 1692 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0409827 |    0.0380040 |     0.1400797 |                   |       -0.0324268 |        -0.1329119 |        0.0738843 |         0.0098130 |         0.0002546 |         -0.0373697 |
| 1693 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9997812 |    0.0103644 |     0.2276714 |                   |       -1.0025852 |        -1.0286509 |       -0.8009794 |        -0.9985102 |        -0.9983708 |         -1.0082958 |
| 1694 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0006087 |    3.1389205 |     6.2831593 |                   |       -3.1415908 |        -3.1415570 |        3.1415787 |         3.1415911 |         0.3139017 |         -1.3671111 |
| 1695 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1687804 |   17.3970661 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -4.2222977 |          3.2136111 |
| 1696 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9597427 |    0.1705891 |     1.0546170 |                   |        0.4500201 |         0.3953664 |        2.5881257 |         2.5805008 |         1.0092543 |          0.9095006 |
| 1697 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8968781 |    0.6812067 |     8.6704025 |                   |        1.5683436 |         1.4849281 |       13.3048115 |        13.2037497 |         3.2565849 |          2.8726444 |
| 1698 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0006087 |    3.1389205 |     6.2831593 |                   |       -3.1415908 |        -3.1415570 |        3.1415787 |         3.1415911 |         0.3139017 |         -1.3671111 |
| 1699 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2492932 |    8.9045086 |    55.2734604 |                   |      -44.9658356 |       -42.8884430 |       14.6895380 |        11.6104450 |        -6.8948622 |         -3.8804569 |
| 1700 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.2060634 |    8.4087276 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.3267822 |          0.5250285 |
| 1701 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.2060634 |    8.4087276 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.3267822 |          0.5250285 |
| 1702 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8046563 |    0.0138567 |     0.4653685 |                   |        0.0032096 |         0.0035264 |        0.5447856 |         0.0794171 |         0.0323536 |          0.0184993 |
| 1703 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1400280 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9533333 |          0.0933333 |
| 1704 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0548580 |    1.0300000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9566667 |          0.1000000 |
| 1705 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7670916 |    0.0138573 |     0.7472601 |                   |        0.0038349 |         0.0032168 |        0.8314397 |         0.0841796 |         0.0334670 |          0.0196097 |
| 1706 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7871439 |  697.3033447 |  2006.0000000 |                   |      512.0000000 |         0.0000000 |     2031.0000000 |      2044.0000000 |      1285.8366699 |       1010.2866821 |
| 1707 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7871561 |  174.2466736 |   501.0000000 |                   |      128.0000000 |         0.0000000 |      507.0000000 |       511.0000000 |       321.2200012 |        252.5466766 |
| 1708 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1644198 |    7.7184267 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -2.7052436 |          0.0485694 |
| 1709 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.1689903 |    0.0322387 |     0.1506395 |                   |       -0.0024969 |        -0.1269483 |        0.0236912 |         0.0101548 |         0.0002993 |         -0.0315192 |
| 1710 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9998879 |    0.0084607 |     0.1394534 |                   |       -1.0034873 |        -1.0294310 |       -0.8899776 |        -0.9973001 |        -0.9996479 |         -1.0075480 |
| 1711 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.1535830 |    2.6633875 |     6.2830935 |                   |       -3.1415570 |        -3.1415877 |        3.1415756 |         3.1415925 |         0.3766811 |         -1.3308859 |
| 1712 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1268096 |   18.6543846 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -3.0642457 |          4.1176362 |
| 1713 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9560702 |    0.1801606 |     1.1138730 |                   |        0.4480730 |         0.3745154 |        2.6225846 |         2.5966232 |         1.0114361 |          0.9041463 |
| 1714 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8830637 |    0.7342767 |     9.2503681 |                   |        1.5652931 |         1.4542865 |       13.7712708 |        13.4183502 |         3.2674627 |          2.8456972 |
| 1715 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.1535830 |    2.6633875 |     6.2830935 |                   |       -3.1415570 |        -3.1415877 |        3.1415756 |         3.1415925 |         0.3766811 |         -1.3308859 |
| 1716 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2457193 |    9.4527864 |    55.4647331 |                   |      -43.1821747 |       -43.4529648 |       12.7721672 |        11.5965500 |        -7.5333009 |         -4.4973388 |
| 1717 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1745368 |    8.9187727 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -2.1613572 |          0.6067097 |
| 1718 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1745368 |    8.9187727 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -2.1613572 |          0.6067097 |
| 1719 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7670916 |    0.0138573 |     0.7472601 |                   |        0.0038349 |         0.0032168 |        0.8314397 |         0.0841796 |         0.0334670 |          0.0196097 |
| 1720 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0548580 |    1.0300000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9566666 |          0.1000000 |
| 1721 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6575191 |    0.7466667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         1.1033334 |          0.7766667 |
| 1722 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8000526 |    0.0086886 |     0.5172116 |                   |        0.0027326 |         0.0006495 |        0.5930038 |         0.0757922 |         0.0169544 |          0.0082659 |
| 1723 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8851986 |  536.6599731 |  1726.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2043.0000000 |      1287.5034180 |       1261.3366699 |
| 1724 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8850596 |  134.1600037 |   431.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       510.0000000 |       321.6000061 |        315.1400146 |
| 1725 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.2823218 |    7.6270151 |   121.5112457 |                   |      -60.2010574 |       -60.7261200 |       65.1050568 |        62.9007797 |        -1.9817518 |          0.0078061 |
| 1726 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.0330881 |    0.0014797 |     0.0255509 |                   |       -0.0068403 |        -0.0037464 |        0.0126832 |         0.0258955 |         0.0003315 |          0.0001071 |
| 1727 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9999977 |    0.0014900 |     0.0106897 |                   |       -1.0039302 |        -1.0055158 |       -0.9958704 |        -0.9902000 |        -1.0000366 |         -0.9992668 |
| 1728 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0400406 |    3.0155976 |     6.2831197 |                   |       -3.1415763 |        -3.1415820 |        3.1415880 |         3.1415920 |         0.9212026 |         -0.0419964 |
| 1729 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1321783 |   19.8885212 |   121.5112457 |                   |      -60.2010574 |       -60.7261200 |       65.1050568 |        62.9007797 |         1.4475125 |         10.2973108 |
| 1730 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9833623 |    0.0854303 |     0.9952264 |                   |        0.4686526 |         0.4564798 |        2.5396597 |         2.5477040 |         0.9243045 |          0.9546240 |
| 1731 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.9507249 |    0.3349642 |     8.0274048 |                   |        1.5978398 |         1.5785077 |       12.6753578 |        12.7777319 |         2.8473761 |          2.9864562 |
| 1732 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0400406 |    3.0155976 |     6.2831197 |                   |       -3.1415763 |        -3.1415820 |        3.1415880 |         3.1415920 |         0.9212026 |         -0.0419964 |
| 1733 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.5557066 |    7.9907813 |    54.9948273 |                   |      -44.2241478 |       -43.5119247 |       12.0543089 |        10.9011698 |        -9.3050041 |        -10.8902588 |
| 1734 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.2874721 |    8.7658396 |   121.5112457 |                   |      -60.2010574 |       -60.7261200 |       65.1050568 |        62.9007797 |        -1.4109145 |          0.7138527 |
| 1735 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.2874721 |    8.7658396 |   121.5112457 |                   |      -60.2010574 |       -60.7261200 |       65.1050568 |        62.9007797 |        -1.4109147 |          0.7138526 |
| 1736 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8000526 |    0.0086886 |     0.5172116 |                   |        0.0027326 |         0.0006495 |        0.5930038 |         0.0757922 |         0.0169544 |          0.0082659 |
| 1737 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.6575191 |    0.7466667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         1.1033334 |          0.7766666 |
| 1738 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1851571 |    1.1633333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         1.1466666 |          0.1500000 |
| 1739 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7507168 |    0.0109192 |     0.6312854 |                   |        0.0036544 |         0.0031332 |        0.7017068 |         0.0704214 |         0.0288880 |          0.0179687 |
| 1740 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7990786 |  659.5233154 |  1998.0000000 |                   |      512.0000000 |         0.0000000 |     2031.0000000 |      2040.0000000 |      1266.8800049 |       1010.1900024 |
| 1741 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7990773 |  164.7966614 |   499.0000000 |                   |      128.0000000 |         0.0000000 |      507.0000000 |       510.0000000 |       316.4333496 |        252.5100098 |
| 1742 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1497899 |    7.4072933 |   117.6053925 |                   |      -59.5478973 |       -59.3639908 |       45.1807823 |        58.9901695 |        -3.1615613 |         -0.0128361 |
| 1743 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.0087737 |    0.0321222 |     0.1675926 |                   |       -0.0340449 |        -0.1402231 |        0.0273695 |         0.0100579 |         0.0001850 |         -0.0315479 |
| 1744 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9997925 |    0.0069149 |     0.3156190 |                   |       -1.0025165 |        -1.0223303 |       -0.7067113 |        -0.9979798 |        -0.9987783 |         -1.0052916 |
| 1745 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0127833 |    3.1011651 |     6.2830744 |                   |       -3.1415701 |        -3.1415911 |        3.1415837 |         3.1415923 |         0.6071541 |         -1.6657861 |
| 1746 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1393610 |   17.8360023 |   117.6053925 |                   |      -59.5478973 |       -59.3639908 |       45.1807823 |        58.9901695 |        -5.9546819 |          3.1030779 |
| 1747 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9535685 |    0.1865412 |     1.0897222 |                   |        0.4180713 |         0.3781894 |        2.6034861 |         2.5753868 |         1.0085294 |          0.9166041 |
| 1748 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8793243 |    0.7625325 |     8.9669542 |                   |        1.5190289 |         1.4596394 |       13.5107555 |        13.1363974 |         3.2527215 |          2.8894293 |
| 1749 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0127833 |    3.1011651 |     6.2830744 |                   |       -3.1415701 |        -3.1415911 |        3.1415837 |         3.1415923 |         0.6071541 |         -1.6657861 |
| 1750 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1692019 |    9.1245213 |    54.7912292 |                   |      -42.0631218 |       -43.0789413 |       13.7897215 |        10.9900217 |        -6.3133745 |         -3.7211342 |
| 1751 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1606185 |    8.6270342 |   117.6053925 |                   |      -59.5478973 |       -59.3639908 |       45.1807823 |        58.9901695 |        -2.6438851 |          0.5148333 |
| 1752 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1606185 |    8.6270342 |   117.6053925 |                   |      -59.5478973 |       -59.3639908 |       45.1807823 |        58.9901695 |        -2.6438849 |          0.5148333 |
| 1753 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7507168 |    0.0109192 |     0.6312854 |                   |        0.0036544 |         0.0031332 |        0.7017068 |         0.0704214 |         0.0288880 |          0.0179687 |
| 1754 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1851571 |    1.1633333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         1.1466666 |          0.1500000 |
| 1755 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1989508 |    0.9133334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9100000 |          0.1300000 |
| 1756 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8166119 |    0.0182597 |     0.6785259 |                   |        0.0037337 |         0.0030944 |        0.7783458 |         0.1455895 |         0.0369369 |          0.0186772 |
| 1757 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8033233 |  658.2466431 |  1946.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1281.8967285 |        998.2766724 |
| 1758 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8033670 |  164.4766693 |   486.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       320.2466736 |        249.5366669 |
| 1759 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1233341 |    8.1458378 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.6429973 |          0.0932821 |
| 1760 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.0843650 |    0.0349252 |     0.1600514 |                   |       -0.0090122 |        -0.1396332 |        0.0204182 |         0.0102752 |         0.0003658 |         -0.0341987 |
| 1761 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9999402 |    0.0068505 |     0.0639479 |                   |       -1.0029564 |        -1.0242544 |       -0.9603065 |        -0.9958194 |        -0.9998611 |         -1.0061709 |
| 1762 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           | -0.0141387 |    3.1851006 |     6.2830973 |                   |       -3.1415846 |        -3.1415923 |        3.1415882 |         3.1415823 |         0.5441739 |         -1.3491100 |
| 1763 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0709090 |   19.5555096 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.1030698 |          4.7088423 |
| 1764 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9577749 |    0.1719118 |     1.0565728 |                   |        0.4474396 |         0.4148016 |        2.5905423 |         2.6011131 |         1.0030488 |          0.9261072 |
| 1765 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8896973 |    0.7002750 |     8.7612753 |                   |        1.5643018 |         1.5140703 |       13.3370028 |        13.4787331 |         3.2388759 |          2.9300835 |
| 1766 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           | -0.0141387 |    3.1851006 |     6.2830973 |                   |       -3.1415846 |        -3.1415923 |        3.1415882 |         3.1415823 |         0.5441739 |         -1.3491100 |
| 1767 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2320547 |   10.1267281 |    55.4087448 |                   |      -44.4999542 |       -43.2559662 |       11.1263418 |        11.6330271 |        -8.2578030 |         -4.9461255 |
| 1768 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1335734 |    9.4332638 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.0821817 |          0.6729291 |
| 1769 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1335734 |    9.4332638 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.0821815 |          0.6729292 |
| 1770 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8166119 |    0.0182597 |     0.6785259 |                   |        0.0037337 |         0.0030944 |        0.7783458 |         0.1455895 |         0.0369369 |          0.0186772 |
| 1771 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1989508 |    0.9133334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9100000 |          0.1300000 |
| 1772 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.5034865 |    0.8333333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9766667 |          0.5500000 |
| 1773 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7347152 |    0.0029136 |     0.3920860 |                   |        0.0005285 |         0.0007033 |        0.4604718 |         0.0683858 |         0.0084343 |          0.0066495 |
| 1774 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8895273 |  514.8666382 |  2008.0000000 |                   |        0.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1286.0833740 |       1260.8433838 |
| 1775 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8893996 |  128.7233276 |   502.0000000 |                   |        0.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       321.2766724 |        315.0733337 |
| 1776 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1435999 |    8.3228502 |   120.3164978 |                   |      -60.6556091 |       -60.6662750 |       59.4821014 |        61.3528862 |        -2.7284048 |          0.8318331 |
| 1777 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0765441 |    0.0016580 |     0.0694729 |                   |       -0.0101520 |        -0.0027593 |        0.0692994 |         0.0300672 |         0.0003532 |          0.0001967 |
| 1778 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9999697 |    0.0024138 |     0.1062355 |                   |       -1.1064489 |        -1.0019816 |       -0.9512901 |        -0.9916406 |        -0.9997451 |         -0.9993611 |
| 1779 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0800000 |    2.8900244 |     6.2830887 |                   |       -3.1415780 |        -3.1415846 |        3.1415656 |         3.1415880 |        -0.2935408 |         -0.7122919 |
| 1780 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           | -0.0547061 |   21.6534081 |   120.3164978 |                   |      -60.6556091 |       -60.6662750 |       59.4821014 |        61.3528862 |        -0.6368890 |         11.8634748 |
| 1781 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9918309 |    0.0590938 |     1.0092795 |                   |        0.4587190 |         0.4523272 |        1.9287689 |         2.5326161 |         0.9201108 |          0.9312112 |
| 1782 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.9781594 |    0.2044005 |     7.9988847 |                   |        1.5820462 |         1.5719663 |        6.8810339 |        12.5863914 |         2.8224940 |          2.8937001 |
| 1783 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0800000 |    2.8900244 |     6.2830887 |                   |       -3.1415780 |        -3.1415846 |        3.1415656 |         3.1415880 |        -0.2935408 |         -0.7122919 |
| 1784 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.4801274 |    8.8032627 |    56.9444809 |                   |      -47.7367287 |       -43.4737625 |       14.4296532 |        10.9435291 |        -9.9542427 |         -9.4115772 |
| 1785 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1507088 |    9.4873219 |   120.3164978 |                   |      -60.6556091 |       -60.6662750 |       59.4821014 |        61.3528862 |        -2.3599451 |          1.5324503 |
| 1786 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1507088 |    9.4873219 |   120.3164978 |                   |      -60.6556091 |       -60.6662750 |       59.4821014 |        61.3528862 |        -2.3599451 |          1.5324502 |
| 1787 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7347152 |    0.0029136 |     0.3920860 |                   |        0.0005285 |         0.0007033 |        0.4604718 |         0.0683858 |         0.0084343 |          0.0066495 |
| 1788 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.5034865 |    0.8333333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9766667 |          0.5500000 |
| 1789 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.4694141 |    0.9800000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9300000 |          0.7833334 |
| 1790 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7807899 |    0.0090096 |     0.7141762 |                   |        0.0007106 |         0.0006156 |        0.7934955 |         0.0793194 |         0.0159889 |          0.0069792 |
| 1791 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8792235 |  551.7733154 |  1612.0000000 |                   |        8.0000000 |         0.0000000 |     2040.0000000 |      2040.0000000 |      1293.8100586 |       1250.5300293 |
| 1792 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8790981 |  137.9233398 |   403.0000000 |                   |        2.0000000 |         0.0000000 |      510.0000000 |       510.0000000 |       323.2200012 |        312.4366760 |
| 1793 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.3648452 |    7.3446364 |   118.2884521 |                   |      -60.2596703 |       -59.9090347 |       63.8929520 |        63.5411987 |        -1.8885266 |         -0.4702059 |
| 1794 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.1651769 |    0.0015385 |     0.0716252 |                   |       -0.0044535 |        -0.0028612 |        0.0706733 |         0.0356542 |         0.0005153 |          0.0002747 |
| 1795 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9999740 |    0.0018012 |     0.1168773 |                   |       -1.1136398 |        -1.0097910 |       -0.9714010 |        -0.9924321 |        -1.0000886 |         -0.9992579 |
| 1796 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.1399278 |    2.7017624 |     6.2830949 |                   |       -3.1415896 |        -3.1415839 |        3.1415834 |         3.1415915 |         0.2927252 |         -0.3144335 |
| 1797 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2144055 |   18.8368855 |   118.2884521 |                   |      -60.2596703 |       -59.9090347 |       63.8929520 |        63.5411987 |         4.2702780 |          8.8125639 |
| 1798 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9967744 |    0.0431365 |     0.6100292 |                   |        0.4313621 |         0.4574384 |        1.6676811 |         2.1635664 |         0.9002389 |          0.9273011 |
| 1799 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.9943271 |    0.1313259 |     3.9739518 |                   |        1.5393528 |         1.5800214 |        5.2998633 |         8.7021170 |         2.7691855 |          2.8557022 |
| 1800 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.1399278 |    2.7017624 |     6.2830949 |                   |       -3.1415896 |        -3.1415839 |        3.1415834 |         3.1415915 |         0.2927252 |         -0.3144335 |
| 1801 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.6018493 |    8.0491962 |    55.4428635 |                   |      -43.0364838 |       -43.6692429 |       12.3255253 |        10.7960463 |       -11.7619228 |        -11.1309595 |
| 1802 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.3690550 |    8.3753986 |   118.2884521 |                   |      -60.2596703 |       -59.9090347 |       63.8929520 |        63.5411987 |        -1.3874654 |          0.1297488 |
| 1803 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.3690550 |    8.3753986 |   118.2884521 |                   |      -60.2596703 |       -59.9090347 |       63.8929520 |        63.5411987 |        -1.3874651 |          0.1297488 |
| 1804 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7807899 |    0.0090096 |     0.7141762 |                   |        0.0007106 |         0.0006156 |        0.7934955 |         0.0793194 |         0.0159889 |          0.0069792 |
| 1805 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.4694141 |    0.9800000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9300000 |          0.7833334 |
| 1806 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1989508 |    0.9133334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9100000 |          0.1300000 |
| 1807 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8166119 |    0.0182597 |     0.6785259 |                   |        0.0037337 |         0.0030944 |        0.7783458 |         0.1455895 |         0.0369369 |          0.0186772 |
| 1808 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8033233 |  658.2466431 |  1946.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1281.8967285 |        998.2766724 |
| 1809 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8033670 |  164.4766693 |   486.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       320.2466736 |        249.5366669 |
| 1810 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1233341 |    8.1458378 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.6429973 |          0.0932821 |
| 1811 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.0843650 |    0.0349252 |     0.1600514 |                   |       -0.0090122 |        -0.1396332 |        0.0204182 |         0.0102752 |         0.0003658 |         -0.0341987 |
| 1812 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9999402 |    0.0068505 |     0.0639479 |                   |       -1.0029564 |        -1.0242544 |       -0.9603065 |        -0.9958194 |        -0.9998611 |         -1.0061709 |
| 1813 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           | -0.0141387 |    3.1851006 |     6.2830973 |                   |       -3.1415846 |        -3.1415923 |        3.1415882 |         3.1415823 |         0.5441739 |         -1.3491100 |
| 1814 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.0709090 |   19.5555096 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.1030698 |          4.7088423 |
| 1815 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9577749 |    0.1719118 |     1.0565728 |                   |        0.4474396 |         0.4148016 |        2.5905423 |         2.6011131 |         1.0030488 |          0.9261072 |
| 1816 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8896973 |    0.7002750 |     8.7612753 |                   |        1.5643018 |         1.5140703 |       13.3370028 |        13.4787331 |         3.2388759 |          2.9300835 |
| 1817 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           | -0.0141387 |    3.1851006 |     6.2830973 |                   |       -3.1415846 |        -3.1415923 |        3.1415882 |         3.1415823 |         0.5441739 |         -1.3491100 |
| 1818 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2320547 |   10.1267281 |    55.4087448 |                   |      -44.4999542 |       -43.2559662 |       11.1263418 |        11.6330271 |        -8.2578030 |         -4.9461255 |
| 1819 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1335734 |    9.4332638 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.0821817 |          0.6729291 |
| 1820 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1335734 |    9.4332638 |   121.2850189 |                   |      -59.5835381 |       -59.3683090 |       64.4056549 |        62.2177811 |        -2.0821815 |          0.6729292 |
| 1821 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8166119 |    0.0182597 |     0.6785259 |                   |        0.0037337 |         0.0030944 |        0.7783458 |         0.1455895 |         0.0369369 |          0.0186772 |
| 1822 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1989508 |    0.9133334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9100000 |          0.1300000 |
| 1823 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.6575191 |    0.7466667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         1.1033334 |          0.7766667 |
| 1824 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8000526 |    0.0086886 |     0.5172116 |                   |        0.0027326 |         0.0006495 |        0.5930038 |         0.0757922 |         0.0169544 |          0.0082659 |
| 1825 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.8851986 |  536.6599731 |  1726.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2043.0000000 |      1287.5034180 |       1261.3366699 |
| 1826 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.8850596 |  134.1600037 |   431.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       510.0000000 |       321.6000061 |        315.1400146 |
| 1827 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.2823218 |    7.6270151 |   121.5112457 |                   |      -60.2010574 |       -60.7261200 |       65.1050568 |        62.9007797 |        -1.9817518 |          0.0078061 |
| 1828 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.0330881 |    0.0014797 |     0.0255509 |                   |       -0.0068403 |        -0.0037464 |        0.0126832 |         0.0258955 |         0.0003315 |          0.0001071 |
| 1829 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9999977 |    0.0014900 |     0.0106897 |                   |       -1.0039302 |        -1.0055158 |       -0.9958704 |        -0.9902000 |        -1.0000366 |         -0.9992668 |
| 1830 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0400406 |    3.0155976 |     6.2831197 |                   |       -3.1415763 |        -3.1415820 |        3.1415880 |         3.1415920 |         0.9212026 |         -0.0419964 |
| 1831 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1321783 |   19.8885212 |   121.5112457 |                   |      -60.2010574 |       -60.7261200 |       65.1050568 |        62.9007797 |         1.4475125 |         10.2973108 |
| 1832 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9833623 |    0.0854303 |     0.9952264 |                   |        0.4686526 |         0.4564798 |        2.5396597 |         2.5477040 |         0.9243045 |          0.9546240 |
| 1833 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.9507249 |    0.3349642 |     8.0274048 |                   |        1.5978398 |         1.5785077 |       12.6753578 |        12.7777319 |         2.8473761 |          2.9864562 |
| 1834 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0400406 |    3.0155976 |     6.2831197 |                   |       -3.1415763 |        -3.1415820 |        3.1415880 |         3.1415920 |         0.9212026 |         -0.0419964 |
| 1835 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.5557066 |    7.9907813 |    54.9948273 |                   |      -44.2241478 |       -43.5119247 |       12.0543089 |        10.9011698 |        -9.3050041 |        -10.8902588 |
| 1836 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.2874721 |    8.7658396 |   121.5112457 |                   |      -60.2010574 |       -60.7261200 |       65.1050568 |        62.9007797 |        -1.4109145 |          0.7138527 |
| 1837 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.2874721 |    8.7658396 |   121.5112457 |                   |      -60.2010574 |       -60.7261200 |       65.1050568 |        62.9007797 |        -1.4109147 |          0.7138526 |
| 1838 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8000526 |    0.0086886 |     0.5172116 |                   |        0.0027326 |         0.0006495 |        0.5930038 |         0.0757922 |         0.0169544 |          0.0082659 |
| 1839 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.6575191 |    0.7466667 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         1.1033334 |          0.7766666 |
| 1840 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1400280 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9533334 |          0.0933333 |
| 1841 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8046563 |    0.0138567 |     0.4653685 |                   |        0.0032096 |         0.0035264 |        0.5447856 |         0.0794171 |         0.0323536 |          0.0184993 |
| 1842 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7958323 |  669.5533447 |  1950.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1277.8066406 |        970.4533691 |
| 1843 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7959262 |  167.2700043 |   487.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       319.2133484 |        242.5900116 |
| 1844 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1959176 |    7.2240777 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.8474393 |         -0.0288818 |
| 1845 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0409827 |    0.0380040 |     0.1400797 |                   |       -0.0324268 |        -0.1329119 |        0.0738843 |         0.0098130 |         0.0002546 |         -0.0373697 |
| 1846 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9997812 |    0.0103644 |     0.2276714 |                   |       -1.0025852 |        -1.0286509 |       -0.8009794 |        -0.9985102 |        -0.9983708 |         -1.0082958 |
| 1847 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0006087 |    3.1389205 |     6.2831593 |                   |       -3.1415908 |        -3.1415570 |        3.1415787 |         3.1415911 |         0.3139017 |         -1.3671111 |
| 1848 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1687804 |   17.3970661 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -4.2222977 |          3.2136111 |
| 1849 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9597427 |    0.1705891 |     1.0546170 |                   |        0.4500201 |         0.3953664 |        2.5881257 |         2.5805008 |         1.0092543 |          0.9095006 |
| 1850 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8968781 |    0.6812067 |     8.6704025 |                   |        1.5683436 |         1.4849281 |       13.3048115 |        13.2037497 |         3.2565849 |          2.8726444 |
| 1851 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0006087 |    3.1389205 |     6.2831593 |                   |       -3.1415908 |        -3.1415570 |        3.1415787 |         3.1415911 |         0.3139017 |         -1.3671111 |
| 1852 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2492932 |    8.9045086 |    55.2734604 |                   |      -44.9658356 |       -42.8884430 |       14.6895380 |        11.6104450 |        -6.8948622 |         -3.8804569 |
| 1853 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.2060634 |    8.4087276 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.3267822 |          0.5250285 |
| 1854 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.2060634 |    8.4087276 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.3267822 |          0.5250285 |
| 1855 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8046563 |    0.0138567 |     0.4653685 |                   |        0.0032096 |         0.0035264 |        0.5447856 |         0.0794171 |         0.0323536 |          0.0184993 |
| 1856 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1400280 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9533333 |          0.0933333 |
| 1857 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0763763 |    0.9533333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9133334 |          0.0800000 |
| 1858 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8422263 |    0.0131390 |     0.5189071 |                   |        0.0029829 |         0.0036528 |        0.6769617 |         0.1580546 |         0.0320869 |          0.0209099 |
| 1859 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7795226 |  714.2466431 |  1918.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2040.0000000 |      1289.0333252 |        897.4266968 |
| 1860 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7796648 |  178.3999939 |   479.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       510.0000000 |       322.0299988 |        224.3366699 |
| 1861 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1725105 |    7.4012976 |   120.5957718 |                   |      -59.5478668 |       -59.3779068 |       64.9226151 |        62.4818993 |        -2.4317753 |          0.1917884 |
| 1862 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.1028324 |    0.0532062 |     0.2314897 |                   |       -0.0264124 |        -0.1440454 |        0.0874443 |         0.0112976 |         0.0004727 |         -0.0524095 |
| 1863 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9998807 |    0.0108123 |     0.1458083 |                   |       -1.0056723 |        -1.0265141 |       -0.8807058 |        -0.9984951 |        -0.9995952 |         -1.0098578 |
| 1864 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0774736 |    2.9014783 |     6.2830715 |                   |       -3.1415873 |        -3.1415880 |        3.1415830 |         3.1415739 |         0.8791361 |         -1.2687421 |
| 1865 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1114318 |   18.4970493 |   120.5957718 |                   |      -59.5478668 |       -59.3779068 |       64.9226151 |        62.4818993 |        -2.1675534 |          3.3689787 |
| 1866 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9600218 |    0.1682247 |     1.0691923 |                   |        0.4614190 |         0.4065219 |        2.5963767 |         2.5955029 |         1.0017194 |          0.8990809 |
| 1867 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8960829 |    0.6624244 |     8.8021564 |                   |        1.5863234 |         1.5015861 |       13.4150429 |        13.4033260 |         3.2233403 |          2.8412199 |
| 1868 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0774736 |    2.9014783 |     6.2830715 |                   |       -3.1415873 |        -3.1415880 |        3.1415830 |         3.1415739 |         0.8791361 |         -1.2687421 |
| 1869 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.3122393 |    8.4514790 |    55.3049431 |                   |      -43.9567947 |       -42.8560905 |       13.2100487 |        11.1066866 |        -7.4176359 |         -3.2107463 |
| 1870 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1827942 |    8.5734329 |   120.5957718 |                   |      -59.5478668 |       -59.3779068 |       64.9226151 |        62.4818993 |        -1.8206410 |          0.7729615 |
| 1871 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1827942 |    8.5734329 |   120.5957718 |                   |      -59.5478668 |       -59.3779068 |       64.9226151 |        62.4818993 |        -1.8206412 |          0.7729614 |
| 1872 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8422263 |    0.0131390 |     0.5189071 |                   |        0.0029829 |         0.0036528 |        0.6769617 |         0.1580546 |         0.0320869 |          0.0209099 |
| 1873 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0763763 |    0.9533333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9133334 |          0.0800000 |
| 1874 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1851571 |    1.1633333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         1.1466666 |          0.1500000 |
| 1875 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7507168 |    0.0109192 |     0.6312854 |                   |        0.0036544 |         0.0031332 |        0.7017068 |         0.0704214 |         0.0288880 |          0.0179687 |
| 1876 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7990786 |  659.5233154 |  1998.0000000 |                   |      512.0000000 |         0.0000000 |     2031.0000000 |      2040.0000000 |      1266.8800049 |       1010.1900024 |
| 1877 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7990773 |  164.7966614 |   499.0000000 |                   |      128.0000000 |         0.0000000 |      507.0000000 |       510.0000000 |       316.4333496 |        252.5100098 |
| 1878 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1497899 |    7.4072933 |   117.6053925 |                   |      -59.5478973 |       -59.3639908 |       45.1807823 |        58.9901695 |        -3.1615613 |         -0.0128361 |
| 1879 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.0087737 |    0.0321222 |     0.1675926 |                   |       -0.0340449 |        -0.1402231 |        0.0273695 |         0.0100579 |         0.0001850 |         -0.0315479 |
| 1880 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9997925 |    0.0069149 |     0.3156190 |                   |       -1.0025165 |        -1.0223303 |       -0.7067113 |        -0.9979798 |        -0.9987783 |         -1.0052916 |
| 1881 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0127833 |    3.1011651 |     6.2830744 |                   |       -3.1415701 |        -3.1415911 |        3.1415837 |         3.1415923 |         0.6071541 |         -1.6657861 |
| 1882 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1393610 |   17.8360023 |   117.6053925 |                   |      -59.5478973 |       -59.3639908 |       45.1807823 |        58.9901695 |        -5.9546819 |          3.1030779 |
| 1883 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9535685 |    0.1865412 |     1.0897222 |                   |        0.4180713 |         0.3781894 |        2.6034861 |         2.5753868 |         1.0085294 |          0.9166041 |
| 1884 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8793243 |    0.7625325 |     8.9669542 |                   |        1.5190289 |         1.4596394 |       13.5107555 |        13.1363974 |         3.2527215 |          2.8894293 |
| 1885 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0127833 |    3.1011651 |     6.2830744 |                   |       -3.1415701 |        -3.1415911 |        3.1415837 |         3.1415923 |         0.6071541 |         -1.6657861 |
| 1886 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1692019 |    9.1245213 |    54.7912292 |                   |      -42.0631218 |       -43.0789413 |       13.7897215 |        10.9900217 |        -6.3133745 |         -3.7211342 |
| 1887 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1606185 |    8.6270342 |   117.6053925 |                   |      -59.5478973 |       -59.3639908 |       45.1807823 |        58.9901695 |        -2.6438851 |          0.5148333 |
| 1888 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1606185 |    8.6270342 |   117.6053925 |                   |      -59.5478973 |       -59.3639908 |       45.1807823 |        58.9901695 |        -2.6438849 |          0.5148333 |
| 1889 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7507168 |    0.0109192 |     0.6312854 |                   |        0.0036544 |         0.0031332 |        0.7017068 |         0.0704214 |         0.0288880 |          0.0179687 |
| 1890 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1851571 |    1.1633333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         1.1466666 |          0.1500000 |
| 1891 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1851571 |    1.1633333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         1.1466666 |          0.1500000 |
| 1892 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7507168 |    0.0109192 |     0.6312854 |                   |        0.0036544 |         0.0031332 |        0.7017068 |         0.0704214 |         0.0288880 |          0.0179687 |
| 1893 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7990786 |  659.5233154 |  1998.0000000 |                   |      512.0000000 |         0.0000000 |     2031.0000000 |      2040.0000000 |      1266.8800049 |       1010.1900024 |
| 1894 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7990773 |  164.7966614 |   499.0000000 |                   |      128.0000000 |         0.0000000 |      507.0000000 |       510.0000000 |       316.4333496 |        252.5100098 |
| 1895 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1497899 |    7.4072933 |   117.6053925 |                   |      -59.5478973 |       -59.3639908 |       45.1807823 |        58.9901695 |        -3.1615613 |         -0.0128361 |
| 1896 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.0087737 |    0.0321222 |     0.1675926 |                   |       -0.0340449 |        -0.1402231 |        0.0273695 |         0.0100579 |         0.0001850 |         -0.0315479 |
| 1897 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9997925 |    0.0069149 |     0.3156190 |                   |       -1.0025165 |        -1.0223303 |       -0.7067113 |        -0.9979798 |        -0.9987783 |         -1.0052916 |
| 1898 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0127833 |    3.1011651 |     6.2830744 |                   |       -3.1415701 |        -3.1415911 |        3.1415837 |         3.1415923 |         0.6071541 |         -1.6657861 |
| 1899 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1393610 |   17.8360023 |   117.6053925 |                   |      -59.5478973 |       -59.3639908 |       45.1807823 |        58.9901695 |        -5.9546819 |          3.1030779 |
| 1900 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9535685 |    0.1865412 |     1.0897222 |                   |        0.4180713 |         0.3781894 |        2.6034861 |         2.5753868 |         1.0085294 |          0.9166041 |
| 1901 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8793243 |    0.7625325 |     8.9669542 |                   |        1.5190289 |         1.4596394 |       13.5107555 |        13.1363974 |         3.2527215 |          2.8894293 |
| 1902 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0127833 |    3.1011651 |     6.2830744 |                   |       -3.1415701 |        -3.1415911 |        3.1415837 |         3.1415923 |         0.6071541 |         -1.6657861 |
| 1903 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1692019 |    9.1245213 |    54.7912292 |                   |      -42.0631218 |       -43.0789413 |       13.7897215 |        10.9900217 |        -6.3133745 |         -3.7211342 |
| 1904 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1606185 |    8.6270342 |   117.6053925 |                   |      -59.5478973 |       -59.3639908 |       45.1807823 |        58.9901695 |        -2.6438851 |          0.5148333 |
| 1905 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1606185 |    8.6270342 |   117.6053925 |                   |      -59.5478973 |       -59.3639908 |       45.1807823 |        58.9901695 |        -2.6438849 |          0.5148333 |
| 1906 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7507168 |    0.0109192 |     0.6312854 |                   |        0.0036544 |         0.0031332 |        0.7017068 |         0.0704214 |         0.0288880 |          0.0179687 |
| 1907 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1851571 |    1.1633333 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         1.1466666 |          0.1500000 |
| 1908 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.1400280 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9533334 |          0.0933333 |
| 1909 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.8046563 |    0.0138567 |     0.4653685 |                   |        0.0032096 |         0.0035264 |        0.5447856 |         0.0794171 |         0.0323536 |          0.0184993 |
| 1910 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7958323 |  669.5533447 |  1950.0000000 |                   |      512.0000000 |         0.0000000 |     2040.0000000 |      2044.0000000 |      1277.8066406 |        970.4533691 |
| 1911 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7959262 |  167.2700043 |   487.0000000 |                   |      128.0000000 |         0.0000000 |      510.0000000 |       511.0000000 |       319.2133484 |        242.5900116 |
| 1912 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1959176 |    7.2240777 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.8474393 |         -0.0288818 |
| 1913 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.0409827 |    0.0380040 |     0.1400797 |                   |       -0.0324268 |        -0.1329119 |        0.0738843 |         0.0098130 |         0.0002546 |         -0.0373697 |
| 1914 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9997812 |    0.0103644 |     0.2276714 |                   |       -1.0025852 |        -1.0286509 |       -0.8009794 |        -0.9985102 |        -0.9983708 |         -1.0082958 |
| 1915 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.0006087 |    3.1389205 |     6.2831593 |                   |       -3.1415908 |        -3.1415570 |        3.1415787 |         3.1415911 |         0.3139017 |         -1.3671111 |
| 1916 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1687804 |   17.3970661 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -4.2222977 |          3.2136111 |
| 1917 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9597427 |    0.1705891 |     1.0546170 |                   |        0.4500201 |         0.3953664 |        2.5881257 |         2.5805008 |         1.0092543 |          0.9095006 |
| 1918 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8968781 |    0.6812067 |     8.6704025 |                   |        1.5683436 |         1.4849281 |       13.3048115 |        13.2037497 |         3.2565849 |          2.8726444 |
| 1919 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.0006087 |    3.1389205 |     6.2831593 |                   |       -3.1415908 |        -3.1415570 |        3.1415787 |         3.1415911 |         0.3139017 |         -1.3671111 |
| 1920 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2492932 |    8.9045086 |    55.2734604 |                   |      -44.9658356 |       -42.8884430 |       14.6895380 |        11.6104450 |        -6.8948622 |         -3.8804569 |
| 1921 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.2060634 |    8.4087276 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.3267822 |          0.5250285 |
| 1922 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.2060634 |    8.4087276 |   118.2212982 |                   |      -59.5350914 |       -59.3568420 |       52.6359482 |        60.5494690 |        -2.3267822 |          0.5250285 |
| 1923 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.8046563 |    0.0138567 |     0.4653685 |                   |        0.0032096 |         0.0035264 |        0.5447856 |         0.0794171 |         0.0323536 |          0.0184993 |
| 1924 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.1400280 |    0.9733334 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         2.0000000 |         0.9533333 |          0.0933333 |
| 1925 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.0548580 |    1.0300000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9566667 |          0.1000000 |
| 1926 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.7670916 |    0.0138573 |     0.7472601 |                   |        0.0038349 |         0.0032168 |        0.8314397 |         0.0841796 |         0.0334670 |          0.0196097 |
| 1927 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.int64   |           |  0.7871439 |  697.3033447 |  2006.0000000 |                   |      512.0000000 |         0.0000000 |     2031.0000000 |      2044.0000000 |      1285.8366699 |       1010.2866821 |
| 1928 |                                                | torch.Tensor.__floordiv__                                                     | torch.Tensor.__floordiv__                                               | torch.Size([300])                 | torch.int64   |           |  0.7871561 |  174.2466736 |   501.0000000 |                   |      128.0000000 |         0.0000000 |      507.0000000 |       511.0000000 |       321.2200012 |        252.5466766 |
| 1929 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 11])             | torch.float32 |           |  0.1644198 |    7.7184267 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -2.7052436 |          0.0485694 |
| 1930 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           | -0.1689903 |    0.0322387 |     0.1506395 |                   |       -0.0024969 |        -0.1269483 |        0.0236912 |         0.0101548 |         0.0002993 |         -0.0315192 |
| 1931 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300])                 | torch.float32 |           |  0.9998879 |    0.0084607 |     0.1394534 |                   |       -1.0034873 |        -1.0294310 |       -0.8899776 |        -0.9973001 |        -0.9996479 |         -1.0075480 |
| 1932 |                                                | torch.atan2                                                                   | torch.atan2                                                             | torch.Size([300])                 | torch.float32 |           |  0.1535830 |    2.6633875 |     6.2830935 |                   |       -3.1415570 |        -3.1415877 |        3.1415756 |         3.1415925 |         0.3766811 |         -1.3308859 |
| 1933 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.1268096 |   18.6543846 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -3.0642457 |          4.1176362 |
| 1934 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.9560702 |    0.1801606 |     1.1138730 |                   |        0.4480730 |         0.3745154 |        2.6225846 |         2.5966232 |         1.0114361 |          0.9041463 |
| 1935 |                                                | torch.Tensor.exp                                                              | torch.Tensor.exp                                                        | torch.Size([300, 3])              | torch.float32 |           |  0.8830637 |    0.7342767 |     9.2503681 |                   |        1.5652931 |         1.4542865 |       13.7712708 |        13.4183502 |         3.2674627 |          2.8456972 |
| 1936 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 1])              | torch.float32 |           |  0.1535830 |    2.6633875 |     6.2830935 |                   |       -3.1415570 |        -3.1415877 |        3.1415756 |         3.1415925 |         0.3766811 |         -1.3308859 |
| 1937 |                                                | torch.Tensor.__getitem__                                                      | torch.Tensor.__getitem__                                                | torch.Size([300, 3])              | torch.float32 |           |  0.2457193 |    9.4527864 |    55.4647331 |                   |      -43.1821747 |       -43.4529648 |       12.7721672 |        11.5965500 |        -7.5333009 |         -4.4973388 |
| 1938 |                                                | torch.cat                                                                     | torch.cat                                                               | torch.Size([300, 10])             | torch.float32 |           |  0.1745368 |    8.9187727 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -2.1613572 |          0.6067097 |
| 1939 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300, 10])             | torch.float32 |           |  0.1745368 |    8.9187727 |   122.5400314 |                   |      -59.6342049 |       -59.3784676 |       64.0388107 |        63.3794975 |        -2.1613572 |          0.6067097 |
| 1940 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.float32 |           |  0.7670916 |    0.0138573 |     0.7472601 |                   |        0.0038349 |         0.0032168 |        0.8314397 |         0.0841796 |         0.0334670 |          0.0196097 |
| 1941 |                                                | torch.Tensor.cpu                                                              | torch.Tensor.cpu                                                        | torch.Size([300])                 | torch.int64   |           |  0.0548580 |    1.0300000 |     3.0000000 |                   |        0.0000000 |         0.0000000 |        3.0000000 |         3.0000000 |         0.9566666 |          0.1000000 |
+------+------------------------------------------------+-------------------------------------------------------------------------------+-------------------------------------------------------------------------+-----------------------------------+---------------+-----------+------------+--------------+---------------+-------------------+------------------+-------------------+------------------+-------------------+-------------------+--------------------+
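The per-op similarity columns above (Cosine, L1, Atol) can be reproduced with a few lines of NumPy. This is a minimal sketch based on the usual definitions these column names suggest — cosine similarity of the flattened tensors, mean absolute difference, and maximum absolute difference — not the tool's exact implementation, which may differ in details such as zero-norm handling.

```python
import numpy as np

def compare_outputs(base: np.ndarray, analy: np.ndarray) -> dict:
    """Compare a baseline op output with its quantized counterpart.

    Metric definitions are assumptions inferred from the table's
    column headers:
      Cosine - cosine similarity of the flattened tensors
      L1     - mean absolute elementwise difference
      Atol   - maximum absolute elementwise difference
    """
    b = base.ravel().astype(np.float64)
    a = analy.ravel().astype(np.float64)
    cosine = float(b @ a / (np.linalg.norm(b) * np.linalg.norm(a)))
    l1 = float(np.abs(b - a).mean())
    atol = float(np.abs(b - a).max())
    return {"Cosine": cosine, "L1": l1, "Atol": atol}

# Identical inputs give Cosine ~ 1, L1 = 0, Atol = 0,
# matching the fully-aligned rows near the top of the table.
x = np.array([1.0, -2.0, 3.0])
print(compare_outputs(x, x))
```

Under these definitions, a Cosine value near 1 with small L1/Atol indicates the quantized op tracks the float baseline closely, while the low-cosine rows (e.g. the `torch.atan2` outputs, where values wrap at ±π) flag ops whose outputs diverge and deserve closer inspection.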