| 0 (string, 12 classes) | 1 (float64, 0–2.17k) |
|---|---|
megatron.core.transformer.attention.forward.qkv | 202.814972 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003232 |
megatron.core.transformer.attention.forward.core_attention | 900.26062 |
megatron.core.transformer.attention.forward.linear_proj | 1.502976 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1105.927124 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1132.514526 |
megatron.core.transformer.mlp.forward.linear_fc1 | 9.751712 |
megatron.core.transformer.mlp.forward.activation | 459.473663 |
megatron.core.transformer.mlp.forward.linear_fc2 | 6.195008 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 481.104858 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.451968 |
megatron.core.transformer.attention.forward.qkv | 2.61296 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
megatron.core.transformer.attention.forward.core_attention | 6.067264 |
megatron.core.transformer.attention.forward.linear_proj | 1.452192 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 10.158784 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.451744 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.785952 |
megatron.core.transformer.mlp.forward.activation | 0.662592 |
megatron.core.transformer.mlp.forward.linear_fc2 | 5.704288 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.166272 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.45024 |
megatron.core.transformer.attention.forward.qkv | 2.618784 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
megatron.core.transformer.attention.forward.core_attention | 6.065056 |
megatron.core.transformer.attention.forward.linear_proj | 1.450816 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 10.16048 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.45168 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.783808 |
megatron.core.transformer.mlp.forward.activation | 0.661824 |
megatron.core.transformer.mlp.forward.linear_fc2 | 5.699232 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.158144 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.450752 |
megatron.core.transformer.attention.forward.qkv | 2.619136 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
megatron.core.transformer.attention.forward.core_attention | 6.060608 |
megatron.core.transformer.attention.forward.linear_proj | 1.454464 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 10.15984 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.450784 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.792064 |
megatron.core.transformer.mlp.forward.activation | 0.665312 |
megatron.core.transformer.mlp.forward.linear_fc2 | 5.720096 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.191584 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.507488 |
megatron.core.transformer.attention.forward.qkv | 2.646816 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
megatron.core.transformer.attention.forward.core_attention | 7.0088 |
megatron.core.transformer.attention.forward.linear_proj | 1.6304 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.311552 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.567648 |
megatron.core.transformer.mlp.forward.linear_fc1 | 6.291424 |
megatron.core.transformer.mlp.forward.activation | 0.816192 |
megatron.core.transformer.mlp.forward.linear_fc2 | 6.168224 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 13.287904 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.55296 |
megatron.core.transformer.attention.forward.qkv | 2.787008 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003072 |
megatron.core.transformer.attention.forward.core_attention | 7.056448 |
megatron.core.transformer.attention.forward.linear_proj | 1.562848 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.430688 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.552928 |
megatron.core.transformer.mlp.forward.linear_fc1 | 6.100576 |
megatron.core.transformer.mlp.forward.activation | 0.796288 |
megatron.core.transformer.mlp.forward.linear_fc2 | 5.984 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.892704 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.552704 |
megatron.core.transformer.attention.forward.qkv | 2.773216 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
megatron.core.transformer.attention.forward.core_attention | 6.860928 |
megatron.core.transformer.attention.forward.linear_proj | 1.501184 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.159136 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.528672 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.840512 |
megatron.core.transformer.mlp.forward.activation | 0.75536 |
megatron.core.transformer.mlp.forward.linear_fc2 | 5.72352 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.331968 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.522336 |
megatron.core.transformer.attention.forward.qkv | 2.660896 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.003072 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003104 |
megatron.core.transformer.attention.forward.core_attention | 6.78096 |
megatron.core.transformer.attention.forward.linear_proj | 1.573216 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.039104 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.55312 |
megatron.core.transformer.mlp.forward.linear_fc1 | 6.088736 |
megatron.core.transformer.mlp.forward.activation | 0.796032 |
megatron.core.transformer.mlp.forward.linear_fc2 | 5.965536 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.862304 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.5528 |
megatron.core.transformer.attention.forward.qkv | 2.762336 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.003136 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
megatron.core.transformer.attention.forward.core_attention | 7.053056 |
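The rows above repeat once per transformer layer, so totals and call counts per timer are easy to recover. A minimal sketch, assuming each data row has the form `timer.name | value |` and that the values are elapsed times in milliseconds (the units are an assumption, not stated in the dump):

```python
# Aggregate per-timer forward-pass timings from rows shaped like
# "megatron.core....qkv | 2.61296 |". The sample rows below are
# copied from the table above; in practice you would read the file.
from collections import defaultdict

rows = """\
megatron.core.transformer.attention.forward.qkv | 2.61296 |
megatron.core.transformer.attention.forward.core_attention | 6.067264 |
megatron.core.transformer.attention.forward.qkv | 2.618784 |
megatron.core.transformer.attention.forward.core_attention | 6.065056 |
""".strip().splitlines()

totals = defaultdict(float)   # summed time per timer name
counts = defaultdict(int)     # number of occurrences (≈ layers profiled)
for line in rows:
    # Each row splits into: name, value, trailing empty string.
    name, value, _ = (part.strip() for part in line.split("|"))
    totals[name] += float(value)
    counts[name] += 1

for name in sorted(totals, key=totals.get, reverse=True):
    print(f"{name}: total={totals[name]:.3f} ms over {counts[name]} calls")
```

Note that the first layer's entries (e.g. `core_attention` at ~900 ms versus ~6 ms thereafter) look like warmup or compilation overhead, so you may want to drop each timer's first occurrence before averaging.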