module (string, 12 distinct values) | value (float64, range 0–120k) |
|---|---|
megatron.core.transformer.attention.forward.qkv | 195.845856 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
megatron.core.transformer.attention.forward.core_attention | 4870.123047 |
megatron.core.transformer.attention.forward.linear_proj | 6.990304 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 5074.141113 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 281.044647 |
megatron.core.transformer.mlp.forward.linear_fc1 | 9.194528 |
megatron.core.transformer.mlp.forward.activation | 280.179626 |
megatron.core.transformer.mlp.forward.linear_fc2 | 10.303936 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 299.88736 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.893184 |
megatron.core.transformer.attention.forward.qkv | 2.762336 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
megatron.core.transformer.attention.forward.core_attention | 623.04541 |
megatron.core.transformer.attention.forward.linear_proj | 6.046144 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 631.878479 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.891424 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.953824 |
megatron.core.transformer.mlp.forward.activation | 0.660736 |
megatron.core.transformer.mlp.forward.linear_fc2 | 7.547424 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 14.173728 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.892 |
megatron.core.transformer.attention.forward.qkv | 2.766656 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
megatron.core.transformer.attention.forward.core_attention | 625.338257 |
megatron.core.transformer.attention.forward.linear_proj | 7.243584 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 635.372498 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.890848 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.948128 |
megatron.core.transformer.mlp.forward.activation | 0.6608 |
megatron.core.transformer.mlp.forward.linear_fc2 | 7.560928 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 14.18176 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.89104 |
megatron.core.transformer.attention.forward.qkv | 2.770912 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
megatron.core.transformer.attention.forward.core_attention | 625.412964 |
megatron.core.transformer.attention.forward.linear_proj | 3.020736 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 631.227966 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.892224 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.95616 |
megatron.core.transformer.mlp.forward.activation | 0.65968 |
megatron.core.transformer.mlp.forward.linear_fc2 | 7.549024 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 14.176736 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.891936 |
megatron.core.transformer.attention.forward.qkv | 2.77056 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
megatron.core.transformer.attention.forward.core_attention | 624.714478 |
megatron.core.transformer.attention.forward.linear_proj | 5.703392 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 633.213074 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.89136 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.953792 |
megatron.core.transformer.mlp.forward.activation | 0.661024 |
megatron.core.transformer.mlp.forward.linear_fc2 | 7.553568 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 14.18016 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.890848 |
megatron.core.transformer.attention.forward.qkv | 2.767104 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
megatron.core.transformer.attention.forward.core_attention | 625.55011 |
megatron.core.transformer.attention.forward.linear_proj | 4.669216 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 633.010742 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.891936 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.955712 |
megatron.core.transformer.mlp.forward.activation | 0.661152 |
megatron.core.transformer.mlp.forward.linear_fc2 | 7.568544 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 14.197024 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.891776 |
megatron.core.transformer.attention.forward.qkv | 2.7688 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
megatron.core.transformer.attention.forward.core_attention | 622.591675 |
megatron.core.transformer.attention.forward.linear_proj | 6.93856 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 632.322937 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.89184 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.959584 |
megatron.core.transformer.mlp.forward.activation | 0.660768 |
megatron.core.transformer.mlp.forward.linear_fc2 | 7.556544 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 14.188896 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.893696 |
megatron.core.transformer.attention.forward.qkv | 2.769696 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
megatron.core.transformer.attention.forward.core_attention | 626.785522 |
megatron.core.transformer.attention.forward.linear_proj | 3.01328 |
megatron.core.transformer.transformer_layer._forward_attention.self_attention | 632.592346 |
megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.893088 |
megatron.core.transformer.mlp.forward.linear_fc1 | 5.968 |
megatron.core.transformer.mlp.forward.activation | 0.660608 |
megatron.core.transformer.mlp.forward.linear_fc2 | 7.522784 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp | 14.163552 |
megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.893504 |
megatron.core.transformer.attention.forward.qkv | 2.769312 |
megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
megatron.core.transformer.attention.forward.core_attention | 622.00769 |
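A minimal sketch of how rows like the ones above could be aggregated with pandas. Assumptions (not stated in the preview): the float column is a per-call timing, and the first occurrence of each module belongs to a warm-up iteration — its `core_attention` entry (~4870) is far larger than the steady-state values (~625) that follow, so it is dropped before averaging. The inline sample reproduces a few `core_attention` rows from the table.

```python
import io
import pandas as pd

# Inline sample of pipe-separated rows, copied from the table above.
sample = """\
megatron.core.transformer.attention.forward.core_attention|4870.123047
megatron.core.transformer.attention.forward.core_attention|623.045410
megatron.core.transformer.attention.forward.core_attention|625.338257
megatron.core.transformer.attention.forward.core_attention|625.412964
"""

# Column names are illustrative; the preview only labels them "0" and "1".
df = pd.read_csv(io.StringIO(sample), sep="|", names=["module", "value"])

# cumcount() numbers each module's occurrences 0, 1, 2, ...;
# keeping cumcount > 0 drops the first (warm-up) row per module.
steady = df[df.groupby("module").cumcount() > 0]

# Mean steady-state value per module.
means = steady.groupby("module")["value"].mean()
print(means.to_dict())
```

With the three steady-state rows in the sample, the mean for `core_attention` comes out to roughly 624.6, matching the values repeated across later iterations of the table.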