---
library_name: transformers
license: mit
datasets:
- CraftJarvis/minecraft-grounding-action-dataset
metrics:
- accuracy
base_model:
- Qwen/Qwen2-VL-7B-Instruct
pipeline_tag: image-text-to-text
arxiv: 2509.13347
---
|
|
# Minecraft-Pointha-Qwen2vl-7b-2509 |
|
|
<!-- <div align="left">
<a href="https://craftjarvis.github.io/"><img alt="Homepage" src="https://img.shields.io/badge/%20CraftJarvis-HomePage-ffc107?color=blue&logoColor=white"/></a>
<a href="https://github.com/CraftJarvis/OpenHA"><img alt="Github" src="https://img.shields.io/badge/%F0%9F%A4%97%20Github-CraftJarvis-ffc107?color=3b65ab&logoColor=white"/></a>
<a href="https://arxiv.org/abs/2509.13347"><img src="https://img.shields.io/badge/arXiv-2509.13347-b31b1b.svg"></a>
<a href="https://github.com/CraftJarvis/OpenHA/blob/master/LICENSE"><img src="https://img.shields.io/badge/Code License-MIT-blue"/></a>
</div> -->
|
|
|
|
|
<!-- **minecraft-pointha-qwen2vl-7b-2509** is part of the **OpenHA** suite, introduced in our paper [OpenHA: A Series of Open-Source Hierarchical Agentic Models in Minecraft](https://huggingface.co/papers/2509.13347). -->
|
|
## Usage
|
|
You can download and run this model with the `rollout_openha.py` script from the [OpenHA](https://github.com/CraftJarvis/OpenHA) repository:
|
|
|
|
|
```sh
python examples/rollout_openha.py \
    --output_mode grounding \
    --vlm_client_mode hf \
    --system_message_tag grounding \
    --model_ips localhost --model_ports 11000 \
    --model_path CraftJarvis/minecraft-pointha-qwen2vl-7b-2509 \
    --model_id minecraft-pointha-qwen2vl-7b-2509 \
    --record_path "~/evaluate" \
    --max_steps_num 200 \
    --num_rollouts 8
```
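Since the card declares `library_name: transformers` with a Qwen2-VL base model, the checkpoint should also load through the standard `transformers` Qwen2-VL flow. The sketch below only assembles the chat-format input; the image path and grounding prompt are illustrative, and the commented-out loading calls show the usual Qwen2-VL pattern, not an official OpenHA example:

```python
# Minimal input-building sketch (an assumption based on the standard
# Qwen2-VL chat format, not an official OpenHA example).
import json

MODEL_ID = "CraftJarvis/minecraft-pointha-qwen2vl-7b-2509"

def build_grounding_messages(image_path: str, instruction: str) -> list:
    """Assemble a single-turn Qwen2-VL chat message: one image plus one text prompt."""
    return [
        {
            "role": "user",
            "content": [
                {"type": "image", "image": image_path},
                {"type": "text", "text": instruction},
            ],
        }
    ]

# Hypothetical screenshot and instruction, for illustration only.
messages = build_grounding_messages("frame.png", "Point to the crafting table.")
print(json.dumps(messages, indent=2))

# With a GPU available, the usual Qwen2-VL loading pattern applies:
#   from transformers import AutoProcessor, Qwen2VLForConditionalGeneration
#   processor = AutoProcessor.from_pretrained(MODEL_ID)
#   model = Qwen2VLForConditionalGeneration.from_pretrained(
#       MODEL_ID, torch_dtype="auto", device_map="auto")
```

The message list is then passed through the processor's `apply_chat_template` before generation, as in the standard Qwen2-VL examples.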
|
|
<!-- For more details, please refer to our [code repository](https://github.com/CraftJarvis/OpenHA). --> |
|
|
|
|
|
<!-- ## Citation

```bibtex
@article{wang2025openha,
  title={OpenHA: A Series of Open-Source Hierarchical Agentic Models in Minecraft},
  author={Zihao Wang and Muyao Li and Kaichen He and Xiangyu Wang and Zhancun Mu and Anji Liu and Yitao Liang},
  journal={arXiv preprint arXiv:2509.13347},
  year={2025},
  url={https://arxiv.org/abs/2509.13347}
}
``` -->