Hal9000AIML
Popular repositories

  1. arc-pro-b70-inference-setup-ubuntu-server (Public)

     Ubuntu Server edition: automated setup script for an Intel Arc Pro B70 GPU LLM inference server with vLLM tensor parallelism. 140 tok/s on 2x B70, 540 tok/s on 4x B70. For Windows, see arc-pro-b70-inf…

     Shell · 8 stars
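The tensor-parallel serving that this setup script automates can be sketched with vLLM's standard CLI. This is a minimal illustration only: the model name, port, and environment setup are placeholder assumptions, not taken from the repository, which likely adds Intel XPU-specific configuration.

```shell
# Illustrative sketch, assuming vLLM's standard `vllm serve` CLI.
# The model name and port are placeholders; the repo's script may
# differ and will include Intel Arc / XPU environment setup.
#
# --tensor-parallel-size 2 shards each transformer layer's weights
# across 2 GPUs, which is how the 2x B70 configuration is driven.
vllm serve meta-llama/Llama-3.1-8B-Instruct \
  --tensor-parallel-size 2 \
  --port 8000
```

Once running, the server exposes an OpenAI-compatible HTTP API on the chosen port, so standard OpenAI client libraries can be pointed at it.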

  2. arc-pro-b70-inference-setup-windows (Public)

     Windows installer for the Intel Arc Pro B70 LLM Inference Server (4x B70, vLLM XPU via WSL2)

     PowerShell · 1 star

  3. arc-pro-b70-ubuntu-gpu-speedup-bugfixes (Public)

     Makes Intel Arc Pro B70 GPUs actually fast on Ubuntu Server. 11 llama.cpp cherry-picks that fix the big B70 bugs (MoE slot-init SEGV, Q8_0 reorder crash, OOM reorder, missing BF16 GET_ROWS, wrong X…

     Python · 1 star

  4. turzx-blue-8.8 (Public)

     Open-source replacement for TURZX.exe on the Turing Smart Screen 8.8" - H.264 streaming with NVENC

     Python