apple / coremltools

Core ML Tools contains supporting tools for Core ML model conversion, editing, and validation.
https://coremltools.readme.io
BSD 3-Clause "New" or "Revised" License

Adding BitNet Layer Support to CoreML #2214

Closed mromanuk closed 4 months ago

mromanuk commented 4 months ago

🌱 Describe your Feature Request

I am requesting the incorporation of a BitNet layer in CoreML, similar to the PyTorch implementation by Kyegomez (https://github.com/kyegomez/BitNet). A BitNet layer is a neural network layer that uses binary weights and activations, which can lead to significant reductions in computational resources and memory usage while maintaining model accuracy.
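As a rough illustration of what such a layer computes, here is a minimal NumPy sketch of a BitLinear-style forward pass, assuming per-tensor mean-absolute-value scaling. The helper names `binarize` and `bit_linear` are hypothetical and not part of any CoreML or BitNet API:

```python
import numpy as np

def binarize(w):
    """Binarize weights to {-1, +1} via the sign function, scaled by the
    mean absolute value (a common BitNet-style scheme; details vary by paper)."""
    alpha = np.mean(np.abs(w))              # per-tensor scaling factor
    return np.where(w >= 0, 1.0, -1.0) * alpha

def bit_linear(x, w, b=None):
    """Linear layer with binarized weights: y = x @ binarize(w).T + b."""
    y = x @ binarize(w).T
    if b is not None:
        y = y + b
    return y

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4))             # batch of 2 inputs, 4 features
w = rng.standard_normal((3, 4))             # 3 output units, 4 input features
y = bit_linear(x, w)
print(y.shape)                              # (2, 3)
```

Because each weight carries only one bit of information plus a shared scale, the matrix product reduces to additions and subtractions, which is where the compute and memory savings come from.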

The addition of a BitNet layer in CoreML would enable developers to create more efficient and lightweight machine learning models, which is particularly important for deployment on mobile and embedded devices. This feature would be especially useful for applications that require real-time inference, such as computer vision and natural language processing tasks.

How can this feature be used?

The BitNet layer can be used in a variety of applications, including:

Real-time object detection and image classification on mobile devices

Efficient natural language processing models for chatbots and voice assistants

Resource-constrained IoT devices that require machine learning capabilities

Real-time grammar and spell checking in writing apps

Efficient language translation models for chatbots and voice assistants

Resource-constrained IoT devices that require natural language processing capabilities

Describe alternatives you've considered

I have considered the following alternatives to this feature:

Using existing quantization techniques to reduce the precision of model weights and activations. I have experimented with quantizing my model to 8-bit precision, but this still results in a larger memory footprint and higher computational requirements than a BitNet layer would.

Implementing binary neural networks using existing CoreML layers, such as the Boolean layer

Using other lightweight neural network architectures, such as depthwise separable convolutions

However, these alternatives do not provide the same level of efficiency and accuracy as a native BitNet layer. The incorporation of a BitNet layer in CoreML would provide a more seamless and efficient way to deploy binary neural networks on Apple devices.
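To put the footprint difference in concrete terms, a back-of-the-envelope sketch (assuming a hypothetical 100M-parameter model, counting per-weight storage only and ignoring scaling factors and metadata):

```python
def weight_bytes(n_params, bits):
    """Storage needed for n_params weights at the given bit width."""
    return n_params * bits // 8

n = 100_000_000                 # hypothetical 100M-parameter model
fp16 = weight_bytes(n, 16)      # 200 MB at float16
int8 = weight_bytes(n, 8)       # 100 MB at 8-bit quantization
bin1 = weight_bytes(n, 1)       # 12.5 MB at 1-bit (binary) weights
```

Even against 8-bit quantization, binary weights cut storage by a further 8x, which is the gap the request is about.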

Additional context

I believe that the addition of a BitNet layer in CoreML would align with Apple's focus on machine learning and AI, and would provide developers with a powerful tool to create more efficient and effective machine learning models.

TobyRoseman commented 4 months ago

This sounds like an interesting idea. Thanks for the detailed request.

However, the coremltools GitHub repository is not the correct place for such requests. Support for this feature cannot be added from the coremltools Python package; changes would be needed in the Core ML framework itself.

Please submit your feature request through the Feedback Assistant.