Closed: tarrinw closed this issue 2 months ago
Actually, one can use the kraken shell commands (prefixed with '!') in these environments, but it would still be useful, and slightly more efficient, when building a wrapper application in Python not to have to call the shell, which then runs Python code anyway...
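For comparison, here is a minimal sketch of the two call styles. The file and model names are made-up placeholders, and the exact CLI arguments should be checked against kraken's own help output:

```python
import shlex

# Placeholder file and model names for illustration only; verify the
# argument order against `kraken --help` for your installed version.
def kraken_cli_command(image='input.png', output='lines.txt',
                       model='en_best.mlmodel'):
    """Build the argument list a '!' cell would hand to the shell."""
    return ['kraken', '-i', image, output, 'segment', 'ocr', '-m', model]

cmd = kraken_cli_command()
print(shlex.join(cmd))
# In a notebook the shell-escape version of the same call would be:
#   !kraken -i input.png lines.txt segment ocr -m en_best.mlmodel
# while importing kraken directly skips spawning a shell entirely:
#   from kraken import pageseg
#   seg = pageseg.segment(bw_im)
```

The in-process route avoids a subprocess per call and gives you the returned objects directly instead of having to re-parse the CLI's output files.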
I think that's why the docs folder is there... https://github.com/mittagessen/kraken/tree/main/docs
https://github.com/mittagessen/kraken/blob/main/docs/api_docs.rst
https://github.com/mittagessen/kraken/blob/main/docs/api.rst
Quote:

>>> from kraken import pageseg
>>> seg = pageseg.segment(bw_im)
>>> seg
Segmentation(type='bbox',
             imagename='foo.png',
             text_direction='horizontal-lr',
             script_detection=False,
             lines=[BBoxLine(id='0ce11ad6-1f3b-4f7d-a8c8-0178e411df69',
                             bbox=[74, 61, 136, 101],
                             text=None,
                             base_dir=None,
                             type='bbox',
                             imagename=None,
                             tags=None,
                             split=None,
                             regions=None,
                             text_direction='horizontal-lr'),
                    BBoxLine(id='c4a751dc-6731-4eea-a287-d4b57683f5b0', ...),
                    ...],
             regions={},
             line_orders=[])
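To show what consuming such a Segmentation looks like downstream, here is a small sketch using a stand-in dataclass (so it runs without kraken installed); the real BBoxLine carries many more fields, as the repr above shows:

```python
from dataclasses import dataclass
from typing import List, Optional

# Simplified stand-in for kraken's BBoxLine, for illustration only.
@dataclass
class BBoxLine:
    id: str
    bbox: List[int]            # [x0, y0, x1, y1] in pixel coordinates
    text: Optional[str] = None

def line_crop_boxes(lines):
    """Turn segmentation lines into (left, upper, right, lower) tuples,
    the box format PIL's Image.crop() expects."""
    return [tuple(line.bbox) for line in lines]

lines = [BBoxLine(id='0ce11ad6-1f3b-4f7d-a8c8-0178e411df69',
                  bbox=[74, 61, 136, 101])]
print(line_crop_boxes(lines))  # [(74, 61, 136, 101)]
```

With the real objects, iterating `seg.lines` the same way lets a wrapper crop line images or feed bounds to the recognizer without ever touching the CLI.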
You don't have to build a wrapper application in Python because kraken
is written in Python and can be imported.
On 24/04/22 12:32AM, Tarrin Wills wrote:
We are working on an application using Kraken for both training and recognition. The command-line tools work well, but it would be easier to deploy in a dynamic Python environment (e.g. Jupyter / Google Colab) if we could access the internal / API methods directly. Presumably this is just a matter of documentation?
The website has both an API tutorial and the API reference. For some more examples of how to use the API for inference, the scripts in contrib/ can be a good starting point, as they tend to have less boilerplate than the CLI code.