GPU Acceleration for Scan Tailor?
Moderator: peterZ
GPU Acceleration for Scan Tailor?
I've been wondering about this after I found out Python had a nice GPU library. Would it be too hard to implement GPU acceleration into Scan Tailor? NVidia's CUDA language syntax is nearly identical to C++, so conversion wouldn't be too hard. Anyone have any ideas?
Re: GPU Acceleration for Scan Tailor?
Anonymous wrote: Would it be too hard to implement GPU acceleration into Scan Tailor?
It's not that it's hard, it's just time-consuming.
Anonymous wrote: NVidia's CUDA language syntax is nearly identical to C++, so conversion wouldn't be too hard.
Even though CUDA is superior to OpenCL, I would use the latter, as it's cross-platform.
It won't happen by itself though. Someone will have to put effort into it. Don't count on me, as I've retired from Scan Tailor development.
Scan Tailor experimental doesn't output 96 DPI images. It's just what your software shows when DPI information is missing. Usually what you get is input DPI times the resolution enhancement factor.
Re: GPU Acceleration for Scan Tailor?
Good man. You've done your part, and that's all that was needed. I can't even get my OCR Page Naming script to work, so I've "temporarily" suspended it; I can't imagine how you coded almost that whole thing by yourself...
I'll see what I can do, but I doubt I can do much, as C++ is not my thing.
Re: GPU Acceleration for Scan Tailor?
Tulon, what are the most processor-intensive parts of the whole program? I've compiled it from source a bunch of times, so maybe I can tinker with the GPU acceleration stuff too. Thanks!
Re: GPU Acceleration for Scan Tailor?
I haven't profiled it for a while. You will have to do it yourself. On Linux you can use sysprof or callgrind + kcachegrind. On Windows you would need an expensive version of Visual Studio (Professional is not enough; I believe you need Team Edition).