Taking capture step by step

Editorial Type: Feature     Date: 03-2016







Ashley Keil, Sales Director at ibml, argues in favour of a multi-step approach to transactional scanning processes

Operations directors are rightly reviewing the optimum approach to their processes, knowing that competitive pressures and SLA commitments weigh heavily on the choice of process methodology.

Maximising the efficiency of the end-to-end process depends on optimising each of its component parts. This is true in many contexts, but particularly so in a commercial capture process.

If the capture process is treated as a single step, as some would propose, then by definition it can only work as fast as the slowest component within that process. Batch processing, on the other hand, was developed to separate the capture steps so that each can work at its optimum level and be tuned accordingly. The principle of 'batch processing' is not new; it has been part of the capture process, and endorsed by the majority of the capture industry, for many years now, and for good reason.
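
A rough sketch makes the point (the rates and staffing figures below are assumptions for illustration, not measured ibml figures): in a coupled single-step process, throughput is capped by the slowest stage, whereas independent batch steps can each be scaled until the scanner is kept fed.

    # Illustrative throughput model -- the rates are assumptions, not measured figures.
    # In a single-step process every document passes through prep, scan and classify
    # as one coupled unit, so throughput is capped by the slowest stage.
    # In a batch process each stage runs independently and can be scaled on its own.

    stage_rates = {          # documents per hour, per operator or scanner (assumed)
        "prep": 600,
        "scan": 6000,
        "classify": 1500,
    }

    single_step = min(stage_rates.values())   # coupled: the slowest stage wins
    print(f"Single-step throughput: {single_step} docs/hour")

    # Batch processing: add operators to the slow stages until they keep the scanner fed.
    staffing = {"prep": 10, "scan": 1, "classify": 4}   # assumed headcount per stage
    batch = min(rate * staffing[stage] for stage, rate in stage_rates.items())
    print(f"Batch throughput with independent scaling: {batch} docs/hour")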

The decision, however, depends on the mix of work to be processed. Where an application demands a very high degree of preparation because of very poor paper quality, such as expense receipts and the like, the numbers may favour a single-step process. For most companies, though, such work remains the minority of a typical mix, and it is not how the main capture process would, or should, be geared if the aim is to maximise efficiency and realise ROI.

Scanners, software and people remain the principal costs in any capture operation. Software, once purchased, operates as an automated component, which leaves the optimisation focus on the scanner assets and people (FTE) costs. In most cases document extraction and preparation remain the largest tasks, and therefore the largest cost; they are also typically the slowest. Tying these tasks to the scanner asset therefore under-utilises that asset.
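
As a back-of-the-envelope illustration of that utilisation point (again, the rates are assumed, not benchmarks): if preparation runs at a fraction of the scanner's rated speed and the two are tied one-to-one, the scanner spends most of its time waiting.

    # Scanner utilisation when preparation is tied one-to-one to the scanner.
    # Both rates are assumed for illustration only.

    prep_rate = 600      # docs/hour one operator can extract and prepare (assumed)
    scan_rate = 6000     # docs/hour the scanner can sustain (assumed)

    utilisation = prep_rate / scan_rate
    print(f"Scanner utilisation when fed by a single prep operator: {utilisation:.0%}")
    # -> 10%: the scanner asset sits idle 90% of the time, which is the cost argument
    #    for splitting preparation out as its own batch step.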

Also, when preparation staff remain tied to the scanning asset on a one-to-one basis, the flexibility to change the application profile becomes extremely restricted. This goes beyond the speed at which differing paper types can be extracted and prepared; it also affects points in the process where knowledge workers are required to view and qualify a document. Treating extraction, preparation, scanning and document classification as independent batch components that can be geared up and down is particularly important for most BPOs, which receive a wide variation in application profiles. The same is true for commercial users such as banks, which receive multiple application threads within the same Digital Mailroom master application.

Whether quality assurance and document repair should take place on the scanner while it sits idle has often been discussed. Should my scanning asset come to a halt while a document is reviewed, released, or repaired and corrections inserted? Or should this be done separately from the scanning asset as a post-scan component process, enabling dedicated review software and volume-matched exception scanners to complete the task? Again, the principle of optimisation suggests these are best split into batch component tasks so that each process can be scaled accordingly without having to invest in additional scanning assets.
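
A hedged sketch of that trade-off (the exception rate and repair time below are assumptions chosen to illustrate the point): handling every exception on the production scanner steals scanner time, whereas routing exceptions to a post-scan review queue leaves the scanner running.

    # Effect of in-line exception handling on scanner throughput.
    # Exception rate and repair time are assumptions, not measured figures.

    scan_rate = 6000          # docs/hour (assumed)
    exception_rate = 0.02     # 2% of documents need review or repair (assumed)
    repair_seconds = 60       # scanner idle time per in-line repair (assumed)

    idle_seconds = scan_rate * exception_rate * repair_seconds
    effective_hours = 1 + idle_seconds / 3600
    inline_throughput = scan_rate / effective_hours

    print(f"Throughput with in-line repair:  {inline_throughput:.0f} docs/hour")
    print(f"Throughput with post-scan queue: {scan_rate} docs/hour "
          "(exceptions repaired on a separate, cheaper workstation)")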

The market is now talking about Transaction Processing, which is not to be mistaken for a different process from batch processing; rather, it is a sub-set, or an enhancement, of it. Technology advances in the capture space continue to abound, and much of this advancement lies in IDR (Intelligent Document Recognition), which makes the identification of transactions within a batch an automated process.
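
The mechanics can be sketched in outline (the page classes and the 'header document opens a new transaction' rule below are hypothetical, not a description of any particular IDR product): once software can recognise the document type on each page, the pages of a batch can be grouped into transactions automatically.

    # Hypothetical sketch of transaction identification within a scanned batch.
    # The page classes and grouping rule are assumptions for illustration only.

    pages = [
        {"page": 1, "doc_class": "claim_form"},
        {"page": 2, "doc_class": "receipt"},
        {"page": 3, "doc_class": "receipt"},
        {"page": 4, "doc_class": "claim_form"},
        {"page": 5, "doc_class": "correspondence"},
    ]

    transactions = []
    for page in pages:
        # A recognised header document (here, a claim form) opens a new transaction;
        # the pages that follow are appended to it.
        if page["doc_class"] == "claim_form" or not transactions:
            transactions.append([])
        transactions[-1].append(page["page"])

    print(transactions)   # -> [[1, 2, 3], [4, 5]]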

Operations want to deal in transactions, but not manually and not one at a time. Instead, they want the benefit of volume-based batch processing with transactional output, both electronically and physically where it makes sense.

In summary, multi-step batch processing for transactional delivery suits the majority of capture processes and therefore remains the most effective route to process optimisation.
More info: www.ibml.com
