Longmont, CO, March 13, 2018 (GLOBE NEWSWIRE) — Parascript today announced powerful, best-in-the-industry classification and recognition for black-and-white claims in addition to its red drop-out claims capture. Parascript has overcome the long-standing image quality and scaling challenges of black-and-white claims and now provides 200 percent to 400 percent better performance than other solutions on the market today, achieving close to the same results as extraction from the much more straightforward red drop-out claim forms.

“The key to our unequaled technology performance is Parascript’s proprietary image processing and field-level recognition technology. We call it ‘virtual drop-out’ because, thanks to its field-level image clean-up and alignment, it performs in most cases as well as red drop-out recognition,” said Greg Council, Vice President of Marketing and Product Management. “To do this, Parascript applied new deep learning algorithms to improve out-of-the-box accuracy to the industry’s highest level.”
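The announcement does not detail the image processing itself, but the concept of field-level clean-up can be illustrated with a generic sketch: crop one field region, binarize it, and despeckle it before recognition. The coordinates, file name, and OpenCV-based pipeline below are illustrative assumptions, not Parascript's actual implementation.

```python
# Hypothetical sketch of field-level image clean-up in the spirit of
# "virtual drop-out": crop a field, binarize with Otsu thresholding, despeckle.
# Coordinates and file name are assumptions, not Parascript's implementation.
import cv2

def clean_field(page, box):
    """Crop a single field from a scanned page and return a cleaned binary image."""
    x, y, w, h = box
    field = page[y:y + h, x:x + w]
    gray = cv2.cvtColor(field, cv2.COLOR_BGR2GRAY)
    # Otsu picks a global threshold that separates ink from the form background.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Light despeckling; a production pipeline would also deskew and align the field.
    return cv2.medianBlur(binary, 3)

page = cv2.imread("claim_page.png")                  # hypothetical scanned claim image
name_field = clean_field(page, (120, 340, 600, 48))  # hypothetical field coordinates
```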

Unlike other claims recognition solutions, Parascript FormXtra not only provides pre-built configurations for claims documents; each field is also tuned and optimized to achieve a specific, statistically measured accuracy rate equivalent to dual-pass data entry, the benchmark process for high-accuracy SLAs. Parascript helps enterprises reach their target accuracy level and completely remove manual verification for a significant portion of data processing operations.
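As a rough illustration of how a statistically measured accuracy target removes manual verification, the sketch below routes only low-confidence field results to a human keyer. The FieldResult structure, field names, and threshold values are assumptions for the example, not FormXtra's actual API.

```python
# Minimal sketch: recognition results meeting a calibrated confidence threshold
# bypass manual verification; the rest are routed to manual data entry.
# Thresholds, field names, and FieldResult are hypothetical, not FormXtra's API.
from dataclasses import dataclass

@dataclass
class FieldResult:
    name: str
    value: str
    confidence: float  # 0.0 to 1.0, as reported by the recognition engine

# Per-field thresholds calibrated so accepted values match dual-pass accuracy.
ACCEPT_THRESHOLDS = {"patient_name": 0.97, "claim_total": 0.99}

def route(results):
    """Split results into auto-accepted fields and fields needing manual review."""
    accepted, needs_review = [], []
    for field in results:
        threshold = ACCEPT_THRESHOLDS.get(field.name, 0.98)
        (accepted if field.confidence >= threshold else needs_review).append(field)
    return accepted, needs_review

accepted, needs_review = route([
    FieldResult("patient_name", "JANE DOE", 0.991),
    FieldResult("claim_total", "1,240.00", 0.962),
])
# Only fields in needs_review go to manual data entry.
```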

Pre-built health claims processing modules designed for standard forms cover both red drop-out and black-and-white versions. The software also easily accommodates unique situations, such as information automatically added to the form header, footer or margins by a scanner or other application. Enterprises leverage these optional out-of-the-box modules to accelerate their document processing, or easily configure their own.

“We have found that most of our prospective clients gave up on black-and-white claims a long time ago, either because they could not deal with image quality problems or because the form structure interferes with achieving good OCR. Even though these claims represent a smaller portion of overall production volume, they represent an outsized cost. With this new capability, that cost can be reduced by 80 percent or more,” explained Mr. Council. “The same platform can also automate multipage claims and the supporting documentation through our automated document classifier, and we’ve found that it drastically reduces the need to manually sort and organize every page. Pages can be scanned in any order, further reducing labor-intensive hours spent in document preparation.”
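To make the classification idea concrete, the sketch below groups scanned pages by predicted document type so they no longer need manual sorting. The classify_page function is a hypothetical stand-in for a trained classifier, and the file names and labels are assumptions, not Parascript's actual components.

```python
# Simplified sketch of automated page classification: each scanned page gets a
# predicted document type and pages are grouped regardless of scan order.
# classify_page is a hypothetical stand-in, not Parascript's classifier API.
from collections import defaultdict

def classify_page(image_path: str) -> str:
    """Stand-in for a trained page classifier; a real system would score the
    page image with a model rather than inspect the file name."""
    return "attachment" if "attach" in image_path else "claim_form"

def group_batch(image_paths):
    """Group pages by predicted document type, in whatever order they arrive."""
    groups = defaultdict(list)
    for path in image_paths:
        groups[classify_page(path)].append(path)
    return dict(groups)

# Pages scanned in any order are grouped automatically (file names hypothetical).
print(group_batch(["scan_03.png", "attach_01.png", "scan_01.png"]))
```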

Parascript also supports other types of document processing with its template-less, neural network-based document extraction, and offers custom-developed recognition projects with much quicker turnaround than traditional rules-based approaches. As a result, clients get significantly faster document processing optimization with more reliable results.


About Parascript, LLC

Parascript automates the interpretation of meaningful, contextual data from image and document-based information to support transactions, information governance, fraud prevention and business processes. Parascript software processes any document with any data from any source with its easy-to-use, image-based analysis, classification, data location and extraction technology powered by machine learning. More than 100 billion documents for financial services, government organizations and the healthcare industry are analyzed annually by Parascript software. Parascript offers its technology both as software products and as software-enabled services to our partners. Our BPO, service provider, OEM and value-added reseller network partners leverage, integrate and distribute Parascript software in the U.S. and across the world. Visit Parascript.


Attachment:

A photo accompanying this announcement is available at http://www.globenewswire.com/NewsRoom/AttachmentNg/c955bf28-04e8-45f5-98ba-32200e65aaf1

CONTACT: Rebecca Rowe
Parascript
303-381-3122
[email protected]