[Image caption: servers in the Discovery Building]

Large data sets require statistical analysis, but the data often arrive in varied forms (text, audio, video, sensor data, etc.). To cope with these variations in structure, the optimization group explores integrating statistical processing techniques into data-processing systems, making such systems easier to build, maintain, and deploy.
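
As a loose illustration of that idea (the records and labels below are made up for the example), a fitted statistical model can be packaged as one more stage in a data-processing pipeline, so it is built, versioned, and deployed with the same machinery as any other transformation:

    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical records from mixed sources, reduced to a text description
    # and a label indicating whether the record should raise an alert.
    records = pd.DataFrame({
        "text":  ["sensor reading spike", "routine audio log",
                  "video frame dropped", "sensor reading spike repeated"],
        "alert": [1, 0, 1, 1],
    })

    # The statistical step (feature extraction plus classifier) is packaged
    # as a single pipeline stage, so the surrounding data system can treat it
    # like any other transformation.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(records["text"], records["alert"])

    print(model.predict(["audio log looks routine"]))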

Mass digitization of printed media into plain text is changing the types of data that companies and academic institutions manage. Scanned images are converted to plain text by recognition software, but the conversion is often error-prone. A query over the digitized media can therefore miss information, leading to poor results in downstream applications. The Staccato software improves the digitization process.
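
The querying problem is easy to see in a toy form. The sketch below is only a generic illustration of error-tolerant search over noisy OCR output, using standard-library fuzzy matching; it is not a description of how Staccato itself works:

    import difflib

    # OCR output often contains character-level errors ("0ptimizati0n"), so an
    # exact search for "optimization" would miss this passage entirely.
    ocr_text = "the 0ptimizati0n group studies large-scale data prob1ems"

    def fuzzy_find(term, text, threshold=0.8):
        """Return tokens in the OCR text that approximately match the query term."""
        hits = []
        for token in text.split():
            score = difflib.SequenceMatcher(None, term, token).ratio()
            if score >= threshold:
                hits.append((token, round(score, 2)))
        return hits

    print(fuzzy_find("optimization", ocr_text))   # [('0ptimizati0n', 0.83)]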

Bracketology

Laura Albert is an affiliate of the optimization group at WID […]

Hazy

There is an arms race to perform increasingly sophisticated data […]

Jellyfish

Jellyfish is an algorithm for solving data-processing problems with matrix-valued […]
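
To make the problem class concrete, the sketch below fits a low-rank model to a partially observed matrix using plain stochastic gradient steps. It is a serial toy illustration of the kind of matrix-valued problem involved, not the Jellyfish algorithm itself:

    import numpy as np

    def complete_low_rank(observed, shape, rank=2, steps=50000, lr=0.05, reg=0.01):
        """Fit M ~= L @ R.T to a dict of observed entries {(i, j): value}
        with stochastic gradient steps (illustrative, serial sketch)."""
        rng = np.random.default_rng(0)
        n_rows, n_cols = shape
        L = 0.1 * rng.standard_normal((n_rows, rank))
        R = 0.1 * rng.standard_normal((n_cols, rank))
        entries = list(observed.items())
        for _ in range(steps):
            (i, j), value = entries[rng.integers(len(entries))]
            err = value - L[i] @ R[j]
            # Each observed entry touches only one row of each factor.
            L[i], R[j] = (L[i] + lr * (err * R[j] - reg * L[i]),
                          R[j] + lr * (err * L[i] - reg * R[j]))
        return L, R

    # Toy example: a 3x3 rank-1 matrix with one entry missing.
    observed = {(0, 0): 1.0, (0, 1): 2.0, (0, 2): 3.0,
                (1, 0): 2.0, (1, 1): 4.0, (1, 2): 6.0,
                (2, 0): 3.0, (2, 1): 6.0}
    L, R = complete_low_rank(observed, shape=(3, 3))
    print(L[2] @ R[2])   # should land near the held-out value 9.0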

nextml.org

Active learning methods automatically adapt data collection by selecting the […]
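
A minimal sketch of that selection loop is shown below: pool-based uncertainty sampling with a scikit-learn classifier, where the hypothetical oracle_labels array stands in for whatever human or experimental feedback a real system would collect:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def uncertainty_sampling(X_pool, oracle_labels, initial_idx, n_queries=5):
        """Pool-based active learning: repeatedly query the unlabeled point the
        current model is least certain about (predicted probability nearest 0.5)."""
        labeled = list(initial_idx)            # must already contain both classes
        for _ in range(n_queries):
            model = LogisticRegression().fit(X_pool[labeled], oracle_labels[labeled])
            candidates = [i for i in range(len(X_pool)) if i not in labeled]
            probs = model.predict_proba(X_pool[candidates])[:, 1]
            pick = candidates[int(np.argmin(np.abs(probs - 0.5)))]
            labeled.append(pick)               # label acquisition happens here
        return model, labeled

    # Toy 1-D pool: points at or below 0 are class 0, points above 0 are class 1.
    X_pool = np.linspace(-3, 3, 61).reshape(-1, 1)
    oracle_labels = (X_pool.ravel() > 0).astype(int)
    model, labeled = uncertainty_sampling(X_pool, oracle_labels, initial_idx=[0, 60])
    print(sorted(labeled))   # queried points cluster near the decision boundary at 0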