Coordinating European Future Internet Research

Computational models for big data algorithms.

_____________________________________________________________________

SUMMARY

A central theme in computer science is the design of efficient algorithms. However, recent experiments show that many standard algorithms become unreliable when applied to big data. This is particularly true for database queries. Unfortunately, existing theoretical tools for analysing and validating algorithms cannot predict whether an algorithm will perform reliably in a big data context.

Indeed, algorithms that are considered tractable in the classical sense are no longer tractable when big data is involved. This project calls for revisiting classical theoretical assumptions regarding computational complexity. The main goal of the project is the development of a formal foundation, and an accompanying model for scaling up computational complexity, which can be used as a framework for studying algorithm tractability in the context of big data.
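
A minimal back-of-envelope sketch (not part of the project itself) can make this concrete. The figures below are illustrative assumptions: a single machine performing roughly 10^9 elementary operations per second, and two textbook cost functions standing in for "classically tractable" algorithms.

    import math

    OPS_PER_SECOND = 1e9  # assumed throughput of one machine (illustrative)

    def estimated_hours(num_operations: float) -> float:
        """Convert an operation count into wall-clock hours at the assumed speed."""
        return num_operations / OPS_PER_SECOND / 3600

    for n in (10**6, 10**9):  # a moderate input and a big-data-scale input
        linearithmic = n * math.log2(n)  # e.g. sorting: ~ n log n operations
        quadratic = n ** 2               # e.g. a naive nested-loop join: ~ n^2 operations
        print(f"n = {n:,}")
        print(f"  n log n : {estimated_hours(linearithmic):12.4f} hours")
        print(f"  n^2     : {estimated_hours(quadratic):12.1f} hours")

On a million records both algorithms finish comfortably, but on a billion records the quadratic one would need on the order of 10^18 operations, roughly three decades of computation under these assumptions, even though it is polynomial-time and therefore "tractable" by the classical criterion.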

_____________________________________________________________________

SOLUTIONS

  • Developing a formal foundation and an accompanying model for scaling up computational complexity, which can be used as a framework for studying algorithm tractability in the context of big data
  • Revisiting classical theoretical assumptions regarding computational complexity

_____________________________________________________________________

LINK: https://www.uantwerpen.be/en/rg/adrem/projects/

_____________________________________________________________________
