Our expertise also allows us to extend our services to the field of Big Data systems. The Big Data paradigm makes it possible to process extraordinarily large volumes of information quickly and efficiently, characteristics that make it very attractive for applications known as High-Performance Data Analytics (HPDA). This kind of application marks the convergence of the HPC and Big Data models: the tasks handle such large volumes of data, and are of such algorithmic complexity, that they require HPC resources.

We design, provide, install, and maintain all the hardware and software necessary to run a data processing cluster. We offer solutions such as:


User Environment:

Hadoop clusters, Hortonworks


Spark, Impala, Python, R, HDFS, MapReduce, YARN


Big Data Cluster:

Storage nodes, gateway nodes, switches, Spark cluster
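To illustrate the MapReduce model included in the stack above, here is a minimal in-process sketch in Python. The function names and sample data are our own illustrative choices; a real job would run distributed across HDFS and YARN rather than in a single process.

```python
# Hypothetical in-process sketch of the MapReduce model: count word
# occurrences with explicit map, shuffle, and reduce phases.
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    # Shuffle: group values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts collected for each word.
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data needs big clusters", "data moves through the cluster"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
print(counts["big"], counts["data"])  # → 2 2
```

On a production cluster the same three phases run in parallel across the storage and compute nodes, with the shuffle moving intermediate pairs over the network.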