Use of distributed computation resources

#1

Is it possible to use the distributed computation resources of Hadoop clusters (e.g. Spark, MLlib, etc.) with C3?


#2

C3 distributes your workload across its own cluster. At the simplest level you can use MapReduce, Batch, and event processing (Analytic).
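For orientation, here is what the map/shuffle/reduce pattern looks like conceptually. This is a plain-Python sketch of the general technique only; it does not use C3's actual MapReduce types or signatures (see the platform tutorial for those):

```python
from collections import defaultdict

# Conceptual map/reduce word count. The framework runs map_phase on
# many workers, groups the emitted pairs by key (shuffle), then runs
# reduce_phase per key -- here everything runs locally for clarity.

def map_phase(doc):
    # Emit one (key, value) pair per word in the document.
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    # Group values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Combine all values emitted for one key.
    return key, sum(values)

docs = ["spark on hadoop", "hadoop cluster"]
pairs = [p for doc in docs for p in map_phase(doc)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
# counts == {"spark": 1, "on": 1, "hadoop": 2, "cluster": 1}
```

On the platform, the map and reduce steps are declared on types and fanned out across cluster nodes rather than run in a local loop.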

Here is a tutorial on using MapReduce, available in the platform documentation:

https://<your_environment>/api/1/shellMV/prod/documentation/topic/c3-plat-map-reduce-tutorial

As of 7.8, Spark is not offered as a hosted service in C3.
