Remove Akka from Spark Core dependencies

In Spark 2.0, Akka has been removed from the Spark Core dependencies. To understand the work required to accomplish this, check out these tickets:

https://issues.apache.org/jira/browse/SPARK-5293

https://issues.apache.org/jira/browse/SPARK-5214

https://issues.apache.org/jira/browse/SPARK-6602

https://issues.apache.org/jira/browse/SPARK-6028

You can also find the design doc for the new pluggable RPC implementation in this ticket:

https://issues.apache.org/jira/browse/SPARK-5124
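
To give a feel for what the design doc proposes, here is a minimal sketch of a pluggable RPC abstraction in Scala. The trait names mirror Spark's internal org.apache.spark.rpc layer (RpcEnv, RpcEndpoint, RpcEndpointRef), but the signatures below are simplified for illustration and are not the exact internal API.

```scala
import scala.concurrent.Future

// A minimal sketch of a pluggable RPC layer. Scheduler code programs against
// these traits; the wire transport (Akka, Netty, ...) lives behind RpcEnv and
// can be swapped out without touching callers.

// A named, addressable message handler (the actor-like abstraction).
trait RpcEndpoint {
  // Handles one-way (fire-and-forget) messages delivered to this endpoint.
  def receive: PartialFunction[Any, Unit]
}

// A reference to an endpoint, possibly on a remote node.
trait RpcEndpointRef {
  def send(message: Any): Unit        // fire-and-forget
  def ask[T](message: Any): Future[T] // request-reply
}

// The pluggable environment: each transport provides its own implementation.
trait RpcEnv {
  def setupEndpoint(name: String, endpoint: RpcEndpoint): RpcEndpointRef
  def shutdown(): Unit
}
```

Because everything above is transport-agnostic, an Akka-backed RpcEnv and a Netty-backed RpcEnv can coexist, which is what made the migration off Akka incremental rather than a big-bang rewrite.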

As a result of the refactoring, a common single-threaded event loop was implemented in the DAGScheduler to replace Akka, and an alternative non-Akka RPC implementation was introduced.
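
Below is a minimal sketch of such a single-threaded event loop. It is modeled on the shape of Spark's internal org.apache.spark.util.EventLoop (which DAGSchedulerEventProcessLoop extends), but simplified here for illustration.

```scala
import java.util.concurrent.LinkedBlockingQueue
import java.util.concurrent.atomic.AtomicBoolean
import scala.util.control.NonFatal

// Simplified single-threaded event loop: callers enqueue events and return
// immediately; a single daemon thread drains the queue and handles events
// one at a time, so handlers never need locks.
abstract class SimpleEventLoop[E](name: String) {
  private val eventQueue = new LinkedBlockingQueue[E]()
  private val stopped = new AtomicBoolean(false)

  private val eventThread = new Thread(name) {
    setDaemon(true)
    override def run(): Unit = {
      try {
        while (!stopped.get) {
          val event = eventQueue.take() // blocks until an event arrives
          try onReceive(event)
          catch { case NonFatal(t) => onError(t) }
        }
      } catch {
        case _: InterruptedException => // interrupted by stop(); exit quietly
      }
    }
  }

  def start(): Unit = eventThread.start()
  def stop(): Unit = { stopped.set(true); eventThread.interrupt() }

  // Enqueue an event without blocking on its processing.
  def post(event: E): Unit = eventQueue.put(event)

  // Subclasses supply the actual event handling.
  protected def onReceive(event: E): Unit
  protected def onError(e: Throwable): Unit
}
```

A DAGScheduler-style subclass would implement onReceive as a pattern match over its event types (JobSubmitted, CompletionEvent, and so on), while任何 number of threads can safely post events concurrently.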

Also, see Reynold Xin’s comment in the following pull request for the reasons behind the refactoring:

https://github.com/apache/spark/pull/4016

“If we ever do that, it’d be for making networking easier (both debugging and deployment), and enabling our users to use Akka (using Akka, especially a different version of it for an app on top of Spark is a mess right now. Spark not depending on Akka will make it easier for applications on top of Spark to use Akka).”

This major undertaking started as early as Spark 1.3, and in Spark 2.0 Akka is finally gone from the Spark Core dependencies. Kudos to Shixiong Zhu for getting it done.
