Six reasons your boss must send you to Spark Summit Europe 2017
2017 Oct 07
It’s almost redundant to say that Apache Spark has become the most prominent open-source big data cluster-computing framework of the last two years. This technology has not only shattered old paradigms of general-purpose distributed data processing, but has also built a vibrant, innovation-driven, and receptive community. This will be my first time at Spark Summit, and for me personally, as a Machine Learning professional, it’s a great opportunity to be part of an event that has grown so dramatically in only two years. Here in Brazil we don’t have a strong tradition of investing in conferences (there are some cultural reasons involved that deserve their own blog post), but here are the six reasons your boss must send you to Spark Summit Europe 2017:
- Accomplish more than the rest: While some of your company’s competitors are busy reworking old frameworks, your company can stay focused on solving real problems that enable scalability for your business using bleeding-edge technologies.
- Stay ahead of the game: You can choose one of these two sentences to put in your resumé: 1) “Worked with Apache Spark, the most prominent open-source cluster-computing framework for Big Data projects”; or 2) “Worked with «put here some obsolete framework that needs a couple of million USD to be deployed, has 70% fewer features than Apache Spark, had its last stable version written 9 years ago, and that the whole market is migrating away from»”. It’s up to you.
- Connect with Apache Spark experts: At Spark Summit you’ll meet real practitioners of Apache Spark, not someone with a marketing pitch (no offense) selling difficulties (e.g. a closed-source, buggy platform) in order to sell facilities (e.g. never-ending consulting until your entire budget is drained, (buggy) plugins, add-ons, etc.). Some of the Spark experts attending are Tim Hunter, Tathagata Das, Sue Ann Hong, and Holden Karau, to name a few.
- Network that matters: I mean people who share an enthusiasm for Apache Spark, open-source frameworks, and technology, and headhunters from good companies that understand that data plays a strong role in business; not some B.S. artist or pseudo-tech-cloaked seller.
- Applied knowledge produces innovation, and innovation produces results: Some cases of Apache Spark being used to innovate and help business include saving more than US$ 3 million using Apache Spark and Machine Learning, managing a 300 TB data workload with Apache Spark, real-time anomaly detection in production systems, changing the game of digital marketing with Apache Spark, and predicting traffic using weather data.
- Opting out will destroy your business and your career: Refusing to acquire and apply new knowledge is the fastest way to destroy your career: you stagnate in old methods, processes, and platforms and become obsolete in a few months. For your company, opting out of innovation, or of learning new methods and technologies that can help scale the business or enhance productivity, is a good way to go out of business in a few years.