Vuclip, a PCCW media company, is a leading premium video-on-demand service for emerging markets, with 9 million subscribers per quarter across nine countries. Through strategic partnerships with over 270 top studios around the world, Vuclip brings blockbuster Hollywood and regional movies, TV shows and music videos in more than 34 languages to its subscribers. Vuclip’s Dynamic Adaptive Transcoding™ provides an unbuffered viewing experience to consumers across all mobile devices and on any network. Vuclip also runs its own OTT service under the brand “VIU”, with apps on both the Google Play and iOS app stores.
Vuclip and its subsidiaries are headquartered in Milpitas, California and have a presence in Mumbai, Delhi, Pune, Dubai, Beijing, Kuala Lumpur, Jakarta and Bangkok.
Our Culture Ignites Innovation (Smart People, Big Dreams, Fiery Passion, Fun and Engaging Journey). Our culture defines who we are, what we do and how we do it. We are very focused on this as we rapidly scale our employee base worldwide. We believe in being one’s authentic self in the workplace, ideas over hierarchy, merit over tenure, results over effort, and having a blast while enjoying the journey.
Java, Spring, API programming, messaging systems (Kafka/RabbitMQ/Kinesis/Pub/Sub), streaming workflows (Spark/Storm/Dataflow/Beam), NoSQL databases (HBase/Cassandra/Couchbase), microservices, RESTful web services, cloud computing (GCP/AWS/Azure)
Key Responsibilities and Accountabilities
- Build data services to integrate streaming/batch data.
- Interact closely with Engineering teams to define appropriate logging requirements and design scalable data processing and access mechanisms.
- Collaborate with ETL and Engineering teams to implement end-to-end data systems that are efficient and cost-effective.
- Drive POCs on Big Data/Cloud platforms for tool evaluation and implementation.
- Work closely with the Data Architect, ETL Leads, and the Reporting & Engineering teams to ensure an effective analytics pipeline that supports BI and end-user requirements.
Skills and Competencies
- B.Tech/M.Tech/MCA.
- 8 to 15 years of software development experience in Java.
- 5+ years of experience with any data technology.
- 2+ years of experience implementing large, scalable systems.
- Should have developed and deployed a microservices project in production.
- Ability to develop applications using public cloud APIs.
- In-depth understanding of, and implementation experience with, at least one technology from each stack below:
- Google BigQuery/Amazon Redshift/Teradata/Greenplum/Netezza
- Managed platforms for service deployments: Google Kubernetes Engine/Google App Engine/AWS Elastic Beanstalk/Amazon Elastic Container Service
- Quick to learn and adopt new technologies; able to develop solutions with long-term maintainability and scalability, making appropriate changes to the design/tools used in the current architecture.
- Understanding of modern data streaming/real-time processing technologies such as Spark/Storm/Beam (Dataflow) would be an added advantage.
- Python knowledge would be an added advantage.