AWS Solution Architect
At xEnabler, we have been delivering digital transformation using new-age smart technologies for more than 10 years. Our clients, located primarily in Australia and New Zealand, are served by our Australian teams. This role is part of our expansion in India.
We have a number of openings in our technical team, focusing on mobile and web development, to be part of our technology leadership team. As part of this expansion, we are looking for an AWS Solution Architect to join our award-winning digital team.
In this role, you will be responsible for:
- Working on a great mix of AWS and Cloud Services
- Working with development teams and product managers to ideate software solutions
- Working on a great mix of greenfield projects and maintaining existing applications
- Using your technical skills to help improve all aspects of xEnabler; from our applications to sharing your knowledge and experience in our knowledge sharing presentations
- Being involved throughout the whole delivery life cycle, from brainstorming sessions and inception to production
- Working within an agile (Scrum) environment and contributing to the continuous improvement of xEnabler
- Helping to create exceptional user experiences for our customers
- Contributing to code reviews & documentation
- Experimenting with lots of new technologies through our great R&D projects. We are always keen to explore something new
- Working alongside an excellent team
- Taking initiative, and training and assisting team members to use and implement the latest technology solutions
- Troubleshooting, debugging and upgrading software
- Creating security and data protection settings
To be successful in this role, you must have:
- An AWS data engineering professional certification
- Experience with AWS and its cloud service offerings: S3, Redshift, EC2, EMR, Lambda, CloudWatch, RDS, Step Functions, Spark Streaming, etc.
- Good knowledge of configuring and working with multi-node clusters and the distributed data processing framework Spark
- Hands-on experience with EMR, Apache Spark and Hadoop technologies
- Experience with Linux/Unix (must have)
- Python, PySpark and Spark SQL
- Experience in crafting scalable data pipelines, sophisticated event processing and analytics components using big data technologies (Spark, Python, Scala, PySpark)
- Specialist knowledge of the AWS RDS, Redshift and DynamoDB databases
- Experience with process orchestration tools such as Apache Airflow and Apache NiFi
- Hands-on knowledge of designing, developing and improving data lakes, constantly evolving with emerging tools and technologies
- Knowledge of ETL tools (SSIS, Talend, etc.)
- Knowledge of automation runbooks (good to have)
- In-depth knowledge of ELT processing
- Strong expertise in data warehousing and dimensional modelling
- Experience in Agile projects, with knowledge of Jira
- Experience handling data ingestion projects in an AWS environment (must have)
- Knowledge of GitHub integration and DevOps (preferred)
- Demonstrable analytical skills and excellent communication skills
- Mid- to senior-level experience: at least 5 to 8 years
It would be nice if you have extra skills such as:
- Knowledge of GCP or PCF
- Knowledge of scripting languages
- DevOps basics and knowledge of microservices
What you will get in return:
- Above-market salary
- Opportunities for continuous growth
- Flexibility to work from home
- The ability to be part of the core leadership team and drive development decisions
- Working with a highly skilled team that prides itself on the excellence of its solutions
- A greenfield setup to match your career path
- Opportunity to work on cutting edge technologies and solutions
If you’re passionate about AWS and building efficient architecture, we would like to meet you. Send us the following to start the conversation:
- Your LinkedIn Profile link
- Your GitHub link
- Any Stack Overflow contributions (a plus)
- Your resume to firstname.lastname@example.org
- Projects you have developed so far, along with links and references
Location: Remote or Pune, India