Role:- Big Data Engineer (AWS services required)
Location:- Franklin, TN (Onsite)
Roles & Responsibilities:-
Technical Skillset (Mandatory)
· Big Data Tools: Hadoop, Cloudera (CDP), PySpark, Kafka, etc.
· Data Pipeline & Integration: Airflow, Kafka, Informatica Cloud
(IICS), PowerCenter
· Data Governance: EDC, AXON
· Stream-Processing Systems: Storm, Spark-Streaming, AWS Kinesis
· Databases: Relational, Amazon DynamoDB, MongoDB, Snowflake,
Postgres
· AWS Cloud Services: EC2, S3, EMR, RDS, Lambda, DMS, Kinesis, Glue
· Object-Oriented Languages: Python, R
Technical Skillset (Optional)
· Visualization Tool: SAP Business Objects, Tableau
Role and Responsibilities
· Enhance our cloud capability by creating and implementing cloud
application patterns
· Develop and implement ways to move apps and workloads to the cloud
· Work closely with business leads and product owners to understand
solution requirements and identify architectural patterns
· Write and develop cloud automation playbooks for managing and
scaling containers, hosts, cloud services, and applications
· Monitor cloud resources for compliance with industry guardrails
and best practices
· Help other development and engineering teams resolve application
to platform integration issues for Platform as a Service (PaaS) and
Infrastructure as a Service (IaaS) services
· Research and propose solutions for AWS data transformation, data
connections, operational frameworks, and application integration
· Work closely with lead architects and engineers to create and
maintain architectural templates and build/operational documents
· Work with DevOps and engineering teams to develop service catalogs
· Based on the customer's technology landscape, arrive at solutions
that are more agile, scalable, and cost-effective
· Provide proactive proposals to the customer on cloud strategy
· Understand customer problem statements and leverage AI/ML
capabilities for solutions
· Build reusable artefacts and accelerators in the cloud
· Collaborate with technical and business users to develop and
maintain enterprise-wide solutions and standards that provide the data
required for metrics and analysis
Project Specific requirement (if any):
· Data-oriented mindset, good communication skills, and an
excellent eye for detail
· Certifications in AWS Cloud and Machine Learning programs
Relevant Experience required
· 8+ years' experience in enterprise solutions and 3+ years in
cloud, preferably AWS
· Proficient understanding of distributed computing principles.
· Strong knowledge and practical experience with AWS
· Strong programming skills with experience in Webhook and API
development using Node.js, Ruby, Python, Shell, and PowerShell
· Familiarity with modern cloud application architecture
· Exposure to cloud managed services and microservices like
Function as a Service, Containers, and managed databases
· Thorough understanding of ML, data analysis, data visualization,
and event-driven architecture
· Familiarity working with large systems
· Experience with setting up load balancers, cloud networks, and
virtual servers
· Capable of working under tight deadlines
· Proficient in solutioning for real-time analytics