
Data Engineer Resume


Alpharetta

SUMMARY

  • 20+ years of extensive experience in pre-sales, analysis, architecture, design, and implementation of large-scale software and services solutions, with an emphasis on Cloud, Big Data, MDM, and data-driven applications
  • Capable of multi-tasking, managing large teams, complex delivery environments and Onsite-Offshore models.
  • Industry experience includes Consulting, Retail, Manufacturing, Financial and Government Sectors.
  • Striving for exemplary success in implementing and delivering solutions encompassing innovative and practical ideas.
  • Responsible for delivering architecture and design solutions for key global initiatives using Cloud, COTS, and custom-developed applications hosted in a variety of environments, with attention to re-use, scalability, and reliability
  • Led teams in the successful implementation of proposed designs and architectures by managing expectations and risks, ensuring appropriate organizational structures, principles, tools, and responsibilities were in place
  • Evaluated solutions to ensure alignment with the broader organization's mission, strategy/objectives, capabilities, and processes using industry-recognized architectural models and roadmaps
  • Created reliable solution plans, including cost estimates and optimized delivery approaches, by working with the business and multi-disciplinary teams and enforcing governance and compliance activities for the solution
  • Acted as a bridge between technical and business audiences during solution planning, development, and deployment
  • Vast experience implementing solutions on the Azure cloud, with limited exposure to AWS and GCP
  • Specialized in implementing Big Data solutions using HDFS, MapReduce, the Hadoop ecosystem (Hive, Sqoop, Flume, Oozie, Pig), Confidential BigInsights, Confidential Streams, Kafka, HBase, Cassandra, graph databases (Titan), ElasticSearch, EC2, Spark, EAI, SOA, and databases (RDBMS, NoSQL)
  • Expertise in implementing solutions on Confidential BigInsights, Hortonworks, and Cloudera Hadoop distributions
  • Experienced in executing data mapping exercises, building data models, and interfacing with other integration teams
  • Implemented Big Data solutions using EC2, Hadoop, the Hadoop ecosystem, and NoSQL databases
  • Guided the client and other teams in implementing Big Data and MDM solutions in a highly efficient fashion using best practices
  • Experience developing SOA applications using ESB, Web services (REST, SOAP, JAXP, SAAJ, UDDI, WSDL), JCA/JMS, and ORM
  • Skilled team leader with excellent organizational, interpersonal and motivational skills
  • Conducted relevant proofs of concept and proofs of technology
  • Technology enthusiast and avid blog reader; helped create an MDM & Big Data whitepaper for Lockheed Martin
  • ITIL, Confidential Big Data Technical Mastery, Confidential InfoSphere BigInsights Technical Mastery and SCJP Certified

PROFESSIONAL EXPERIENCE

Confidential, Alpharetta

Data Engineer

Responsibilities:

  • Defined EY's next generation of products, including the road map and platform architecture
  • Defined the platform's suite of tools, frameworks, and services to empower managed-service teams to deliver production-grade solutions
  • Evaluated and proposed the latest cloud computing and data storage technologies, business drivers, emerging computing trends, and deployment options
  • Worked with the business to understand and solve complex problems by presenting various solution options and technical concepts in a comprehensible manner
  • Implemented Agile and DevSecOps methodologies to build efficiency
  • Worked with vendors to resolve infrastructure setup and performance issues
  • Drove the data mapping and data modeling exercises with the stakeholders
  • Guided others in resolving complex issues in data architecture and solved complex, escalated aspects of projects
  • Broad understanding of EY technology, including service offerings, technical standards and policies, technical and business strategies, and organizational structure
  • Experienced with authentication (SAML/OAuth), MFA, and RBAC (e.g., Ping)
  • Knowledge of cloud security controls, including tenant isolation, encryption at rest and in transit, key management, vulnerability assessments, and application firewalls
  • Provide coaching and support to development teams during blueprinting and implementation phases
  • Managed the end-to-end lifecycle of projects in a solution architect capacity
  • Experienced with mission critical technology components with DR capabilities
  • Experienced with multi-geography, multi-tier service design and management
  • Extensive experience in financial management, solution plan development and product cost estimation
  • Working on Proof of concepts / technologies and providing benchmarks
  • Excellent project management, collaboration, interpersonal and communication skills
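The RBAC pattern noted in the responsibilities above can be sketched in a few lines of Python; the role and permission names here are purely illustrative assumptions, not EY's actual access model:

```python
# Minimal RBAC sketch: roles map to permission sets, and a user's
# effective permissions are the union over their assigned roles.
# All role and permission names below are hypothetical.
ROLE_PERMISSIONS = {
    "data_engineer": {"read_lake", "write_lake"},
    "analyst": {"read_lake", "run_reports"},
    "admin": {"read_lake", "write_lake", "run_reports", "manage_keys"},
}

def effective_permissions(roles):
    """Union of the permissions granted by each assigned role."""
    perms = set()
    for role in roles:
        perms |= ROLE_PERMISSIONS.get(role, set())
    return perms

def is_authorized(roles, permission):
    """True if any assigned role grants the requested permission."""
    return permission in effective_permissions(roles)
```

In a real deployment the role assignments would come from the identity provider (e.g., SAML/OAuth claims) rather than a static dictionary.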

Environment: Operating System: Windows 10, Linux; Databases: MS SQL Server, Databricks Delta Lake, ADLS Gen2, Blob Storage, Hive, HBase, Big SQL, Cosmos DB; Software: Azure IaaS/PaaS/SaaS services, Event Hubs, IBM BigInsights 4.2 (Hortonworks), Hadoop 2.6, MapReduce, HDFS, Sqoop, Oozie, Flume, Kafka, Avro, ElasticSearch, Kibana, Spark, Java 1.7, Python, Scala, XML

Confidential, Atlanta

Big Data Solutions Architect/Engineer

Responsibilities:

  • Defined the Big Data road map and architecture for the D2C MAIN platform
  • Defined the Big Data frameworks, access patterns, and techniques
  • Worked with vendors to resolve infrastructure setup issues
  • Drove the data mapping and data modeling exercises with the stakeholders
  • Building highly scalable big data solutions using Lambda Architecture and complex event processing
  • Evaluated the right fit of Big Data tools for the use cases
  • Analyzed and provided valuable insights on complex data sets
  • Building standards, naming conventions and best practices for Big data platform
  • Built generic frameworks for data ingestion (File, Batch, Events) to Hadoop
  • Defined access patterns for data ingestion, storage, access, analysis, and visualization
  • Design and development of Flume Agents, Oozie workflow, Coordinator schedulers
  • Design and development of Hive UDF and Flume Interceptor in Java
  • Worked with the visualization team on the right access approach and built interactive dashboards on Spotfire and Kibana
  • Worked on proofs of concept / technology and provided benchmarks
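The Lambda Architecture bullet above combines a periodically recomputed batch view with a real-time speed layer. A minimal sketch of the serving-layer merge, assuming simple counters keyed by entity (names are hypothetical, not the D2C platform's actual schema):

```python
def merge_views(batch_view, speed_view):
    """Serving-layer merge for a Lambda Architecture: the batch view holds
    totals recomputed from the master dataset (e.g., a nightly MapReduce
    job), while the speed view holds increments seen since that run."""
    merged = dict(batch_view)
    for key, delta in speed_view.items():
        merged[key] = merged.get(key, 0) + delta
    return merged

batch = {"user_a": 100, "user_b": 40}   # from the batch layer
speed = {"user_b": 3, "user_c": 1}      # from the streaming layer
result = merge_views(batch, speed)
# result["user_b"] == 43
```

When the next batch run lands, the speed view is reset, so the query path always sees batch totals plus any not-yet-absorbed real-time deltas.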

Environment: Operating System: Windows 7, Linux; Databases: Oracle, Teradata, DB2, Hive, HBase, Big SQL, SQL Server; Software: IBM BigInsights 4.0, Hadoop 2.6, MapReduce, HDFS, StreamSets, Sqoop, Oozie, Flume, Kafka, Avro, ElasticSearch, Kibana, Spark, Java 1.7, Python, Scala, XML, Tibco EMS, Spotfire, Control

Confidential, Atlanta

Big Data Solutions Architect/Engineer

Responsibilities:

  • Translating functional requirements into Technical requirements
  • Provided Big Data solution architecture and created high-level and low-level design documents
  • Leveraging MapReduce framework to build the application.
  • Drove the data mapping and data modeling exercises with the stakeholders
  • Involved in designing Hive regular tables and partitioned tables.
  • Developed Hive DDL’s and DML’s to populate data within cluster and across clusters
  • Developed shell script to pull data from HDFS and apply the incremental and full load to the Hive tables.
  • Pushed the data to Windows mount location for Tableau to import it for reporting.
  • Developed shell scripts to load data from HDFS to partitioned tables.
  • Wrote Sqoop scripts to import/export data between HDFS and RDBMS databases
  • Developed a data access layer using the Hector API to insert and retrieve records from Cassandra
  • Developed Hive queries to perform a duplicate check on incoming files before loading the data into Hive tables
  • Working on other Big Data initiatives for business customers in generating pricing analytics/reporting on Hadoop environment
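The duplicate check on incoming files described above was done in Hive; the same idea can be sketched in standalone Python using content checksums, so reruns skip files whose data has already been loaded (file names and the checksum choice are illustrative assumptions):

```python
import hashlib

def file_digest(path):
    """MD5 of a file's contents, read in chunks to handle large files."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def new_files(paths, seen_digests):
    """Return only files whose content hash has not been loaded before,
    recording each new digest so a rerun skips already-loaded data."""
    fresh = []
    for path in paths:
        digest = file_digest(path)
        if digest not in seen_digests:
            seen_digests.add(digest)
            fresh.append(path)
    return fresh
```

In a cluster setting the `seen_digests` set would live in a durable store (e.g., a Hive control table) rather than in memory.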

Environment: Operating System: Windows 7, Linux; Databases: Teradata, DB2, Hive, Oracle, SQL Server; Software: Hadoop 1.3/2.1, MapReduce, HDFS, Sqoop, Oozie, Bizlink, DataStax, Cassandra, Titan, ElasticSearch, Java 1.7, XML, Tableau

Confidential, Minneapolis

Solutions Architect

Responsibilities:

  • Translating functional requirements into Technical requirements
  • Created Architecture, Design, Implementation, Monitoring and Transition documents
  • Scoped and designed an up-sell and cross-sell products solution
  • Drove the data mapping and data modeling exercises with the stakeholders
  • Analyzed and migrated data while defining the data synchronization approach
  • Developed components using Streams Standard and Specialized toolkits
  • Built Data warehousing and analytics solution on HDFS/ BigInsights
  • Design and Development of read services on HBase to improve ROI
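A common data synchronization approach of the kind referenced above is watermark-based incremental sync: pull only rows modified since the last run, then advance the watermark. A minimal sketch under that assumption (field names are hypothetical):

```python
def sync_increment(source_rows, last_watermark):
    """Return the rows modified after the last sync watermark, along with
    the new watermark to persist for the next run. Timestamps here are
    illustrative integers; real systems use modification timestamps."""
    fresh = [r for r in source_rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in fresh), default=last_watermark)
    return fresh, new_watermark
```

Persisting the returned watermark between runs makes the sync restartable without re-copying already-migrated data.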
Environment: Operating System: Windows 7, Linux; Databases: DB2, HBase; Software: IBM InfoSphere BigInsights v2.1, InfoSphere Streams v3.0, HDFS, Java 1.6, Ant, JMeter, XML

Confidential, Atlanta

IT Consultant

Responsibilities:

  • Responsible for the design, implementation, and delivery of high-level and low-level design documents
  • Maintenance and enhancement support of CCMS 4
  • Led the refactoring project to implement recommendations and improve application quality with regard to the adoption of standards, best practices, and patterns
  • Feasibility and investigation of any additional features requested by the business customers.
  • Design, development and testing of Client notifications Adapter.
  • Monitoring and support of WMQ queues and messages.
  • Investigate, troubleshoot and write cleanup SQL in fixing the bad data.
  • Running and monitoring the daily MDM batch files.
  • Wrote functional and regression test cases in Rational Service Tester and got them signed off by the stakeholders
  • Responsible for managing the builds and maintenance support.
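A cleanup-SQL pass of the kind described above can be illustrated with an in-memory database; the table and column names are hypothetical, and the original work was against DB2, not SQLite:

```python
import sqlite3

# Illustrative cleanup: trim stray whitespace and null out sentinel
# values that upstream feeds sometimes write in place of real data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE party (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
conn.executemany(
    "INSERT INTO party (name, email) VALUES (?, ?)",
    [("  Alice  ", "alice@example.com"), ("Bob", "N/A")],
)

# Cleanup SQL, of the kind written while troubleshooting bad data
conn.execute("UPDATE party SET name = TRIM(name)")
conn.execute("UPDATE party SET email = NULL WHERE email IN ('N/A', '')")
conn.commit()

rows = conn.execute("SELECT name, email FROM party ORDER BY id").fetchall()
# rows == [("Alice", "alice@example.com"), ("Bob", None)]
```

In production such fixes would be run inside a transaction after a SELECT confirming exactly which rows are affected.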

Environment: Operating System: Windows 7, Linux; Databases: DB2, HBase; Software: IBM InfoSphere BigInsights v2.1, InfoSphere Streams v3.0, HDFS, Java 1.6, Ant, JMeter, XML
