
Data Architect/Engineer Resume

SUMMARY:

  • 13+ years of experience across the software development lifecycle and data architecture, with extensive expertise in Big Data, data warehousing, and dimensional modeling.
  • As a manager, demonstrated ability to communicate complex technical concepts succinctly to non-technical colleagues and to understand and manage interdependencies between all facets of a project.
  • Ability to lead client presentations; familiarity with the end-to-end sales process; demonstrated ability to drive engagements to successful closure and to mentor others.
  • Design, architect, and develop solutions that meet and exceed business requirements, leveraging big data technology (Azure Cloud, Hortonworks Hadoop) to ingest, process, and analyze large, disparate data sets.
  • Unify, enrich, and analyze healthcare member and clinical data to derive insights and opportunities, leveraging in-house data platforms where possible and recommending and building new data platforms/solutions where required. Experienced in organizing, aggregating, querying, and analyzing large datasets.
  • Research, experiment with, and apply leading Big Data technologies (Hadoop, Spark, Kafka, Hive, HBase, Azure) across cloud, on-premise, and hybrid hosting to architect, implement, and test data processing pipelines and data mining/data science algorithms in a variety of hosted settings (AWS, Azure, client technology stacks, and Kaiser's own clusters). Provide thorough documentation and operating guidance for users of all levels.
  • Excellent verbal and written communication and interpersonal skills, with the ability to communicate effectively with both business and technical teams. Experienced in gathering complex business requirements and identifying data needs.
  • Extensive experience in the design and development of relational databases and data warehouses. Advanced proficiency in SQL development; strong knowledge of and expertise in Python, Spark, and shell scripting. Extensive ETL development experience with large-scale databases and big data systems such as Azure Cloud, Hadoop, Informatica BDM, and AWS Redshift.
  • Active participant in and advocate of Agile/Scrum practices, ensuring team health and continuous process improvement.

BIG DATA TECHNOLOGIES:

  • Azure Cloud
  • Hortonworks Hadoop
  • Spark and SparkFlow (in-house ETL tool)
  • Hive
  • HBase
  • Kafka
  • Nifi
  • Sqoop
  • Python and R Programming
  • Knowledge of Amazon AWS: S3, EC2, CloudWatch, Redshift, and DynamoDB.
  • ETL: Informatica Cloud/BDM, SAP BODS, web services APIs.
  • Experience in data wrangling, data tagging, data quality, data visualization, and data security using Trifacta, Waterline, Power BI, and Ranger.
  • Oracle, SQL Server, Teradata, MySQL
  • Certified in advanced SQL and in the Retail CPG & Logistics and Healthcare domains.
  • SDLC using Waterfall and Agile/Scrum.

PROFESSIONAL EXPERIENCE:

Data Architect/Engineer

Confidential

Responsibilities:

  • Created data pipelines using Python scripts and SQL queries; worked with data scientists, providing the datasets they needed to build data science models.
  • Successfully created predictive and behavioral models (Python and R) using clinical datasets.
  • Practiced software development under Waterfall and Agile/Scrum frameworks; comfortable running scrum calls, creating features/tickets in Jira, and facilitating retrospectives on a Trello board.
  • Ingested data from RDBMS sources and flat files into the Azure Data Lake Store (ADLS) raw zone (HBase) using an in-house framework component built on big data technologies.
  • Built data pipelines using the SparkFlow ETL tool to transform data (applying data quality and MDM rules) from the raw zone to the refined zone; a minimal PySpark sketch of this pattern follows this list.
  • Performance-tuned Hive tables created in the refined zone.
  • Used the Trifacta data wrangling tool to derive data from different source systems and apply transformation rules to build target datasets.
  • Assessed tools and technologies for the Azure cloud platform and created POCs to integrate them on Azure.
  • Defined solutions for best practices, data solution patterns, and the ABC framework.
  • Built the data access and use architecture for Azure cloud datasets.
  • Created business rules and mapping documents describing how to load data into target systems.
  • Managed a team of 10+ members across onsite and offshore.
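The raw-to-refined flow above was built with in-house tooling (SparkFlow), so the sketch below approximates it in plain PySpark. All table and column names (raw.member_raw, refined.member_refined, member_id, and so on) are hypothetical stand-ins, not the actual project schema.

```python
# Minimal PySpark sketch of the raw-to-refined pattern described above.
# SparkFlow is an in-house tool, so plain PySpark stands in for it here.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("raw-to-refined")
    .enableHiveSupport()
    .getOrCreate()
)

# Read from the raw zone (modeled here as a Hive table over ADLS).
raw = spark.table("raw.member_raw")

# Apply simple data-quality rules: trim strings, drop records missing the
# business key, deduplicate, standardize a date, and stamp the load date.
refined = (
    raw.withColumn("member_id", F.trim(F.col("member_id")))
       .filter(F.col("member_id").isNotNull())
       .dropDuplicates(["member_id"])
       .withColumn("birth_date", F.to_date(F.col("birth_date"), "yyyy-MM-dd"))
       .withColumn("load_date", F.current_date())
)

# Write to the refined zone as a partitioned ORC Hive table; partitioning by
# load date is one common Hive performance-tuning step.
(
    refined.write
    .mode("overwrite")
    .format("orc")
    .partitionBy("load_date")
    .saveAsTable("refined.member_refined")
)
```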

Data Modeler/Architect

Confidential

Responsibilities:

  • Prepared impact analysis on the existing application and created detailed requirement documents based on business needs.
  • Updated the existing dimensional data model using the ERwin tool to accommodate new business requirements, and created the database from the data model; a sketch of applying such a model change follows this list.
  • Provided the application team with the necessary details on extract, transform, and load.
  • Managed an application team of 5-6 members.
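As an illustration of the model-update step above, the sketch below applies a hypothetical ERwin model change (a new dimension attribute) to the database from Python. The dim_customer table, loyalty_tier column, and pyodbc DSN are assumptions for the example, not details from the engagement.

```python
# Hedged sketch: push a dimensional-model change into the database as DDL.
import pyodbc

DDL_STATEMENTS = [
    # New attribute added to the customer dimension in the model (hypothetical).
    "ALTER TABLE dim_customer ADD loyalty_tier VARCHAR(20) NULL",
    # Backfill a default so downstream loads see a consistent value.
    "UPDATE dim_customer SET loyalty_tier = 'UNKNOWN' WHERE loyalty_tier IS NULL",
]

conn = pyodbc.connect("DSN=edw")  # assumes an ODBC DSN named 'edw'
try:
    cur = conn.cursor()
    for ddl in DDL_STATEMENTS:
        cur.execute(ddl)
    conn.commit()
finally:
    conn.close()
```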

ETL Architect

Confidential

Responsibilities:

  • Worked with the business team on requirement gathering, creating the application architecture and a star-schema dimensional model across SAP systems.
  • Prepared architectural diagrams and technical/functional documents describing how data is extracted from OLTP systems, transformed using SAP BODS 4.0, and loaded into SAP BW and Oracle databases. Created user acceptance criteria for each use case.
  • Prepared a mapping document for each dataset, along with the business rules for transforming the data.
  • Created complex mappings and performed integration testing together with components developed by other team members; compared the output of the new mappings with SAP PI output (a reconciliation sketch follows this list).
  • Managed 4 team members, allocating work on a daily basis.
  • Created metrics reports on the development process and circulated them to senior management and the quality team.
  • Implemented data quality/governance using the SAP Data Steward tool.
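The comparison against SAP PI output described above amounts to a dataset reconciliation. The sketch below shows one minimal way to do it with pandas, under the assumption of CSV extracts, an order_id business key, and amount/status comparison columns, all of which are illustrative.

```python
# Illustrative reconciliation of new-mapping output against a SAP PI baseline.
import pandas as pd

pi_out = pd.read_csv("sap_pi_output.csv", dtype=str)        # hypothetical extract
bods_out = pd.read_csv("bods_mapping_output.csv", dtype=str)  # hypothetical extract

# Row-count check first: the cheapest signal that something diverged.
print(f"PI rows: {len(pi_out)}, BODS rows: {len(bods_out)}")

# Full outer join on the business key, flagging rows present on only one side.
merged = pi_out.merge(
    bods_out, on="order_id", how="outer", suffixes=("_pi", "_bods"),
    indicator=True,
)
only_one_side = merged[merged["_merge"] != "both"]
print(f"Keys in only one output: {len(only_one_side)}")

# Column-by-column comparison for keys present on both sides.
both = merged[merged["_merge"] == "both"]
for col in ("amount", "status"):  # illustrative comparison columns
    diffs = both[both[f"{col}_pi"] != both[f"{col}_bods"]]
    print(f"{col}: {len(diffs)} differing rows")
```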

ETL Lead

Confidential

Responsibilities:

  • Converted functional/requirement documents into technical design documents; worked with architects to build the dimensional model for the target system.
  • Created mapping documents for transforming data according to business rules, and test scripts matched to the requirement document, capturing the test results.
  • Created mappings using Informatica PowerCenter 9.x and loaded data into the respective data warehouse systems, such as Netezza and SQL Server databases.
  • Created BO universes for different modules, removing join loops through the use of aliases and contexts (an alias example follows this list).
  • Created ad hoc BO reports using BusinessObjects features such as drilling, filters, ranking, sections, graphs, and breaks.
  • Managed a team of 5-6 members, reviewing their work and taking responsibility for delivering the product.
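To make the alias technique above concrete: when one table plays two roles, a BO universe join loop is broken by joining the table twice under different aliases rather than once through a cycle. The SQL below, held in a Python constant for consistency with the other sketches, uses hypothetical orders/customer tables; it is a generic illustration of the pattern, not the project's universe.

```python
# Loop-resolution pattern: the CUSTOMER table plays two roles (ship-to and
# bill-to), so it is joined twice under different aliases instead of forming
# a join loop. Table and column names are illustrative.
LOOP_FREE_SQL = """
SELECT
    o.order_id,
    ship.customer_name AS ship_to_customer,
    bill.customer_name AS bill_to_customer
FROM orders o
JOIN customer ship ON o.ship_to_id = ship.customer_id   -- alias 1
JOIN customer bill ON o.bill_to_id = bill.customer_id   -- alias 2
"""
```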

ETL Integration Specialist

Confidential

Responsibilities:

  • Created and enhanced ETL code using Unix and Oracle based on business needs; worked with the DBA on performance improvement of the SQL queries developed (a plan-capture sketch follows this list).
  • Resolved tickets within SLA, interacting with business users to understand issues and fix them ahead of the deadline.
  • Participated in requirement gathering and obtained technical document sign-off from the business; worked with the offshore team to ensure the application developed was aligned with business needs.
  • Led a team of 5 members, assigning day-to-day tasks and goals.
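A hedged sketch of the SQL performance-review step above: capture an Oracle execution plan from Python before involving the DBA. The connection string, table, and query are placeholders, and the cx_Oracle driver is an assumption about the environment.

```python
# Capture an Oracle execution plan for review during SQL tuning.
import cx_Oracle

QUERY = "SELECT * FROM orders WHERE order_date >= SYSDATE - 7"  # placeholder query

conn = cx_Oracle.connect("user/password@etl-db")  # placeholder credentials/alias
cur = conn.cursor()

# EXPLAIN PLAN populates PLAN_TABLE; DBMS_XPLAN renders it readably.
cur.execute("EXPLAIN PLAN FOR " + QUERY)
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur:
    print(line)

conn.close()
```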

Application Developer/Production Support Analyst

Confidential

Responsibilities:

  • Developed and tested code based on the requirements provided in the technical design document.
  • Resolved tickets by priority and performed periodic deployments of bug fixes.
  • Prepared impact analysis and root cause analysis documents for high-priority tickets and worked on fixing the underlying issues.
  • Sent periodic updates to the business team on the status of scheduled jobs in the production environment; a minimal status-report sketch follows this list.
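The periodic status update described above can be as simple as reading a job-control table and mailing a summary. The sketch below assumes a hypothetical job_run_log table, ODBC DSN, mail host, and addresses; none of these come from the original environment.

```python
# Illustrative daily job-status report: query a control table, email a summary.
import smtplib
from email.message import EmailMessage

import pyodbc

conn = pyodbc.connect("DSN=prod_jobs")  # placeholder DSN
cur = conn.cursor()
cur.execute(
    "SELECT job_name, status, end_time FROM job_run_log "
    "WHERE run_date = CURRENT_DATE"  # hypothetical control table
)
lines = [f"{name}: {status} (ended {end})" for name, status, end in cur]
conn.close()

msg = EmailMessage()
msg["Subject"] = "Daily production job status"
msg["From"] = "etl-support@example.com"      # placeholder addresses
msg["To"] = "business-team@example.com"
msg.set_content("\n".join(lines) or "No jobs ran today.")

with smtplib.SMTP("mailhost.example.com") as smtp:  # placeholder mail host
    smtp.send_message(msg)
```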
