Big Data Developer Resume
OBJECTIVE
- To provide quality solutions to complex business problems and deliver solutions for new systems and services, with the intention of building a career in a leading corporate high-tech environment with committed and dedicated people, where I can fully explore and realize my potential.
SUMMARY
- More than 10 years of experience delivering end-to-end BI solutions.
- Experience working on the AWS platform.
- Experience in various project phases, including business requirements gathering, source-to-target data mapping, ETL development, testing, and production deployment.
- Experience working with Big Data and Hadoop technologies.
- Excellent SQL programming skills.
- Experience working with teams across different time zones and leading offshore teams.
- Excellent understanding of data management principles, including Star Schema and Snowflake Schema design.
- Expertise working across multiple domains: Banking, Telecom, Transit, Health Care, and Retail.
- Ability to deliver tasks within project deadlines and to contribute to project planning and scheduling.
- Experience working in relational database design, development and testing.
- Full Software Development Life Cycle (SDLC) experience including Analysis, Design and review of Business and Software Requirement Specifications.
- Excellent working knowledge of Data Modeling, Data Integration, and Data Mining.
- Expertise in handling large data volumes, with the ability to profile, scrub, analyze, integrate, and validate data from multiple source systems.
- Experience working with Agile methodologies.
TECHNICAL SKILLS
Data Warehouse: Big Data, Hadoop, Oracle, Teradata, IBM Netezza and Microsoft SQL Server
ETL Tools: SAS DI Studio, Microsoft SSIS, Informatica, Ab Initio, Pentaho, IBM DataStage
Data Modeling Tools: Erwin Data Modeler and Power Designer
Reporting Tools: SAS EG, Tableau, Microsoft SSRS, SAP BOBJ, MicroStrategy and Power BI
Operating System: Windows, UNIX, LINUX, Macintosh, IBM Mainframe
Languages: SQL, PL/SQL, SAS, Python, Hive, HTML, XML
Testing Tools: Microsoft Test Manager, HP Quality Center
CRM Technologies: Salesforce
Cloud Environments: AWS and Azure
PROFESSIONAL EXPERIENCE
Confidential
Big Data Developer
Responsibilities:
- Analyze, design, develop, and test ETL processes based on requirements from business and technical users.
- Work as a team lead to coordinate requirements gathering with different business teams, then delegate and lead development activities and manage resources across onshore and offshore teams.
- Analyze data from different source systems and perform gap analysis to determine which data already exists in the target database.
- Present findings on data trends and gaps to senior management to support informed data migration decisions.
- Perform data ingestion tasks to move data from various sources into AWS S3 buckets (see the ingestion sketch after this list).
- Write UNIX scripts to automate copying data from SFTP locations onto the job servers.
- Work on data migration projects that move SAS ETL pipelines to Python within the AWS environment.
- Develop daily, monthly, and quarterly reports with data insights for senior stakeholders using Power BI and Tableau.
- Perform ETL tasks using tools such as SAS, Informatica, and Python.
- Write complex SQL queries for data analysis and for investigating production incidents.
- Maintain and version-control ETL code through GitHub (CI/CD).
- Perform regular data profiling and data quality checks with the Quality Assurance team.
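
A minimal sketch of the S3 ingestion step described above, written in Python with boto3; the bucket, prefix, and landing-directory names are hypothetical placeholders rather than the actual project values.

```python
import os

import boto3  # AWS SDK for Python

# Hypothetical placeholders; the real buckets, prefixes, and landing
# directories are project-specific.
LANDING_DIR = "/data/landing"
BUCKET = "example-ingest-bucket"
PREFIX = "raw/daily"


def ingest_to_s3(landing_dir: str, bucket: str, prefix: str) -> None:
    """Upload every regular file in the landing directory to S3."""
    s3 = boto3.client("s3")
    for name in os.listdir(landing_dir):
        path = os.path.join(landing_dir, name)
        if os.path.isfile(path):
            key = f"{prefix}/{name}"
            s3.upload_file(path, bucket, key)  # handles multipart uploads
            print(f"uploaded {path} -> s3://{bucket}/{key}")


if __name__ == "__main__":
    ingest_to_s3(LANDING_DIR, BUCKET, PREFIX)
```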
Confidential
Big Data Developer
Responsibilities:
- Recreated existing ETL jobs and reports, migrating them from SAS version 4.2 to version 6.3.
- Developed analytical methods to categorize new customer types requiring enhanced due diligence as defined in the CDD Standard.
- Created test cases and scripts to validate the ETL code and verify data accuracy from legacy systems to the new target system (see the reconciliation sketch after this list).
- Created reports for the AML (Anti-Money Laundering), CDD (Customer Due Diligence), KYC (Know Your Customer), and EDD (Enhanced Due Diligence) processes.
- Created and updated documentation to support newly deployed systems.
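
A hedged sketch of the legacy-to-target validation scripts mentioned above: a simple row-count reconciliation between the two systems. The connection objects and table name are placeholders for whatever DB-API drivers the project actually used.

```python
def row_count(conn, table: str) -> int:
    """Count rows in a table over any DB-API 2.0 connection."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")  # internal, trusted table name
    return cur.fetchone()[0]


def reconcile(legacy_conn, target_conn, table: str) -> bool:
    """Flag any row-count mismatch between legacy and target copies of a table."""
    legacy = row_count(legacy_conn, table)
    target = row_count(target_conn, table)
    if legacy != target:
        print(f"{table}: MISMATCH legacy={legacy} target={target}")
        return False
    print(f"{table}: OK ({legacy} rows)")
    return True
```

In practice the same pattern extends to column-level checks such as sums, min/max values, and distinct counts on key fields.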
Confidential
Big Data Developer
Responsibilities:
- Reverse-engineered legacy SAS code into actionable requirements.
- Designed and developed complex ETL structures for transformation of data sources into data warehouses.
- Created reports for the AML, CDD, and EDD processes.
- Created ETL jobs for custom AML scenarios.
- Conducted code and design reviews to refine and approve design specifications.
- Troubleshot and resolved issues in ETL applications and underlying data.
- Performed in-depth data analysis across the source systems supporting Credit Risk Models.
- Implemented processes for extracting and loading data into data warehouses and data marts.
- Created and documented mapping logic to transform source data into usable Credit Risk Model data sets (see the mapping sketch after this list).
- Gathered requirements and produced reports under IFRS 9.
- Maintained the macroeconomic application and its load schedules.
- Utilized JIRA to manage work and releases.
- Worked with MicroStrategy technical support to troubleshoot the dynamic display of prompts in web mode and other report-related issues.
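
A minimal illustration of the source-to-target mapping logic referenced above; the field names are invented examples, not the actual Credit Risk Model layout.

```python
# Invented example fields; the documented mappings covered the real
# source systems feeding the Credit Risk Model data sets.
SOURCE_TO_TARGET = {
    "acct_no": "account_id",
    "cust_nm": "customer_name",
    "bal_amt": "outstanding_balance",
}


def apply_mapping(source_row: dict) -> dict:
    """Rename source fields to target names; unmapped fields are dropped."""
    return {
        target: source_row[source]
        for source, target in SOURCE_TO_TARGET.items()
        if source in source_row
    }


# {"acct_no": "123", "cust_nm": "A. Smith", "bal_amt": 250.0}
# -> {"account_id": "123", "customer_name": "A. Smith",
#     "outstanding_balance": 250.0}
```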