
IT Application Developer-Specialist Resume


Phoenix, AZ

PROFESSIONAL SUMMARY:

  • Over nine years of IT experience across all stages of the SDLC, including Design, Development, Maintenance, and Testing, in the Finance, Retail, and Insurance sectors.
  • Experience in enterprise RDBMS data modeling techniques using Erwin: dimensional modeling (Star and Snowflake schemas) and Conceptual, Logical, and Physical data models.
  • Experience with Informatica tools including PowerCenter, Data Quality (IDQ), and Master Data Management (MDM).
  • Experience in data analysis, cleansing, standardization, visualization, integration, and reporting using BI tools.
  • Experience in data ingestion into big-data platforms using Hadoop, Hive, NoSQL, and MapReduce.
  • Experience with various data sources: Big Data, Teradata, DB2, Oracle, and Mainframe.
  • Experience in writing and maintaining SQL scripts, PL/SQL, and stored procedures.
  • Experience in writing and managing UNIX shell, C, Python, and Perl scripts.
  • Experience in scheduling ETL jobs with the Control-M scheduler.
  • Extensive work on ETL performance tuning and debugging.
  • Experience in creating end-to-end ETL design documents for data flows.
  • Excellent team player with strong written, verbal, and interpersonal communication skills.
  • Experience in client/server application development.

CORE PROFESSIONAL STRENGTHS:

  • Data Analytics & Data Reporting
  • Data Governance & Maintenance
  • Data Cleansing, Validation & Standardization
  • Data Visualization Techniques
  • Supporting Implemented BI Solutions
  • Self-Motivated and Determined
  • Success-Oriented and a Natural Leader
  • Data Modeling Techniques
  • Master Data Management
  • ETL Performance Tuning
  • Agile & Waterfall Methodologies
  • Error Handling & Audit Controls
  • Hardworking Team Player
  • Problem Solving

TECHNICAL SKILLS:

RDBMS databases: Oracle 9i/10g, DB2, SQL Server 2008, Teradata, Netezza.

Big data tools: Hadoop, NoSQL, MapReduce, Oozie, Spark, Hive, Pig

Database modeling: Oracle SQL Developer Data Modeler, Toad Data Modeler

ETL tools: Informatica - PowerCenter 9.x, PowerExchange, IDQ, MDM; Control-M scheduler

Reporting tools: MicroStrategy, WebFOCUS, Tableau, and Excel reporting

Software languages: SQL, MySQL, PL/SQL, JavaScript, UNIX scripting, C, C++, HTML, and Python

WORK EXPERIENCE:

Confidential, Phoenix, AZ

IT Application Developer-Specialist

Responsibilities:

  • Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality.
  • Identify, analyze and interpret trends or patterns in complex data sets.
  • To maintain data consistency and quality, worked with the Data Governance, Data Profiling, and Data Quality teams to manage master data from all business units as well as IT, ensuring data quality standards across the enterprise.
  • Create ETL processes and reports using the Informatica suite (PowerCenter, IDQ, Analyst, MDM) and other integration and reporting tools such as SSIS and SSRS.
  • Perform data management projects and fulfill ad hoc requests according to user specifications using data management tools such as Toad, Excel, and SQL.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked on data quality issues.
  • Worked on the technical documentation process, preparing High-Level Design documents, Low-Level Design documents, and source-to-target mapping sheets.
  • Developed complex mappings using Informatica to load Dimension and Fact tables per data modeling techniques.
  • Prepared, reviewed, and executed unit test cases and documented the results.
  • Extensively used Informatica PowerCenter objects: source and target designers, transformations, sessions, worklets, and workflows.
  • Developed and tested ETL mappings based on use cases and tuned them for better performance.
  • Wrote and maintained UNIX shell scripts and SQL scripts.
  • Helped the SIT team and business users during UAT; resolved issues and documented root causes and resolutions in QA.

Environment: Informatica 9.6, Oracle, Perl Scripting, UNIX Scripting, ESP scheduler, Ruby Programming, Business Objects, Informatica Analyst, Informatica IDQ, UCD, Liquibase, IBM InfoSphere
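The back-end data validation described above can be illustrated with a minimal sketch. Table and column names are hypothetical, and sqlite3 stands in for the actual Oracle back end:

```python
import sqlite3

def validate_load(conn, table, key_col):
    """Basic post-load data quality checks: row count, null keys, duplicate keys."""
    cur = conn.cursor()
    row_count = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    null_keys = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {key_col} IS NULL").fetchone()[0]
    dup_keys = cur.execute(
        f"SELECT COUNT(*) FROM (SELECT {key_col} FROM {table} "
        f"GROUP BY {key_col} HAVING COUNT(*) > 1)").fetchone()[0]
    return {"rows": row_count, "null_keys": null_keys, "duplicate_keys": dup_keys}

# Hypothetical target table loaded by an ETL mapping.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_dim (cust_id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customer_dim VALUES (?, ?)",
                 [(1, "Ann"), (2, "Bob"), (2, "Bob"), (None, "Eve")])
print(validate_load(conn, "customer_dim", "cust_id"))
# {'rows': 4, 'null_keys': 1, 'duplicate_keys': 1}
```

In practice such checks run as SQL against the source and target after each load, with any imbalance logged for the data quality team.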

Confidential, Phoenix, AZ

ETL Application Architect

Responsibilities:

  • Design the high level process and the tools required for ingestion, extraction and transformation
  • Involved in ETL, data integration, and migration from heterogeneous source platforms to the data lake (Cornerstone)
  • Develop generic wrapper and automation scripts for maintenance
  • Develop complex Oozie XMLs to schedule HQL batches and Spark jobs
  • Create/implement Hive tables with batch-mode execution and incremental load logic
  • Develop programs to parse the raw data, populate staging tables and store the refined data in partitioned tables
  • Configure event engine nodes for extraction polling via status nodes
  • Configure SFTP profiles and file patterns for file uploads
  • Prepare Ad-hoc Hive queries for data analysis to meet the business requirements.
  • Implement test scripts to support test driven development and continuous integration.
  • Document the solution architecture, security interactions, and detailed designs
  • Manage ETL packages and ensure availability of the application to business users
  • Support the application migration to production and provide support to teams during the post warranty period.

Environment: Hadoop 1.3, Hive 1.12, Spark, Pig, Oozie, Sqoop, Informatica 9.6, Tableau, Magellan
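The incremental load logic mentioned above (refreshing only records newer than the last successful batch) follows a standard watermark pattern. A minimal sketch with a hypothetical record layout and watermark column:

```python
from datetime import date

def incremental_batch(records, last_watermark):
    """Return only records newer than the last successful load,
    plus the new watermark to persist for the next run."""
    new_records = [r for r in records if r["load_date"] > last_watermark]
    new_watermark = max((r["load_date"] for r in new_records),
                        default=last_watermark)
    return new_records, new_watermark

# Hypothetical raw feed; only the last two rows postdate the watermark.
raw = [
    {"id": 1, "load_date": date(2017, 3, 1)},
    {"id": 2, "load_date": date(2017, 3, 2)},
    {"id": 3, "load_date": date(2017, 3, 3)},
]
batch, wm = incremental_batch(raw, date(2017, 3, 1))
print(len(batch), wm)  # 2 2017-03-03
```

In a Hive setting the same idea is expressed as a predicate on a partition column, so each batch rewrites only new partitions rather than the full table.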

Confidential, Phoenix, AZ

ETL Application Architect

Responsibilities:

  • Gathered business requirements and documented the project scope and deliverables.
  • Worked with business users to understand reporting requirements and documented metric calculation details.
  • Worked with technical teams on ETL platform setup and data model design and implementation.
  • Designed ETL mappings to transform raw data into the desired format for business-user reporting based on requirements.
  • Extensively used Informatica PowerCenter objects: source and target designers, transformations, sessions, worklets, and workflows.
  • Developed and tested ETL mappings based on use cases and tuned them for better performance.
  • Wrote and maintained UNIX shell scripts and SQL scripts.
  • Helped the SIT team and business users during UAT; resolved issues and documented root causes and resolutions in QA.
  • Prepared the implementation plan and executed it successfully with no issues.

Environment: UNIX, Windows, Informatica 9.6.1, DB2, Netezza, WebFOCUS, MS Office, IDQ
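Transforming raw rows into report-ready metrics, as described above, typically reduces to grouped aggregation. A minimal sketch with hypothetical sales fields:

```python
from collections import defaultdict

def metric_by_region(raw_rows):
    """Aggregate raw transaction rows into a per-region revenue metric."""
    totals = defaultdict(float)
    for row in raw_rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

# Hypothetical raw feed rows before the reporting transformation.
rows = [{"region": "West", "amount": 120.0},
        {"region": "West", "amount": 80.0},
        {"region": "East", "amount": 50.0}]
print(metric_by_region(rows))  # {'West': 200.0, 'East': 50.0}
```

The equivalent in an ETL mapping is an Aggregator transformation grouped on the reporting dimension.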

Confidential, Phoenix, AZ

Lead ETL Developer

Responsibilities:

  • Worked with business users and analysts to understand processes and requirements and translated them into technical requirements.
  • Worked with the source team to analyze the application data and how it is captured and stored in the back-end application database, and created a data dictionary document.
  • Analyzed data sources and developed plans to perform data cleansing, standardization and matching using Informatica IDQ.
  • Designed, developed, and tested ETL mappings & mapplets for data quality using IDQ and imported to Informatica PowerCenter tool to reuse the Data Quality checkpoint logic which reduced development time.
  • Designed and documented ETL architecture changes and created end-to-end ETL design document and presented to project team for approval.
  • Worked with Informatica PowerCenter tools - Source analyzer, Target designer, Transformation designer, mapping designer, workflow manager and workflow monitor.
  • Designed and developed Informatica mappings and sessions based on business user requirements and business rules to load data from source Xml, flat files into DB2.
  • Worked with various transformations to solve slowly changing dimensional problems using PowerCenter designer tools.
  • Developed and scheduled workflows using task developer, worklet designer and workflow designer in workflow manager and monitored results in workflow monitor.
  • Increased the performance of ETL data load using Informatica performance techniques- tuning at source, transformation, target, session, and workflow levels.
  • Developed file handling and validation scripts using UNIX shell scripts and created PMCMD, UNIX scripts for workflow automation.
  • Scheduled the ETL loads using Control M and configured the dependency jobs.
  • Resolved critical and high priority incidents within time frame.
  • Designed and developed reusable objects in the Informatica shared folder and kept them updated with new requirements and changes.
  • Performed development testing and unit testing at each level of ETL developments.
  • Supported the quality assurance team in testing and validating the ETL objects.
  • Presented the ETL data flow in show-and-tell sessions and made changes based on feedback.

Environment: UNIX, Informatica PowerCenter 9.6.1/9.1, Toad Data Modeler, Oracle, Teradata, Netezza, DB2, XML, WebFOCUS, Control-M, MS Office, Citrix, PMF, Agile methodology, and SharePoint.
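The slowly-changing-dimension handling above is typically the Type 2 pattern: expire the current dimension row when a tracked attribute changes and insert a new current version. A minimal sketch with hypothetical customer fields (in PowerCenter this is built with Lookup, Expression, and Update Strategy transformations):

```python
def scd2_apply(dimension, incoming, key="cust_id", tracked="address"):
    """Type 2 SCD: expire the current row on a tracked-attribute change
    and insert a new current-version row; insert unseen keys as version 1."""
    by_key = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        current = by_key.get(rec[key])
        if current is None:
            dimension.append({**rec, "version": 1, "is_current": True})
        elif current[tracked] != rec[tracked]:
            current["is_current"] = False          # expire old version
            dimension.append({**rec,
                              "version": current["version"] + 1,
                              "is_current": True})  # new current version
    return dimension

dim = [{"cust_id": 1, "address": "12 Oak St", "version": 1, "is_current": True}]
scd2_apply(dim, [{"cust_id": 1, "address": "9 Elm Ave"},
                 {"cust_id": 2, "address": "4 Pine Rd"}])
print(len(dim))  # 3 rows: expired v1, new v2, and the new customer
```

Production implementations also stamp effective-from/effective-to dates on each version; those columns are omitted here for brevity.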

Confidential, Phoenix, AZ

ETL Data Migration Consultant

Responsibilities:

  • Designed and developed ETL data mappings to extract multiple data sources and transform into useful information and load into Data Warehouse and Data Marts.
  • Cleansed, Standardized and eliminated duplicate and inconsistent data. Enhanced Data Integration across enterprise to help multiple business units.
  • Improved existing ETL mapping performance for data extraction process using Informatica standard performance tuning techniques.
  • Designed and developed error handling and audit controls for financial data. Successfully led and implemented multiple projects.
  • Involved in data model design using Third Normal Form; validated Logical and Physical data models and provided suggestions.
  • Extensively used Informatica PowerCenter objects: Source Qualifier, Filter, Expression, Router, Update Strategy, Sorter, Joiner, Normalizer, Lookup, and Aggregator transformations, plus sessions, tasks, worklets, and workflows.
  • Developed data quality check points using IDQ to cleanse, standardize, and match different versions of Customer data in different application sources data.
  • Created Comparison logics, consolidation logics, and duplicate removal logics to standardize the Customer & Employee master data sets using Informatica IDQ.
  • Designed and developed audit controls for regulatory reporting and to support Production issue resolutions.
  • Increased performance of Data loads using techniques like removing bottlenecks, Optimizing Source, Target, Transformations, Mappings, Sessions, and using pipeline partitions.
  • Wrote and maintained UNIX scripts and SQL programs.
  • Designed and developed historical data purge process for individual markets based on Business requirements.
  • Involved in creating MicroStrategy reports, dashboards, drill maps, custom groups & consolidations, and user security filters.
  • Worked with Admin teams to configure connection strings, ControlM job scheduler, Server setup, and INFA & UNIX folder creations.
  • Led and managed a four-member ETL tech team supporting development and testing through end-to-end project completion.
  • Supported User Acceptance Testing (UAT) by providing documents, resolving bug fixes, and updating QA with root causes and resolutions.
  • Prepared production support documents, gave a project overview presentation, and explained critical tasks that need monitoring.

Environment: UNIX, Windows, Toad Data Modeler, Informatica PowerCenter 9.1, Informatica IDQ 9.1, Pega, Teradata, DB2, XML, Flat file, Informatica Administrator 9.1, MicroStrategy, Control-M, Agile, MS Office, SharePoint.
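The audit controls for financial data mentioned above commonly take the form of a reconciliation check: every source row must be accounted for as either loaded or rejected. A minimal sketch (counts and field names are hypothetical):

```python
def audit_check(source_count, target_inserted, target_rejected):
    """Reconciliation audit: loaded + rejected must equal the source
    row count; any imbalance indicates silently dropped records."""
    accounted = target_inserted + target_rejected
    return {
        "source_rows": source_count,
        "loaded": target_inserted,
        "rejected": target_rejected,
        "balanced": accounted == source_count,
    }

result = audit_check(source_count=1000, target_inserted=990, target_rejected=10)
print(result["balanced"])  # True
```

In an ETL pipeline these counts come from session statistics and reject files, and the audit result is written to a control table reviewed for regulatory reporting.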

Confidential

Senior ETL Developer

Responsibilities:

  • Attended sprint planning meetings and was involved in product backlog refinement.
  • Analyzed legacy source systems, worked extensively on data profiling to assess data quality, and documented the results.
  • Worked with Informatica Data Quality 9.1 for data analysis, data cleansing, data matching, data conversion, exception handling, and reporting.
  • Identified and eliminated duplicates in data sets using the IDQ 9.1 edit distance, Jaro distance, and mixed-field matcher components, enabling the creation of a single view of vendor, customer, and employee data sets.
  • Involved in implementing the Land process of loading customer, product, and vendor data sets into Informatica MDM 9.1.
  • Developed Base objects, Staging tables, foreign key relationships, static & dynamic lookups, queries, packages.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM
  • Developed trust and validation rules and setting up the match/merge rules sets to get the right master records.
  • Configured match rule set property by enabling search by rules in MDM according to Business rules.
  • Created transformations, mapplets, and mappings using Informatica PowerCenter 9.1 to move data from multiple sources into targets.
  • Created ETL transformations like source qualifier, lookup, joiner, expression, update, SQL, and normalizer using Informatica PowerCenter Designer.
  • Designed and developed Data Migration approach to extract Legacy Data sources.
  • Designed and Documented data migration steps - Project Scope, Identify Data & Requirements, Assess Source Data, Data Quality Remediation, Prepare Data Load, Trail Data Loads, Validation Results, Load Target Systems and Go-Live preparation.
  • Supported SIT and UAT phases and resolved defects.
  • Implemented ETL Objects successfully.

Environment: UNIX, Windows, Informatica PowerCenter 9.1, Informatica IDQ, Informatica MDM 9.1, SAP, flat files, Legacy Data sources.
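The duplicate identification above rests on fuzzy string comparison. A minimal edit-distance (Levenshtein) sketch follows; the IDQ components also apply Jaro distance and mixed-field matching, which are omitted here, and the similarity threshold is a hypothetical value:

```python
def edit_distance(a, b):
    """Levenshtein distance: minimum single-character edits to turn a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def likely_duplicates(a, b, threshold=0.8):
    """Flag two vendor/customer names as duplicates when their
    normalized similarity exceeds the (hypothetical) threshold."""
    longest = max(len(a), len(b)) or 1
    return 1 - edit_distance(a.lower(), b.lower()) / longest >= threshold

print(likely_duplicates("Acme Corp", "ACME Corp."))  # True
```

Pairs that clear the threshold are routed to a merge step (or manual review) so that one surviving record represents the entity.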

Confidential

Sr. Analyst Programmer

Responsibilities:

  • Gathered and analyzed business requirements and prepared ETL design documents as per the standards.
  • Designed and developed data models for customer, sales, and employee data marts.
  • Designed, developed, implemented and maintained Informatica PowerCenter and IDQ 8.6.1 application for matching and merging process.
  • Designed, tested, and deployed data quality processes in Data Quality Workbench as Plans.
  • Worked with Informatica plug-in enabling PowerCenter to run data quality plans for standardization, cleansing, and matching operations.
  • Created plans in IDQ, imported them into PowerCenter, and reused them across different data sets for validation and standardization.
  • Developed IDQ transformations such as Duplicate Record, Filter, Expression, Key Generator, Joiner, Lookup, Match, Merge, Normalizer, Rank, Router, and Standardizer to cleanse data.
  • Built ETL solutions using PowerCenter 8.6.1 to extract and transform data from legacy mainframe systems.
  • Implemented ETL solutions with robust process and error-handling techniques.
  • Enhanced data quality and integrated different data sources into single version of truth using Master data management techniques.
  • Wrote and deployed stored procedures, tuned and managed SQL scripts.

Environment: Informatica PowerCenter 8.6.1, Informatica IDQ 8.6.1, Oracle 10g, Oracle SQL Developer Data Modeler, flat files, MicroStrategy, UNIX, Windows.
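The match-and-merge consolidation toward a single version of truth can be sketched as a survivorship rule: for each attribute of a matched group, keep the newest non-empty value. Field names and the recency rule here are hypothetical simplifications of the MDM trust/merge configuration:

```python
def merge_group(records):
    """Survivorship: for each attribute, take the newest non-empty value
    across all matched records of one real-world entity."""
    ordered = sorted(records, key=lambda r: r["updated"])  # oldest first
    golden = {}
    for rec in ordered:
        for field, value in rec.items():
            if field != "updated" and value:
                golden[field] = value  # later (newer) records overwrite
    return golden

# Two source records matched as the same customer.
matched = [
    {"name": "J. Smith", "phone": "", "updated": 1},
    {"name": "John Smith", "phone": "555-0100", "updated": 2},
]
print(merge_group(matched))  # {'name': 'John Smith', 'phone': '555-0100'}
```

In Informatica MDM the equivalent behavior is configured through trust scores and match/merge rule sets rather than coded by hand.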

Confidential

ETL Developer

Responsibilities:

  • Designed and developed import and export process on transactional system.
  • Implemented ETL process to extract data from transactional system and transform into business required information and stored into Database.
  • Supported and provided resolution for emergency production issues and defects.
  • Addressed data integrity issues and troubleshot data problems.
  • Documented Data flow process.

Environment: Informatica 8.6, Oracle, Oracle SQL Developer Data Modeler, Excel reporting, DB2, flat files, web services, MS Office, SharePoint, UNIX, Windows.
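The import/export process on the transactional system can be illustrated with a minimal flat-file extract sketch; the delimiter and record layout are hypothetical:

```python
import csv
import io

def export_transactions(rows, fieldnames):
    """Write transactional rows to a pipe-delimited extract for downstream ETL."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, delimiter="|")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Hypothetical transactional rows to be handed off as a flat file.
extract = export_transactions(
    [{"txn_id": 101, "amount": "25.00"}], ["txn_id", "amount"])
print(extract)
```

The matching import side reads the same layout with `csv.DictReader` before loading the rows into the target database.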
