
ETL Informatica BDM / Integration Support Resume


Dallas, Texas

SUMMARY

  • 10 years of IT experience in the analysis, design, development, implementation, testing, and support of applications.
  • Experience with the iPaaS tool SnapLogic, integrating cloud-based databases such as DynamoDB and Snowflake with on-premise and cloud-based business systems.
  • Experience in data warehouse migration from Teradata to Snowflake.
  • Good experience in data migration and data integration solutions using IBM InfoSphere DataStage, including post-migration support.
  • Created customized BDM mappings/workflows for incremental loads using Informatica Developer and deployed them as part of an application on the Data Integration Service for native execution or pushdown using the Blaze or Spark execution engines.
  • Strong experience in creating parallel and sequence jobs using IBM InfoSphere DataStage and Informatica PowerCenter to migrate data from source systems to the Confidential system.
  • Support experience with the SnapLogic integration tool.
  • Configured PowerCenter to load data to Hive directly, without Informatica BDM, for less resource-intensive jobs.
  • Experienced in enhancing and deploying SSIS packages from development to production servers.
  • Expert level skills in testing the Enterprise Data Warehouses using Informatica Power Center, Data Stage, Ab Initio, and SSIS ETL tools.
  • ETL implementation using SQL Server Integration Services (SSIS) and Reporting Services (SSRS).
  • Proficiency in developing SQL with various relational databases like Oracle, SQL Server.
  • Experience in Error handling, Debugging, error logging for production support in SSIS.
  • Strong experience in full cycle of development of application with Oracle PL/SQL and Shell scripting.
  • Proficient in optimizing database querying, data manipulation, and population using SQL, PL/SQL, and utilities in Oracle 12c/11g, DB2 UDB, and SQL Server 2008/2000.
  • Working experience with Oracle objects such as tables, procedures, packages, indexes, synonyms, views, triggers, materialized views, and partitioning.
  • Created UNIX shell scripts to run DataStage workflows and control the ETL flow.
  • Strong with relational database design concepts, data modelling, tuning and procedural queries.
  • Extensive experience with Informatica performance tuning, addressing source-level, Confidential-level, and map-level bottlenecks.
  • Independently perform complex troubleshooting, root-cause analysis and solution development.
  • To broaden my technical skills beyond ETL tools, completed hands-on training and certification in Tableau.
  • Successfully led a team of up to 15 associates and delivered the project release by the Confidential date with no production issues.
  • Strong knowledge of retail business processes such as inbound, warehousing, outbound, shipping, store layout & planogram, and forecasting & replenishment. Good knowledge of the Confidential business, including third-party vendors such as Witron, Swisslog, and Fortna.
  • Good domain knowledge in Unemployment Insurance (UI) in the US.
  • Hands-on experience with Perl and MongoDB.
  • Good conceptual knowledge of programming languages such as Core Java, Python, and C++.
  • An effective communicator with strong analytical / logical/ interpersonal skills and ability to relate to people at any level of business and management.
  • Committed, results-oriented, and eager to learn new technologies.
  • Flexible in approach, with good time management and a confident, committed work style.
  • Authorized to work in the United States on an H-1B visa.
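
As an illustration of the shell-scripted DataStage orchestration mentioned above, the same wrapper logic can be sketched in Python. This is a minimal sketch, not the actual scripts; the project, job, and parameter names are hypothetical. It builds a standard `dsjob -run -jobstatus` command line:

```python
import subprocess

def build_dsjob_cmd(project, job, params=None):
    """Build a dsjob command line that runs a DataStage job and waits for its status."""
    cmd = ["dsjob", "-run", "-jobstatus"]
    for name, value in (params or {}).items():
        cmd += ["-param", "%s=%s" % (name, value)]  # dsjob takes job parameters as name=value
    cmd += [project, job]
    return cmd

def run_job(project, job, params=None):
    """Invoke dsjob and return its exit code (0 means the job completed successfully)."""
    return subprocess.call(build_dsjob_cmd(project, job, params))

# Hypothetical project and job names, for illustration only
print(build_dsjob_cmd("DWH_PROJ", "load_customers", {"RUN_DATE": "2015-01-01"}))
```

Wrapping the command builder in a small function keeps the scheduling script testable without a DataStage engine available.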

PROFESSIONAL EXPERIENCE

Confidential, Dallas, Texas

ETL Informatica BDM / Integration Support

Responsibilities:

  • Support production applications and assist other project teams with their solution development.
  • Understanding the business impact of ETL Informatica workflows that transfer data from vendor systems to the data warehouse.
  • Familiar with split-domain functionality for BDM and EDC, using the same Blaze engine on the cluster.
  • Supporting the SnapLogic integration tool to transfer data from source systems to the CMD system (Reltio).
  • Created ETL processes using SSIS to transfer data from heterogeneous data sources.
  • Worked on data cleansing by creating SSIS packages against the Flat Files.
  • Created customized BDM mappings/workflows for incremental loads using Informatica Developer and deployed them as part of an application on the Data Integration Service for native execution or pushdown using the Blaze or Spark execution engines.
  • Created SSIS packages to clean and load data to data warehouse.
  • Understanding daily repeated activities and automating the process to reduce manual work.
  • Work closely with Business Analysts and Developers to design and solve problems
  • Assist project managers in the definition of technical tasks, estimates, and dependencies
  • Provide solid knowledge of Workday’s object-oriented data model and approaches to integrations
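
As an illustration of the kind of flat-file cleansing the SSIS packages above perform, here is a minimal Python sketch (the field names are hypothetical): trim whitespace, reject rows missing the business key, and normalize empty strings to NULL:

```python
import csv
import io

def clean_rows(raw_rows):
    """Mimic a simple cleansing step: trim whitespace, drop rows missing the
    business key, and normalize empty strings to None (NULL)."""
    cleaned = []
    for row in raw_rows:
        row = {k: (v.strip() if isinstance(v, str) else v) for k, v in row.items()}
        if not row.get("customer_id"):
            continue  # reject rows without a business key, as an error-output path would
        cleaned.append({k: (v if v != "" else None) for k, v in row.items()})
    return cleaned

# A small in-memory flat file with messy whitespace and a missing key
raw = io.StringIO("customer_id,name,city\n 101 , Alice ,Dallas\n,Bob,\n102,Carol,\n")
rows = clean_rows(list(csv.DictReader(raw)))
print(rows)
```

In a real package the rejected rows would typically be routed to an error output for review rather than silently dropped.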

Environment: Informatica, SnapLogic, PostgreSQL, BDM, AWS S3, DynamoDB, Snowflake, SSIS, Control-M, Python, Java, JSON, UNIX, GitHub, Jira

Confidential

Informatica BDM Developer & Business Intelligence

Responsibilities:

  • Understanding the Teradata data warehouse system.
  • Developing and testing Snowflake ETLs to achieve the same business logic implemented in Teradata.
  • Used a client-specific, Groovy-based tool to convert Teradata-specific ETLs to Snowflake-supported ETLs.
  • Scheduled SQL Agent jobs to update source data using SSIS packages.
  • Converted PowerCenter code to BDM mappings using Informatica Developer.
  • Used Informatica BDM for delta processing and data cleansing, moving data from the S3 Raw layer to the S3 Refine layer.
  • Synced existing data in the Teradata warehouse to the Snowflake data warehouse using Python scripts.
  • Creating DataStage workflows to load data from various source systems to Snowflake.
  • Creating UC4 workflows to execute the Snowflake ETLs and load the incremental data.
  • Understanding the transformations BDM supports on the cluster before preparing ETL low-level design documents for the developers.
  • Monitoring UC4 workflows to ensure the Snowflake ETLs load incremental data correctly, matching Teradata.
  • Monitoring long-running Snowflake queries and optimizing the Snowflake ETL queries.
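
The incremental loading described above (Python sync scripts feeding Snowflake under UC4) typically follows a watermark pattern, which can be sketched as pure logic; the Teradata/Snowflake connector calls are omitted and the column names are hypothetical:

```python
def incremental_batch(source_rows, last_watermark, ts_col="updated_at"):
    """Select only rows newer than the last loaded watermark, and return the
    new high-water mark to persist for the next scheduled run."""
    delta = [r for r in source_rows if r[ts_col] > last_watermark]
    new_watermark = max((r[ts_col] for r in delta), default=last_watermark)
    return delta, new_watermark

# Hypothetical source rows, as a connector query might return them
rows = [
    {"id": 1, "updated_at": "2019-01-01"},
    {"id": 2, "updated_at": "2019-01-03"},
    {"id": 3, "updated_at": "2019-01-05"},
]
delta, wm = incremental_batch(rows, "2019-01-02")
print(delta, wm)
```

Persisting the returned watermark after each run is what lets the scheduler pick up only new or changed rows on the next cycle.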

Environment: Snowflake, Teradata, BDM, UC4, UNIX, Groovy (client-internal tool to convert Teradata ETLs to Snowflake ETLs), Python, SSIS, Jira, GitHub.

Confidential

ETL/DataStage Developer

Responsibilities:

  • Understanding the legacy system and business requirements with the legacy staff/users.
  • Understanding the functional perspectives of the application with thorough knowledge of the UI system.
  • Coordinating with clients to build efficient source mappings to the Confidential DB and the business rules to apply to source data migrating to the Confidential DB.
  • Understanding legacy data, performing data quality analysis on it, and developing cleansing scripts.
  • Prepared data mapping specification documents to map data from the legacy to the new system; developed data migration (ETL) programs to load transformed data into the Confidential system, creating parallel and sequence jobs to migrate the data from the source system.
  • Efficient in writing complex SQL queries, including hierarchical queries, to retrieve source data and convert it into the format accepted by the Confidential DB.
  • Responsible for testing of migration programs by creating and executing validation and verification scenarios.
  • Be responsible for the overall project and timeliness of the deliverables.
  • Providing support during UAT testing and coordinate with the customer and development team.
  • Analyze problems to automate or improve existing systems and review computer system capabilities.
  • Clarify queries with business to facilitate unambiguous requirement gathering.
  • Fine-tuned ETL processes by considering mapping and session performance issues.
  • Responsible for Creating workflows and Worklets. Created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
  • Worked on DB2 SQL and tuning, maintained the proper communication between other teams and client.
  • Understanding the report requirement and creating reports using Eclipse BIRT tool.
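
The hierarchical queries mentioned above follow Oracle's CONNECT BY ... START WITH pattern; a minimal Python sketch of the equivalent depth-first traversal (with hypothetical ids) makes the semantics concrete:

```python
from collections import defaultdict

def connect_by(rows, start_with=None):
    """Flatten a parent/child table the way an Oracle CONNECT BY ... START WITH
    query would: depth-first from each root, emitting (level, id) pairs."""
    children = defaultdict(list)
    for r in rows:
        children[r["parent_id"]].append(r["id"])

    out = []
    def walk(node, level):
        out.append((level, node))
        for child in sorted(children[node]):
            walk(child, level + 1)

    for root in sorted(children[start_with]):
        walk(root, 1)
    return out

# Hypothetical org hierarchy, for illustration only
org = [
    {"id": "CEO", "parent_id": None},
    {"id": "CFO", "parent_id": "CEO"},
    {"id": "CTO", "parent_id": "CEO"},
    {"id": "DBA", "parent_id": "CTO"},
]
print(connect_by(org))
```

The emitted level corresponds to Oracle's LEVEL pseudocolumn, which is what such queries typically use to indent or group the output.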

Environment: IBM Infosphere Information Server, IBM Data Studio, SQL Server 2008, Oracle 11g, PL/SQL, DB2, Python, Java, Putty, UNIX, ER Studio, WinSCP.

Confidential

ETL/DataStage/Informatica Developer

Responsibilities:

  • Experience integrating data sources such as Oracle, SQL Server, and MS Access, and non-relational sources such as flat files, into the staging area.
  • Designing custom reports via SQL Reporting Services to align with requests from internal account teams and external clients.
  • Extensively used mapplets, UDFs, reusable transformations, and worklets to implement reusable logic such as data validation, data quality, reference checks, and data cleansing. Used shortcuts for sources, targets, transformations, mapplets, and sessions to reuse objects without creating multiple copies in the repository and to inherit source changes automatically.
  • Designed and developed technical and business data quality rules in IDQ (Informatica Developer) and created scorecards to present trending analysis to business users (Informatica Analyst).
  • Applied Slowly Changing Dimensions Type I and Type II per business requirements. Extensively worked on performance tuning, enhancing readability and scalability.
  • Worked extensively with CSV, PDF, and XML files, including isolating headers and footers within a single file. Worked with large amounts of data, independently executing data analysis using appropriate tools and techniques, interpreting results, and presenting them to both internal and external clients.
  • Writing SQL queries to create end-user reports /Developing SQL Queries and stored procedures in support of ongoing work and application support.
  • Designing and executing test scripts and test scenarios, reconciling data between multiple data sources and systems.
  • Worked on multidimensional models and created reports in Report Studio using cubes as a data source. Effectively analyzed session error logs and used the debugger to test mappings and fix bugs in DEV, following change procedures and validation.
  • Raised change requests, analyzed and coordinated resolution of program flaws and fixed them in DEV and Pre-Production environments, during the subsequent runs and PROD.
  • Performed analysis and profiling on existing data, identified root causes of data inaccuracies, and provided impact analysis and data quality recommendations.
  • Precisely documented mappings to ETL Technical Specification document for all stages for future reference.
  • Scheduled jobs for running daily, weekly, and monthly loads through control-M for each workflow in a sequence with command and event tasks.
  • Created requirement specifications documents, user interface guides, and functional specification documents, ETL technical specifications document and test case.
  • Used most of the transformations, such as Aggregators, Filters, Routers, Sequence Generator, Update Strategy, Rank, Expression, Lookups (connected and unconnected), Mapping Parameters, Session Parameters, Mapping Variables, and Session Variables.
  • Responsible for studying the existing data base and working on migrating existing PL/SQL packages, stored procedures, triggers, and functions to Informatica PowerCenter.
  • Fine-tuned ETL processes by considering mapping and session performance issues.
  • Responsible for Creating workflows and Worklets. Created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
  • Worked on DB2 SQL and Tuning.
  • Maintained the proper communication between other teams and client.
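
The Slowly Changing Dimension Type II handling mentioned above boils down to end-dating the current version of a changed row and inserting a new current version. A minimal Python sketch (tracking a single hypothetical `city` attribute) illustrates the pattern:

```python
from datetime import date

OPEN_END = date(9999, 12, 31)  # conventional "still current" end date

def apply_scd2(dim_rows, incoming, business_key="cust_id", today=None):
    """Apply an SCD Type II change: end-date the current version of a changed
    row and insert a new current version. Simplified to compare one attribute."""
    today = today or date.today()
    result = list(dim_rows)
    for new in incoming:
        current = next((r for r in result
                        if r[business_key] == new[business_key]
                        and r["end_dt"] == OPEN_END), None)
        if current and current["city"] != new["city"]:
            current["end_dt"] = today  # close the old version
            result.append({**new, "start_dt": today, "end_dt": OPEN_END})
        elif current is None:
            result.append({**new, "start_dt": today, "end_dt": OPEN_END})
    return result

dim = [{"cust_id": 1, "city": "Austin", "start_dt": date(2012, 1, 1), "end_dt": OPEN_END}]
dim = apply_scd2(dim, [{"cust_id": 1, "city": "Dallas"}], today=date(2014, 6, 1))
print(dim)
```

A Type I change would instead overwrite `city` in place, losing the history that the start/end dates preserve here.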

Environment: DataStage, Informatica PowerCenter 9.6, SQL Server 2008, IDQ 8.6.1, Oracle 11g, PL/SQL, DB2, Active Batch, Putty, UNIX, ER Studio, ESP, WinSCP.

Confidential

Oracle PL/SQL Developer and DataStage Developer

Responsibilities:

  • Analysing Functional and Non-Functional Requirement and Automation feasibility.
  • Following up with Business Analyst on clarifications and suggesting improvements if any.
  • Understanding the business requirements and preparing design documents (HLD and LLD).
  • Constructing the stored procedures/packages to meet business requirements and testing the same.
  • Worked with WebSphere Message Broker, used in this project to flow messages (information) from one interface to another to achieve a specific business requirement.
  • Extensively worked on performance tuning, enhanced readability, and scalability.
  • Writing SQL queries and stored procedures in support of ongoing work and application support.
  • Fine-tuned SQL queries in multiple stored procedures and packages by considering performance issues and application response time.
  • Understanding the source data coming from different sources such as mainframe DB, Oracle DB, flat files, and CSV files.
  • Understanding the Confidential database requirements and creating mapping documents.
  • Extensively used mapplets, UDFs, reusable transformations, and worklets to implement reusable logic such as data validation, data quality, reference checks, and data cleansing.
  • Working with large amounts of data, independently executing data analysis using appropriate tools and techniques (interpreting results and presenting them to both internal and external clients).
  • Designed and developed technical and business data quality rules in IBM InfoSphere DataStage.
  • Profiling the source data using the IBM InfoSphere Quality Center tool.
  • Handling data stage administrative activities such as New user creation, New project creation and server maintenance.
  • Designing and executing test scripts and test scenarios, reconciling data between multiple data sources and systems.
  • Responsible for the overall project and timeliness of the deliverables.
  • Providing support during UAT testing and coordinate with the customer and development team.
  • Analyse problems to automate or improve existing systems and review computer system capabilities.
  • Using the BIRT reporting tool to generate reports for business purposes.
  • Raised change requests, analyzed and coordinated resolution of program flaws, and fixed them in DEV and Pre-Production environments during subsequent runs and in PROD.
  • Delivered the project with zero post-implementation defects, the first such implementation in DPM history.
  • Preparing for project-internal and project-external audits within the organization.
  • Successfully led a team of 15 members.

Environment: Oracle PL/SQL, Mainframe DB, ETL process, IBM InfoSphere Information Server, IBM WebSphere Message Broker (WMB), Embarcadero ER/Studio, BIRT, UNIX shell scripting, Pro*C, Perl, Control-M, and HTML.

Confidential

DataStage Support Executive/Developer

Responsibilities:

  • Work with the business to gather requirements and build functionality for the user interface.
  • Prepare design documents to address the flows observed in the business requirement document.
  • Develop web-based programs to implement the functionality; work on coding and unit testing for the developed application.
  • Providing post-implementation support in production for the changes.
  • Incident management - Performing fault diagnosis for user-reported issues in the application and providing the necessary technical solutions to restore business functions.
  • Event management - Diagnosing automated alerts generated by faulty conditions and providing proactive solutions before a user/business flow is impacted.
  • Problem management - Providing permanent resolutions for issues identified through incident and event management.
  • Preparing for internal and external audits.

Environment: Oracle PL/SQL, JDA, DataStage, ER Studio, ETL process, Control-M, UNIX, Pro*C, Perl, and ServiceNow.

Confidential

Oracle PL/SQL Developer

Responsibilities:

  • Analyzing Functional and Non-Functional Requirements and automation feasibility.
  • Following up with Business Analyst on clarifications and suggesting improvements if any.
  • Prepare HLD and LLD for Inbound module
  • Preparing unit test cases and testing the owned module along with other modules such as waving, Packing and outbound.
  • Constructing the stored procedures/packages to meet business requirements and testing the same.
  • Writing SQL queries and stored procedures in support of ongoing work and application support.
  • Fine-tuned SQL queries in multiple stored procedures and packages by considering performance issues and application response time.
  • Planning, Preparing test cases, execution, and validation for Interface testing with third party vendor Witron.
  • Supporting the team with emulation, 4-wall testing, and onsite UAT.
  • Participated in the successful implementation and support of the application.
  • Provide the application warranty support and provide temporary fix if required.
  • Working on post implementation defects.
  • Working on understanding change request (CR) requirements and implementing the changes without deviating from the existing behavior of the application.
  • Providing post-implementation support for any issues and defects.

Environment: Oracle PL/SQL, Control-M, UNIX, Pro*C, Perl, and ServiceNow.
