Workday Consultant Resume
Dallas, Texas
PROFESSIONAL SUMMARY:
- 7+ years of overall IT experience, including 3 dedicated years as a Workday HCM Functional Consultant.
- Experienced in Workday HCM, with tangible work experience across different Workday modules including HCM, Reports, and Migration.
- Hands-on experience in functional Workday HCM, with responsibilities spanning Supervisory Organizations, Staffing Models, Jobs and Positions, Compensation, customizing Business Processes and Security Groups, and customized report generation.
- Experience in creating and maintaining Workday Supervisory Organizations, Locations, Positions, Cost Centers, and Cost Center Hierarchies, and working with Role-Based, User-Based, and Job-Based Security Groups.
- Substantial functional expertise in HCM business process analysis, mapping solutions to these processes, and determining the best approaches based upon leading and emerging business practices.
- Experience in creation and maintenance of Workday Supervisory Organizations, Reorganizations, Subordinate Supervisory Organizations, Locations, Companies, Cost Centers, and Organizational & Cost Center Hierarchies.
- Worked with different Staffing Models, defining Hiring Restrictions for Job Management and Position Management.
- Knowledge of Custom Reports, Advanced Reports, and Calculated Fields.
- Experience in creating Job Profiles, Job Families and Job Family Groups.
- Knowledge of Workday security, including creating/modifying Role-Based, User-Based, Intersection, Segment-Based, and Job-Based Security Groups.
- Experience in creating Compensation Rules, Compensation Components and setting up Compensation Segments.
- Good team player with the ability to learn new technologies and apply them quickly.
TECHNICAL SKILLS:
Workday HCM: Core HCM, Performance Management, Compensation & Benefits process, Worker profile configuration, Business Process Routing, Time group configurations, Domain policy, Time off management, Integration Security, Inbound and Outbound Integration.
Big Data/Hadoop: Hadoop 2.7/2.5, HDFS 1.2.4, MapReduce, Hive, Pig, Sqoop, Oozie, Hue.
Java/J2EE: Servlets, JSP, JDBC, JSTL, EJB, JAXB, JAXP, JMS, JAX-RPC, JAX-WS.
Technologies: Scala, Java, Python, SQL, PL/SQL, AWS, Hive QL, Unix Shell Scripting.
Database: Oracle 12c/11g, MySQL, SQL Server 2016/2014.
Operating System: Windows Vista/7/8/10, iOS, Linux, Android.
PROFESSIONAL EXPERIENCE:
Confidential, Dallas, Texas
Workday Consultant
Responsibilities:
- Inbound and outbound integrations using EIBs, Mass Data Loads, Mass Translations, Calculated Fields, Advanced Custom Reports, Simple and Standard Reports & Report Groups, and Condition & Eligibility Rules.
- Dashboards & Worklets, Custom Objects & Lists.
- Maintain Business Process help text.
- Configuring Questions.
- Maintain weekly survey.
- Created Business Process routings and rule-based Business Process routings.
- Involved in creating Questions and Questionnaires.
- Good working experience in editing templates.
- Day-to-day activities involve assigning and removing roles.
- Performing delegation setup to help stakeholders.
- Help end users with day-to-day issues.
- Good experience migrating data from Sandbox to Production.
- Using ServiceNow as the ticketing platform.
- Experience in creating Custom reports and calculated fields based on stakeholder requests (a consumption sketch follows this list).
- Automate Email notifications and Business Process configuration.
- Maintain the Business Process framework for Hire, Onboarding, Transfer, etc.
- Configure new business processes and update existing BPs per client requirements.
- Create different types of security groups (Role-Based, Job-Based, Intersection, and User-Based Security Groups).
- Create Security Service Centers in Workday to reduce access to unwanted worker records.
- Validate functional enhancements manually in the tenant.
- Validate Business Process transactions according to client requirements.
- Configure the system to meet requirements across Core HCM setup data, Compensation, and Business Processes.
- Handled various department and user requests, such as processing security and access requests for field users, setting up new supervisory organizations based on staffing changes, and altering existing system configuration, Locations, Location Hierarchies, Cost Centers, etc.
- Validate data after loading into Workday; responsible for issue tracking and resolution.
- Create calculated fields to be used in Custom reports based on client requirements.
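
For illustration, custom reports like those above can also be consumed by downstream integrations. Below is a minimal, hypothetical sketch of pulling such a report through Workday's report-as-a-service (RaaS) endpoint; the host, tenant, report owner, report name, and credentials are placeholder assumptions rather than details of this engagement.

```python
# Minimal sketch: consuming a Workday custom report via its RaaS endpoint.
# Host, tenant, report owner/name, and credentials are hypothetical placeholders.
import requests

TENANT = "example_tenant"                 # hypothetical tenant name
REPORT_OWNER = "integration_user"         # hypothetical report owner
REPORT_NAME = "Custom_Headcount_Report"   # hypothetical custom report name

# RaaS URLs generally follow the ccx/service/customreport2 pattern.
url = (f"https://wd5-impl-services1.workday.com/ccx/service/customreport2/"
       f"{TENANT}/{REPORT_OWNER}/{REPORT_NAME}")

# Request the report output as JSON using basic authentication.
resp = requests.get(url, params={"format": "json"},
                    auth=("integration_user@example_tenant", "password"))
resp.raise_for_status()

# Each entry in Report_Entry is one row of the custom report.
for row in resp.json().get("Report_Entry", []):
    print(row)
```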
Confidential - Eagan, MN
Hadoop Developer
Responsibilities:
- Big data is used to modernize the infrastructure for pharmacy claims data. Helped architect and design a system that facilitates multiple batch and real-time data flows for pharmacy claims, which in turn provides clients with the most up-to-date claim information for their customers.
- Responsible for migrating 20 years' worth of claim data to detect and separate fraudulent claims.
- The objective of this project was to build a data lake as a cloud-based solution on AWS using Apache Spark.
- Installed and configured a multi-node cluster in the cloud using Amazon Web Services (AWS) EC2.
- Created Hive external tables to stage data and then moved the data from staging to main tables (see the first sketch after this list).
- Worked on exporting data from Hive 2.0.0 tables into a Netezza 7.2.x database.
- Pulled data from the data lake (HDFS) and massaged it with various RDD transformations.
- Developed Python scripts and UDFs using both DataFrames/SQL and RDDs/MapReduce in Spark 2.0.0 for data aggregation, queries, and writing data back into an RDBMS through Sqoop (see the second sketch after this list).
- Developed Spark code using Python and Spark-SQL/Streaming for faster processing of data.
- Loaded data from different sources such as HDFS and HBase into Spark RDDs and implemented in-memory computation to generate the output response.
- Developed complete end-to-end big data processing in the Hadoop ecosystem.
- Used the AWS cloud with infrastructure provisioning/configuration.
- Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting on the dashboard.
- Involved in PL/SQL query optimization to reduce the overall run time of stored procedures.
- Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
- Performed file system management and monitoring of Hadoop log files.
- Utilized Oozie workflows to run Pig and Hive jobs.
- Extracted files from MongoDB through Sqoop, placed them in HDFS, and processed them.
- Continuously tuned Hive UDFs for faster queries by employing partitioning and bucketing.
- Implemented partitioning, dynamic partitions, and buckets in Hive.
- Used Flume to collect, aggregate, and store web log data from different sources such as web servers and mobile and network devices, and pushed it to HDFS.
- Supported setting up the QA environment and updating configurations for implementing scripts with Pig, Hive, and Sqoop.
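
First sketch: a minimal PySpark illustration of the Hive staging-to-partitioned-main-table flow referenced above. Table names, columns, and HDFS paths are hypothetical placeholders, not the actual claims schema. Bucketing (CLUSTERED BY ... INTO n BUCKETS) would be added to the main-table DDL on the Hive side; it is omitted here because Spark's support for writing Hive-bucketed tables is limited.

```python
# Minimal sketch of the staging -> partitioned main table flow described above.
# Table names, columns, and HDFS paths are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("claims-staging-to-main")
         .enableHiveSupport()          # register tables in the Hive metastore
         .getOrCreate())

# External staging table over raw claim files already landed in HDFS.
spark.sql("""
    CREATE EXTERNAL TABLE IF NOT EXISTS stg_claims (
        claim_id STRING, member_id STRING, claim_amount DOUBLE, claim_date STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/staging/claims'
""")

# Main table partitioned by claim year so reporting queries can prune partitions.
spark.sql("""
    CREATE TABLE IF NOT EXISTS claims (
        claim_id STRING, member_id STRING, claim_amount DOUBLE, claim_date STRING
    )
    PARTITIONED BY (claim_year INT)
    STORED AS ORC
""")

# Dynamic-partition insert from staging into the partitioned main table.
spark.sql("SET hive.exec.dynamic.partition=true")
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
spark.sql("""
    INSERT OVERWRITE TABLE claims PARTITION (claim_year)
    SELECT claim_id, member_id, claim_amount, claim_date,
           CAST(SUBSTR(claim_date, 1, 4) AS INT) AS claim_year
    FROM stg_claims
""")
```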
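Second sketch: a minimal illustration, under the same hypothetical schema, of RDD-style massaging followed by a DataFrame/SQL aggregation. The resulting Hive table would then be exported to the downstream RDBMS (e.g., Netezza) by a separate Sqoop job.

```python
# Minimal sketch of the RDD massaging plus DataFrame/SQL aggregation flow.
# Paths, column names, and the downstream export step are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("claims-aggregation")
         .enableHiveSupport()
         .getOrCreate())

# RDD-style massaging: parse raw comma-delimited claim lines pulled from the data lake.
raw = spark.sparkContext.textFile("hdfs:///data/staging/claims")
parsed = (raw.map(lambda line: line.split(","))
             .filter(lambda f: len(f) == 4)                    # drop malformed rows
             .map(lambda f: (f[0], f[1], float(f[2]), f[3])))  # id, member, amount, date

# Switch to the DataFrame/SQL API for the aggregation itself.
claims = parsed.toDF(["claim_id", "member_id", "claim_amount", "claim_date"])
totals = (claims.groupBy("member_id")
                .agg(F.sum("claim_amount").alias("total_amount"),
                     F.count("claim_id").alias("claim_count")))

# Persist the aggregate to a Hive table; a separate Sqoop export job would then
# push this table into the downstream RDBMS.
totals.write.mode("overwrite").saveAsTable("claim_totals_by_member")
```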