Customised Operations for Customised Research at CASRO
 
Plan for the Hour
Market Research Today
Projects Today
Challenges
Typical Approach
Problem
Plan for the Hour
Objective of the Paper
Plan for the Hour
Customer Satisfaction Project
Scope of the Study
Challenges
Existing Technology: Interview start (4:00 pm CST) → audio file per interview → file transfer to partners → transcription & coding → consolidation of transcription + coding data → upload to website (6:00 am CST)
Wait a minute – Was that the BEST way??
Can we make it better, i.e. more profitable operations?
Of course we can and we did!!
How, you ask?
The “Out of Box” Process
Process Technology Fit: Interview start (4:00 pm CST) → file per interview → file conversion & routing → continuous transfer to partners → transcription & coding → continuous upload of transcribed + coded data → download individual files and consolidate → upload to website (6:00 am CST)
Process Technology Fit – in short: automated all manual tasks
But the story does not end here…
A Few Weeks Later
Software Snapshot
Software Snapshot
Productivity Tracker
Productivity Tracker
Benefits
Savings
Managed increased volumes
Improved Quality
So what did we learn from this?
Learning
Plan for the Hour
Conceptual Model (long-term / large projects): understand the project scope & requirements → project document → design process as per project requirements → technology & process alignment → technology & organization alignment → automation → simulation
Technology Process Alignment
Technology Organization Alignment
Plan for the Hour
Conclusion
Questions
Thank You...

Editor's Notes

  1. Question… do we need to introduce ourselves? Or will they be providing an introduction for each presentation? Might need to add this in. Sameer>> Think they should, but we can introduce ourselves and our company name in case they don't. This morning, Sameer and I would like to talk about an ah-ha discovery we made on some recent projects… a moment that brought to light the need to take a different approach to how Operations has traditionally supported market research projects.
  2. There is a lot of information to cover in the hour… this screen provides a basic outline of how we will be proceeding.
  3. I believe we can all agree with what is portrayed on this screen. Over the past few years, market research projects have become more challenging overall. Clients are demanding more and more, AND they want it quicker and at a lower cost. Many of them are expecting complete transparency as well. For example, not just receiving the data collected from a phone interview, but an audio file of each interview so they can 'hear' the respondent's answers, voice inflection, the interviewer's quality, etc. In our experience, the technology that supports MR operations is still pretty standard. Meaning, most market research companies are still approaching study execution in the same manner… utilizing the same technology. The tools have not necessarily kept up with the demands of the clients. Sameer>> In contrast, we can see how other fields like Business Intelligence have developed so much with respect to technology over the years.
  4. The more challenging projects of today have certain characteristics…
* High volume
* Unique requirements, such as recording phone interviews and NOT having interviewers type open-end responses… then having respondent comments transcribed; or extremely quick turnaround for respondent data that meets specific requirements, etc.
* Not just data for a deliverable… in addition an audio file, transcribed comments, results in tables but also posted in a portal, etc.
* Programs which run for an extended period of time… 1, 2, 3 years, with deliverables required on a consistent basis… daily is becoming more prevalent.
Supporting these projects has introduced new challenges…
  5. Some of the biggest challenges are… How do we meet the requirements of the project, yet stay cost-competitive, particularly with ever tighter turnaround times and a growing number of deliverables? How do we make ourselves more attractive than our competitors? Why should they choose us? And of course, most importantly, how do we do all this AND keep our margins? Sameer>> I am sure you will agree that in today's economic scenario, these challenges have intensified.
  6. The traditional approach is to rely on our tried and true processes and technology…the ‘old reliables’ as it were. Makes sense…They are comfortable…well traveled…and have a proven track record and people are comfortable handling it. However…
  7. However, this approach often leads to overruns and the need for additional resources, particularly for these more challenging projects. For these projects, what we need is a 'Customized Approach'… a solution that is best suited to support the project requirement at hand.
  8. This leads us to the purpose of our presentation…
  9. We would like to propose a conceptual model for customized operations… a model that will help us:
Identify whether a project requires customized operations… are there deliverables which current operations cannot support, or cannot support in a time/cost-efficient manner?
Align operational processes and technology with the project requirements and maximize time and cost efficiencies.
Help with resource planning, identifying KPIs and metrics for tracking, and making sure they are aligned with long-term organizational goals.
Best of all, the model will aid in determining the total cost of operations for the different approaches and identify the best approach overall. Will our tried-and-true methods fit the bill and keep us cost-efficient? Or do we need to modify the tried and true to meet the project requirements? Or will modification allow us to realize a higher margin?
  10. We'd like to share with you one recent experience we had the privilege to be part of… one we hope will shed light on what we've been talking about up to this point…
  11. A case study, if you will.
  12. Synovate recently took on a customer satisfaction follow-up program for a car company. The scope of the project included…
Taking place over an extended period of time… 3 years.
Extremely high volume for a phone project, with around 5,000 interviews completed EACH day, Monday through Saturday.
100% of the interviews were to be recorded… but respondent comments were not to be typed in by the interviewer… a qualitative approach within a quantitative methodology.
All respondent open-end comments were to be transcribed after the interview was completed… then the individual comments coded to allow for quantitative analysis.
Turnaround time was extremely tight… For any interviews flagged as 'Hot', an email containing the structured data from the interview, transcribed comments, and coded data would be sent to the client within 30 minutes. A 'Hot' flag indicates that the respondent had an important issue during their interaction with the client… and they want a contact from that client within 48 hours.
All remaining interviews were to be completed, transcribed, coded, and ready for upload to the client website by 6 am the following morning. To shed some light on this requirement… interviews are conducted between 4:30 and 10:30 CST.
All these deliverables make Operations 100% transparent. Every aspect of the project is visible!
  13. Honestly, this project introduced just about every challenge we identified earlier…
As I just mentioned… very tight turnaround.
The logistical challenge of file transfers across multiple locations in multiple time zones, and then consolidation at the end.
Very stringent quality requirements… 95% accuracy per completed interview.
How to schedule and plan resources given the very small execution window.
And, finding an outside vendor to support the transcription and coding piece. We quickly realized that it would not be cost-efficient to support these pieces ourselves given the timing constraints. This is the point where I met Sameer… as Datamatics was our partner of choice in this venture.
So, how do we approach this? How do we meet all these challenges in the most efficient manner?
  14. First, we took a look at our processes and technology as they stand. It is important to point out that our processes are designed around supporting custom market research projects… which means that the scope and requirements of each project we execute change from project to project, wave to wave. Let's take a look at what the typical approach to this project might look like… This is a very basic diagram of how the process and data would flow using existing processes and technology. Let's discuss each piece individually…
  15. First, interviewing begins at 4 pm. As interviewers complete a call, an audio file in .wav format is automatically created and saved on the CATI dialer. Audio files are then classified by interview type (hot/non-hot) and language, and only then are they ready to send to partners for transcription and coding.
  16. The resulting audio files would need to be batched and transferred to an internal server which our transcription/coding partners could access. Typically, this process would take place every 30 minutes and be completed by FTP. One exception is hot alerts, which need to be processed within 30 minutes; transfer of these files would need to happen immediately. Sameer>> It is important for Synovate and the vendors to be completely in sync to ensure successful transfer and processing.
  17. Next, our partners need to download the files so that transcribing and coding can take place. The files would first be transcribed and quality-checked by one team and then sent to the coding team. The transcription and coding output would then need to be combined into one file and batched for uploading to Synovate servers at regular intervals… much like our transfers to the vendors.
  18. Finally, once the transcription and coding data is received, it needs to be consolidated with the interview's structured data and audio file at an individual respondent level and uploaded to a website for client access and analysis. All of this would need to be completed within 30 minutes for Hot calls, and by 6 am for all the remaining calls.
  19. Well, that is what we ideally would have done. However, we chose a different path.
  20. We needed to ensure that, along with meeting the strict turnaround time and quality standards, we could run this project profitably over a period of three years.
  21. So we put our heads together and started brainstorming!
  22. We started by developing a thorough understanding of the project requirements and then designing an efficient process to meet them. Once we had an efficient process in place, we evaluated different technologies and found the most appropriate technology that fit the process and gave us increased margins by bringing down the total cost of operations.
  23. So after brainstorming across two continents, we came up with this… Having this process in place was half the work done, as we had a clear understanding of 'what' we needed to do. The other part was 'how' we were going to automate. The key areas highlighted by this exercise were:
1. The file naming convention – the dialer assigns the automatically generated .wav files a default name which is meaningless in terms of an individual project. For this project, however, the file name needed to drive where the audio file was routed, so modification of the filename was required.
2. Removing audio files from the dialer – the audio files have to be removed from the dialer at the end of each day, or the performance of the dialer is negatively impacted. The standard process is for a person to manually batch the audios and move them to a predetermined drive/server for access. The number of projects requiring audio files had not been high enough to justify an automated process, but considering that this project was going to run six days a week for the next three years, relying on the manual method would have been inefficient. This had to be automated.
3. Sending files to partners – considering the tight turnaround time, if we batched and sent files every 30 minutes it would be impossible to meet the client requirement for hot calls as well as complete transcription. As a reminder, for hot calls an email containing all the deliverables (structured data, transcribed comments, and coded data) has to be sent to the client WITHIN 30 MINUTES! It would also not be efficient for our partners to receive a huge batch of files in one go and then work their way through it; it would cause a bottleneck when downloading files. To further ensure there was no bottleneck in the transfer of files, we decided to convert WAV files to MP3, which on this project ended up 45% smaller. Also, transferring the files to an internally owned server would put the ownership of reconciling the audio with the associated transcription/coding results in Synovate's hands. We wanted the partners to be responsible for ensuring that for each audio file received they provided the associated transcription/coding results, so it was decided to post files on the partners' FTP servers.
4. Consolidation – we needed to ensure that files sent to partners were combined with the output sent back by the partners and uploaded within a specific window to meet client requirements. This would be a tedious process and therefore needed to be automated.
  24. Now we can look at 'how' we achieved what we specced out in our process flow. We will look at each element individually, as it involved automation in different areas at Synovate as well as at the partners. The first key area was identifying the different types of files and routing them to the appropriate location. This was solved by having the filenames contain key information to drive the routing: whether the file was a hot alert, English or Spanish, long or short. We wrote a program to pull the key information from the data file and insert it into the filename at fixed positions. Another program continuously converted the WAV files to MP3 format, which made the transfer faster by saving bandwidth. The routing was handled by a third program that sent files to the appropriate vendor based on the filename. By doing this we eliminated the need to send files in batches: the files were continuously (every minute) converted from WAV to MP3 and moved to the appropriate location based on the filename. All these processes ran concurrently, not in sequence.
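To make the mechanics concrete, here is a minimal Python sketch of filename-driven routing plus a minute-by-minute convert-and-move loop. The naming convention, folder paths, and the use of the `lame` encoder are assumptions for illustration; the project's actual programs are not described in detail in the deck.

```python
import shutil
import subprocess
import time
from pathlib import Path

# Assumed convention: <respondent_id>_<H|N>_<EN|ES>_<L|S>.mp3
# (H = hot alert, N = non-hot; L = long, S = short)
ROUTES = {
    ("H", "EN"): Path("/outbound/hot/english"),
    ("H", "ES"): Path("/outbound/hot/spanish"),
    ("N", "EN"): Path("/outbound/regular/english"),
    ("N", "ES"): Path("/outbound/regular/spanish"),
}

def route_audio(mp3_file: Path) -> Path:
    """Move a converted MP3 to its vendor folder based on the filename fields."""
    _, hot_flag, language, _ = mp3_file.stem.split("_")
    dest_dir = ROUTES[(hot_flag, language)]
    dest_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.move(str(mp3_file), str(dest_dir / mp3_file.name)))

def convert_and_route_forever(dialer_dir: Path, poll_seconds: int = 60) -> None:
    """Poll the dialer drop folder, convert each WAV to MP3 (here via the
    `lame` encoder), and route the result -- a stand-in for the concurrent
    programs described above."""
    while True:
        for wav in dialer_dir.glob("*.wav"):
            mp3 = wav.with_suffix(".mp3")
            subprocess.run(["lame", "--quiet", str(wav), str(mp3)], check=True)
            wav.unlink()  # keep the dialer folder clean, as note 23 requires
            route_audio(mp3)
        time.sleep(poll_seconds)
```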
  25. Once the files were ready to be uploaded, we needed to be sure that the transfer of files was not human-dependent. We purchased software that would continuously upload the files to the partners. This software is commonly used in the newspaper industry, and we found it suited our requirements perfectly! It also generated a status report that was sent to all project managers. To maintain confidentiality, secure FTP was used to encrypt the files, and it was hosted on secure servers. Since the transfer of files was the backbone of the entire process, we had backup FTP servers and connectivity; the internet connections were set up to switch over automatically if one went down.
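The transfer product itself was commercial, but the behaviour described (unattended, encrypted upload with automatic failover to a backup server) can be sketched with Python's standard `ftplib`. Host names and credentials below are placeholders, not the project's configuration.

```python
import ftplib
from pathlib import Path

# Failover order: try the primary, then the backup (placeholder hosts).
SERVERS = [("ftp.partner-primary.example", 21),
           ("ftp.partner-backup.example", 21)]

def upload_file(path: Path, user: str, password: str) -> None:
    """Upload one file over FTPS, falling back to the backup server."""
    last_error = None
    for host, port in SERVERS:
        try:
            with ftplib.FTP_TLS() as ftp:
                ftp.connect(host, port, timeout=30)
                ftp.login(user, password)
                ftp.prot_p()  # encrypt the data channel as well
                with path.open("rb") as fh:
                    ftp.storbinary(f"STOR {path.name}", fh)
            return  # success on this server
        except ftplib.all_errors as err:
            last_error = err  # try the next server in the failover list
    raise RuntimeError(f"all FTP servers failed for {path}") from last_error
```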
  26. The files needed to be processed continuously for transcription as well as coding to meet the 6:00 am deadline. The transcription software had a built-in module to automatically download the audio files. It also had a module for file allocation, so once downloaded, the files were not physically moved; they could be accessed only through the software. This eliminated the effort required to send files to different resources and also eliminated the possibility of file loss. The coding software could easily interact with our transcription software as they were built on the same platform; the two could exchange information seamlessly, so files could move directly from transcription into coding.
  27. The coding software would export data as per the project requirement, i.e. one file containing the transcription text and open-ended codes for each audio file. Once files were exported to a specified location, an upload module would pick them up and upload them to the FTP site. Before uploading, it would also run an integrity check to verify that each file was in the correct format and had no information missing.
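As a sketch of that pre-upload integrity check, the following assumes a CSV export with one row per open end; the field names and layout are invented for the example, not the project's actual spec.

```python
import csv
from pathlib import Path

# Hypothetical export layout: one CSV row per transcribed open end.
REQUIRED_FIELDS = ["respondent_id", "question_id", "transcript_text", "code"]

def passes_integrity_check(export_file: Path) -> bool:
    """Reject exports with a wrong header or rows with missing values."""
    with export_file.open(newline="", encoding="utf-8") as fh:
        reader = csv.DictReader(fh)
        if not set(REQUIRED_FIELDS) <= set(reader.fieldnames or []):
            return False  # wrong format: expected header fields missing
        return all((row.get(field) or "").strip()
                   for row in reader
                   for field in REQUIRED_FIELDS)
```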
  28. Software was deployed at Synovate to download all files uploaded by the partners. Another program consolidated the downloaded files with the audio files, ensuring that all three pieces of information existed: the structured data, the transcription + coding file, and the audio file. The consolidated data was automatically uploaded to a website for the client.
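A minimal sketch of that three-way reconciliation per respondent follows; the directory layout and file extensions are assumptions for illustration.

```python
from pathlib import Path

def missing_pieces(respondent_id: str, base: Path) -> list[str]:
    """Return which of the three required pieces are absent for a respondent."""
    pieces = {
        "structured data":      base / "data"  / f"{respondent_id}.dat",
        "transcription+coding": base / "coded" / f"{respondent_id}.csv",
        "audio file":           base / "audio" / f"{respondent_id}.mp3",
    }
    return [name for name, path in pieces.items() if not path.exists()]

# Only respondents with an empty missing_pieces() list are consolidated and
# uploaded to the client website; the rest are flagged for follow-up.
```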
  29. All manual activities apart from CATI interviewing, transcription, and coding were automated. This ensured that the project could run each day without any human dependency or intervention for manual tasks, so that Eleanor and I could sleep peacefully every night.
  30. The software we developed had built-in modules to track productivity at each stage. For example, it kept track of when each audio file was downloaded and when it was allocated, and of the time taken by each transcriber for each file along with the duration of the file. Productivity was tracked similarly for coding. Once we analyzed the data, we realized we were spending a lot of time proofreading the transcribed data. Proofreading is a quality process in transcription that involves comparing the audio file with the transcribed text. Since the proofreaders were already going through the transcript and understanding what the respondent was saying, we decided to train our proofreaders in coding and integrate the transcription and coding software. This improved efficiency by 30%. We also realized that the allocation of files was key to good productivity. We modified our transcription software to classify audio files into categories based on duration; we could then route the bigger files to more experienced transcribers and the smaller files to less experienced ones. It also gave us an estimate of how much work was pending at any point in time, as we could check which categories of files were pending transcription or coding.
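The duration-based allocation idea can be sketched as follows. The thresholds are invented for the example, and the duration probe uses the third-party `mutagen` library as a stand-in for whatever the transcription software used internally.

```python
from pathlib import Path
from mutagen.mp3 import MP3  # third-party: pip install mutagen

def duration_category(mp3_file: Path) -> str:
    """Bucket an audio file by length so allocation can match experience."""
    minutes = MP3(str(mp3_file)).info.length / 60
    if minutes <= 2:
        return "short"   # suitable for newer transcribers
    if minutes <= 6:
        return "medium"
    return "long"        # routed to the most experienced transcribers
```

Counting pending files per category then gives the supervisor the work-in-queue estimate mentioned above.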
  31. The next few slides have snapshots of the software we developed. This one is the allocation module of the transcription software. The files are auto-downloaded and placed in this window. The supervisor can allocate files to the different resources present on that day (see extreme right). On the left-hand side you will see various modules; each module can be customized per requirement, so nothing is hard-coded.
  32. This is a snapshot of our coding software. It has features like built-in validations customized for each project. It interfaces with the transcription software and also allows proofreading and coding to be done simultaneously. The validations helped improve coding quality.
  33. We had half-hourly and daily productivity reports which were shared in real time with the executives working on the project. The supervisor would monitor productivity in real time and take corrective action if we were missing our hourly targets.
  34. The same was done for coding as well.
  35. As for the benefits: we met the client's timelines and quality requirements. Thanks to the automation of manual tasks and reports, the project was well managed. Because most operations ran on their own, we were not relying on people, and hence there was no inconsistency in operations. Transparency helped us improve our processes.
  36. So when we analyzed the data, we saw savings on resources as well as software. The estimated saving was $630,000 over three years:
Employee costs at Synovate – $300,000
Employee costs at Datamatics – $200,000
Software licensing costs – $130,000
Total – $300,000 + $200,000 + $130,000 = $630,000
  37. We managed to ramp up within a month without any quality issues.
  38. Our quality ratings improved over a period of time to 95% accuracy
  39. Not following the norm of standard processes helped us achieve substantial cost savings. Having real-time metrics in place is very important for process re-engineering initiatives. For large/long-term projects, the transaction costs of off-the-shelf software and resource costs contribute substantially to the overall cost of operations; these two areas need to be looked at closely. It is sometimes better to opt for a one-time technology cost that will offset the above costs in the long term. In our experience, using our own technology gave us more control over the project.
  40. After our experience with this project, we thought of developing a conceptual model that would serve as a solid approach to ensuring operational efficiency. It could be used by anybody to customize their operations for similar large or long-term projects. The model suggests a stepladder approach, with a series of activities in a particular order: the outcome of one activity is the input to the next, and hence a planned order is important.
  41. For any long-term/large project, the very first step is to understand and document the project scope and requirements. The parameters that need to be considered are:
Inputs/outputs – we first need to understand the various inputs and their formats, and the intermediate/final outputs expected.
Internal/external connections – this covers the transfer of information between different groups within the company and sources outside the company, i.e. clients, client partners, and our own partners. An understanding of the technology used by external partners is essential at this stage; for example, Synovate and Datamatics built such an understanding and aligned their respective technologies to work with each other.
Procedural complexity – the complexity of the process based on the number of inputs and outputs required, the roles of the different internal or external groups involved, and the number of changes expected.
TAT – the turnaround time for the project.
Quality – the quality standards expected by the client. This will have a bearing on the process design, as it may require a more stringent QC process; if you recollect our case study, we had a 100% QC check for transcription.
The idea behind this activity is to gather all information on the above parameters and prepare a project document. Depending on your specific project requirements there could be other parameters added to the list, but essentially this is an information-gathering stage, and all information and project requirements should be clearly specced out.
The next step is crucial: designing an efficient process that takes into account all parameters of the project document, ensuring the project runs at optimum efficiency and turnaround time. The end result of this stage is a process flow diagram, which maps the entire process from receipt of inputs to final output delivery, and a data flow diagram, which shows the flow of information files between the different processes/activities.
This is followed by a gap analysis, comparing the designed process with the existing process flow and identifying areas that will need different solutions from the existing ones available.
Once we have fixed the processes, we need to see what technology is best suited to the process. This could mean tweaking existing technology or building new technology. This is called technology-process alignment; we will be talking more about this later in the presentation.
It is also important that the technology deployed is aligned to the nature of the work and scalable for long-term organizational needs. Scalability is very important so the automation tools can be used for other projects as well, with some tweaking. We will look into this aspect in detail in the following slides.
The last step is to do a simulation exercise before the launch of the actual project.
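One way to make the 'project document' concrete is a small record type capturing the parameters just listed. This is an illustrative sketch; the field names and example values are ours, not part of the model itself.

```python
from dataclasses import dataclass, field

@dataclass
class ProjectDocument:
    """Information-gathering output of the model's first step."""
    inputs: list[str]            # input files and their formats
    outputs: list[str]           # intermediate/final deliverables expected
    connections: list[str]       # internal/external information flows
    procedural_complexity: str   # e.g. "high: 4 groups, frequent changes"
    turnaround: str              # e.g. "hot calls 30 min; rest by 6 am"
    quality_standard: str        # e.g. "100% QC check on transcription"
    extra_parameters: dict = field(default_factory=dict)  # project-specific
```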
  42. Now let's look in detail at how we can get the technology and process aligned. Once we have established an efficient process for a project, the next step is to align the technology to suit the process. The idea is to have different technology options and finally decide on the solution that brings savings in the 'total cost of operations' for the entire duration of the project. We can use the following parameters to decide on the best technology solution for the process:
1. Fit of technology with process and data flow – the technology should be able to support the processes and workflow we have defined. There could be technology solutions already available within the organization which can be tweaked to suit our purpose; otherwise we need to look for solutions outside, and we should be open to solutions from outside the MR domain.
2. Productivity – we need to determine metrics for estimating productivity and use them to make estimates for the different technology options available. For example, when we decided to use Fingerpost, we estimated the productivity for collection and distribution of files to increase by XX %, and by interfacing the transcription and coding software we estimated a 15% improvement in TAT.
3. Training – if we are deploying new technology, a training plan is needed to equip staff to handle the technology.
4. Monitoring performance – determine how productivity and quality will be monitored. If possible, the technology deployed should produce productivity reports. The metrics are very important to check effectiveness as well as to leave scope for further process re-engineering.
5. Resource planning – based on productivity estimates, determine the total number of resources required.
6. Costs – last but most important: what would be the total cost of operations? This includes the costs of buying/tweaking technology and resource costs.
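A toy comparison shows how 'total cost of operations' can drive the choice between technology options over a project's lifetime. All figures here are made up for the example; only the formula (one-time cost plus recurring costs over the duration) reflects the parameters above.

```python
def total_cost(one_time_tech: float, yearly_licences: float,
               yearly_resources: float, years: int) -> float:
    """Total cost of operations: one-time build cost plus recurring costs."""
    return one_time_tech + years * (yearly_licences + yearly_resources)

# Hypothetical three-year comparison of two options:
off_the_shelf = total_cost(0,       80_000, 150_000, years=3)  # 690,000
custom_built  = total_cost(120_000,      0, 110_000, years=3)  # 450,000

print(min(("off-the-shelf", off_the_shelf),
          ("custom-built", custom_built),
          key=lambda option: option[1]))  # -> ('custom-built', 450000.0)
```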
  43. It is also very important that the technology deployed is aligned to long-term organizational needs. It should not be used for just one project and then archived! The following parameters can help check alignment:
Adaptability – determine how adaptable the proposed technology is for future projects. We can also check whether any current projects could benefit from the proposed technology.
Flexibility – it should have room for tweaking and not be entirely hard-coded. This ensures that if there is any change in scope, it can be suitably modified to suit the changed requirements.
Scalability – it should handle an increase in the scale of operations. There should be no restrictions on the number of users or the number of data inputs/outputs.
Scope for further process re-engineering – this is very important, or else the technology will become a straitjacket for the process and any process re-engineering effort will be impossible. This happens when technology is hard-coded rather than parameter- or module-based. Recall the case study, where we could combine proofreading and coding because the software could be integrated.
  44. Finally, we would like to summarize by saying: optimum use of technology is only possible if you have an efficient process. To ensure an efficient process, it is important to first develop a complete understanding of the project requirements. Having a process and technology in place is only half the battle won; the metrics will win you the other half. Process re-engineering is very important to further refine the effectiveness of the process and improve your profits. Finally, it is important to partner with your service provider. This project was a success due to the partnership between Synovate and its partners/Datamatics, with each bringing different expertise to the table. Reducing the total cost of operations should be a joint effort.