
ESOMAR 37

Company Profile

1. What experience does your company have in providing online samples for market research? How long have you been providing this service? Do you also provide similar services for other uses such as direct marketing? If so, what proportion of your work is for market research?

I-Apac Online has been providing high-quality online sample since 2010. Our team includes experienced research staff, developers, executives, project managers and client service executives, and we provide industry-standard sample products and services for our clients. Our work covers online sampling, programming, hosting and respondent engagement for both qualitative and quantitative research, including but not limited to focus groups, discussion boards, online surveys, online polls and panel recruitment.

2. Do you have staff with responsibility for developing and monitoring the performance of the sampling algorithms and related automated functions who also have knowledge and experience in this area? What sort of training in sampling techniques do you provide to your frontline staff?

Our tech team constantly monitors the accuracy and feasibility of results to ensure they meet our high-quality standards. In addition, all our sample delivery systems are automated to eliminate human-induced errors.

3. What other services do you offer? Do you cover sample-only, or do you offer a broad range of data collection and analysis services?

In addition to online sampling, I-Apac Online also conducts online research, online focus groups, survey programming and more.

Sample Sources and Recruitment

4. Using the broad classifications above, from what sources of online sample do you derive participants?

I-Apac Online derives its participants from various sources of online sample. We recruit participants into our proprietary Powerr Sample panel (http://i-apaconline.com/powerr-sample.html), an effective in-house panel built for our clients' online research needs through advertising and targeting the right user groups via SNS, Google Ads and other acquisition sources.

5. Which of these sources are proprietary or exclusive and what is the percent share of each in the total sample provided to a buyer? (Assume proprietary to mean that the sample provider owns the asset. Assume exclusive to mean that the sample provider has an exclusive agreement to manage/provide access to sample originally collected by another entity.)

I-Apac Online provides 100% of its sample from our proprietary in-house panel, which we manage ourselves. We are able to meet all of our clients' research needs in the Asia-Pacific region using this in-house panel alone.

6. What recruitment channels are you using for each of the sources you have described? Is the recruitment process 'open to all' or by invitation only? Are you using probabilistic methods? Are you using affiliate networks and referral programs and in what proportions? How does your use of these channels vary by geography?

Our recruitment strategy is based on user acquisition through various organic channels such as SNS, Google and direct referrals. Recruitment is 'open to all' who pass our stringent quality assurance system. We do not use probabilistic methods or affiliate networks. In terms of geography, our channel use is the same everywhere except where social media access is limited.

7. What form of validation do you use in recruitment to ensure that participants are real, unique, and are who they say they are? Describe this both in terms of the practical steps you take within your own organisation and the technologies you are using. Please try to be as specific and quantify as much as you can.

All of our panellists require SMS verification as well as email verification before participating in any of our surveys. We use double opt-in to make sure panellists are joining our panel voluntarily. To prevent duplicate registrations, we employ techniques such as device fingerprinting and IP filtering.
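A duplicate-registration gate of the kind described can be sketched as follows. This is a minimal illustration only: the fingerprint fields, the hashing scheme and the blocking policy are illustrative assumptions, not I-Apac Online's actual implementation.

```python
import hashlib

def device_fingerprint(user_agent: str, screen: str, timezone: str) -> str:
    """Hash a few client attributes into a stable fingerprint (fields are illustrative)."""
    raw = "|".join([user_agent, screen, timezone])
    return hashlib.sha256(raw.encode()).hexdigest()

class RegistrationGate:
    """Reject sign-ups whose device fingerprint or IP was already seen (assumed policy)."""

    def __init__(self):
        self.seen_fingerprints = set()
        self.seen_ips = set()

    def register(self, user_agent: str, screen: str, timezone: str, ip: str) -> bool:
        fp = device_fingerprint(user_agent, screen, timezone)
        if fp in self.seen_fingerprints or ip in self.seen_ips:
            return False  # likely duplicate: block the registration
        self.seen_fingerprints.add(fp)
        self.seen_ips.add(ip)
        return True
```

In practice such a gate would sit alongside the SMS and email verification steps, with fuzzier matching than the exact-set lookups shown here.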

8. What brand (domain) and/or app are you using with proprietary sources? Summarise, by source, the proportion of sample accessing surveys by mobile app, email or other specified means.

We currently do not use any apps, but our panel and survey interface are optimized for smartphone users. Smartphone users constitute about 60-70% of our total panellists, followed by desktop and tablet users.

9. Which model(s) do you offer to deliver sample? Managed service, self-serve, or API integration?

We deliver our sample as a managed service. Once our clients define their project requirements, our skilled team manages every step of the research project, from sample design through launch and fieldwork management.

10. If offering intercepts, or providing access to more than one source, what level of transparency do you offer over the composition of your sample (sample sources, sample providers included in the blend). Do you let buyers control which sources of sample to include in their projects, and if so how? Do you have any integration mechanisms with third-party sources offered?

All our sample comes from a single panel, so there is no chance of duplication on our part. At present we do not let buyers control which sources of sample to include in their projects; however, we can customize to buyer needs whenever requested.

11. Of the sample sources you have available, how would you describe the suitability of each for different research applications? For example, Is there sample suitable for product testing or other recruit/recall situations where the buyer may need to go back again to the same sample? Is the sample suitable for shorter or longer questionnaires? For mobile-only or desktop only questionnaires? Is it suitable to recruit for communities? For online focus groups?

Our sample sources are equally suitable for many different types of research applications, including questionnaires of any length and on any device.

Sampling and Project Management

12. Briefly describe your overall process from invitation to survey completion. What steps do you take to achieve a sample that looks like the target population? What demographic quota controls, if any, do you recommend?

Our panellists are invited in quotas to participate randomly in a study according to our client's requirements. For demographic quotas such as sex, region and age, we make recommendations based on what the client is looking to achieve from the study.
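The quota-based random invitation described above can be sketched roughly as below. The panel structure, quota keys and cell definitions are illustrative assumptions for the sake of the example, not the actual sampling algorithm.

```python
import random
from collections import defaultdict

def draw_quota_sample(panel, quotas, seed=0):
    """Randomly draw panellists per demographic cell until each quota is filled.

    `panel` is a list of dicts with 'sex' and 'age_group' keys (illustrative keys);
    `quotas` maps (sex, age_group) cells to the number of completes wanted.
    """
    rng = random.Random(seed)
    shuffled = panel[:]
    rng.shuffle(shuffled)          # randomize invitation order
    filled = defaultdict(int)      # cell -> invitations issued so far
    invited = []
    for person in shuffled:
        cell = (person["sex"], person["age_group"])
        if cell in quotas and filled[cell] < quotas[cell]:
            filled[cell] += 1
            invited.append(person)
    return invited
```

A real system would over-invite each cell to compensate for expected non-response, rather than inviting exactly the quota count as shown here.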

13. What profiling information do you hold on at least 80% of your panel members plus any intercepts known to you through prior contact? How does this differ by the sources you offer? How often is each of those data points updated? Can you supply these data points as appends to the data set? Do you collect this profiling information directly or is it supplied by a third party?

All our members are required to provide basic profiling information such as name, age and place of residence. We also conduct regular attribute-acquisition surveys to enhance our member data. These data can also be used for survey targeting and, if required, can be appended to survey responses for delivery.

14. What information do you need about a project in order to provide an estimate of feasibility? What, if anything, do you do to give upper or lower boundaries around these estimates?

To provide the most accurate feedback on feasibility, we need the following information: sample size, deadline, and basic and/or detailed respondent attributes.

15. What do you do if the project proves impossible for you to complete in field? Do you inform the sample buyer as to who you would use to complete the project? In such circumstances, how do you maintain and certify third party sources/sub-contractors?

If a project cannot be completed using our proprietary panel, we inform our client and request approval to use a third-party partner. All our partners are meticulously selected and must meet our required quality standards.

16. Do you employ a survey router or any yield management techniques? If yes, please describe how you go about allocating participants to surveys. How are potential participants asked to participate in a study? Please specify how this is done for each of the sources you offer.

No, we do not employ a survey router.

17. Do you set limits on the amount of time a participant can be in the router before they qualify for a survey?

Not applicable; we do not employ a survey router.

18. What information about a project is given to potential participants before they choose whether to take the survey or not? How does this differ by the sources you offer?

Generally, we do not provide any information about a project to participants beforehand. Questionnaires are distributed randomly according to the target respondents and available quotas. To prevent response bias, we do not mention any details about a project other than publicly available information such as the survey name.

19. Do you allow participants to choose a survey from a selection of available surveys? If so, what are they told about each survey that helps them to make that choice?

Survey names, the number of questions and the number of reward points to be earned are clearly stated. If a survey requires any special notes for respondents, these are clearly mentioned on the cover page of the survey.

20. What ability do you have to increase (or decrease) incentives being offered to potential participants (or sub-groups of participants) during the course of a survey? If so, can this be flagged at the participant level in the dataset?

We do not offer the ability to increase or decrease incentives for particular groups of participants; this ensures fair compensation and unbiased responses.

21. Do you measure participant satisfaction at the individual project level? If so, can you provide normative data for similar projects (by length, by type, by subject, by target group)?

No, we do not measure participant satisfaction at the individual project level.

22. Do you provide a debrief report about a project after it has completed? If yes, can you provide an example?

No, we do not provide debrief reports.

Data Quality and Validation

23. How often can the same individual participate in a survey? How does this vary across your sample sources? What is the mean and maximum amount of time a person may have already been taking surveys before they entered this survey? How do you manage this?

This depends on survey characteristics and the client's requirements; we do not impose any participation limits of our own.

24. What data do you maintain on individual participants such as recent participation history, date(s) of entry, source/channel, etc? Are you able to supply buyers with a project analysis of such individual level data? Are you able to append such data points to your participant records?

We maintain comprehensive tracking data on individual participants. These data are used only for analytics and internal panel management and are generally not provided to customers.

25. Please describe your procedures for confirmation of participant identity at the project level. Please describe these procedures as they are implemented at the point of entry to a survey or router.

Each participant is assigned a unique ID when joining our panel. This unique ID is transmitted whenever the user participates in a survey, which allows us to track duplicates and confirm a particular participant's identity.
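Duplicate detection at the point of survey entry, as described above, reduces to checking whether a given unique ID has already entered a given survey. A minimal sketch, with assumed names and an in-memory store standing in for whatever backend is actually used:

```python
from collections import defaultdict

class SurveyEntryTracker:
    """Flag repeat entries of the same panellist unique ID into one survey (illustrative)."""

    def __init__(self):
        # survey_id -> set of unique IDs already admitted
        self.entries = defaultdict(set)

    def admit(self, survey_id: str, unique_id: str) -> bool:
        if unique_id in self.entries[survey_id]:
            return False  # duplicate entry attempt for this survey
        self.entries[survey_id].add(unique_id)
        return True
```

The same ID may of course enter different surveys; only re-entry into the same survey is blocked.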

26. How do you manage source consistency and blend at the project level? With regard to trackers, how do you ensure that the nature and composition of sample sources remain the same over time? Do you have reports on blends and sources that can be provided to buyers? Can source be appended to the participant data records?

We randomly select participants from our panel to maintain consistency and limit response bias. For trackers, we normally account for the dropout rate based on past performance data.

27. Please describe your participant/member quality tracking, along with any health metrics you maintain on members/participants, and how those metrics are used to invite, track, quarantine, and block people from entering the platform, router, or a survey. What processes do you have in place to compare profiled and known data to in-survey responses?

We conduct regular quality checks on member IDs returned as bad data by our customers. To ensure the quality of our panellist data, we frequently run quality-attribution surveys on such member IDs, and if a member does not meet our standards, we stop sending any further surveys to that member ID.

28. For work where you program, host, and deliver the survey data, what processes do you have in place to reduce or eliminate undesired in-survey behaviours, such as (a) random responding, (b) Illogical or inconsistent responding, (c) overuse of item nonresponse (e.g., Don't Know) (d) inaccurate or inconsistent responding, (e) incomplete responding, or (f) too rapid survey completion?

For surveys that we host, we clean the data according to our own cleaning standards. We also conduct the quality-check surveys mentioned in the previous question to weed out fraudulent and low-performing participants.
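Cleaning rules targeting the in-survey behaviours listed in the question (too-rapid completion, straightlining, etc.) typically look like the sketch below. The thresholds and rule names are illustrative assumptions, not I-Apac Online's published cleaning standards.

```python
def flag_response(answers, duration_seconds, median_duration):
    """Flag a completed interview under assumed cleaning rules.

    `answers` is the list of grid/scale responses; a complete is flagged as a
    'speeder' if it took under a third of the median interview length, and as a
    'straightliner' if every answer on a grid of 5+ items is identical.
    """
    flags = []
    if duration_seconds < median_duration / 3:
        flags.append("speeder")
    if len(answers) >= 5 and len(set(answers)) == 1:
        flags.append("straightliner")
    return flags
```

Flagged completes would then be reviewed or rejected before delivery, rather than dropped automatically.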

Policies and Compliance

29. Please provide the link to your participant privacy notice (sometimes referred to as a privacy policy) as well as a summary of the key concepts it addresses. (Note: If your company uses different privacy notices for different products or services, please provide an example relevant to the products or services covered in your response to this question).

http://i-apaconline.com/privacy-policy.html

30. How do you comply with key data protection laws and regulations that apply in the various jurisdictions in which you operate? How do you address requirements regarding consent or other legal bases for the processing personal data? How do you address requirements for data breach response, cross-border transfer, and data retention? Have you appointed a data protection officer?

We currently comply with all applicable international certifications and data protection laws. We have appointed a data protection officer to ensure compliance with the GDPR, CCPA, PIPA (Korea), PIPL (China), the IT Act (India) and other region-specific laws and regulations.

31. How can participants provide, manage and revise consent for the processing of their personal data? What support channels do you provide for participants? In your response, please address the sample sources you wholly own, as well as those owned by other parties to whom you provide access.

For each survey or study we conduct, if any personal information is to be collected, we disclose who will handle the data and clearly state the relevant contact information (including that of any third-party vendors used). For any other enquiries regarding personal information, our panel members can contact our data response team by email to have their data updated or revised.

32. How do you track and comply with other applicable laws and regulations, such as those that might impact the incentives paid to participants?

We comply with the laws and regulations of each country by checking them with our associated in-country partners.

33. What is your approach to collecting and processing the personal data of children and young people? Do you adhere to standards and guidelines provided by ESOMAR or GRBN member associations? How do you comply with applicable data protection laws and regulations?

We do not send surveys to children and young people directly. Instead, we contact the parents so that they can answer the survey together with their children.

34. Do you implement data protection by design (sometimes referred to as privacy by design) in your systems and processes? If so, please describe how.

All our systems are designed with data protection in mind. No personal information is accessed in day-to-day operations; in the rare cases where the need arises, only limited access is granted.

35. What are the key elements of your information security compliance program? Please specify the framework(s) or auditing procedure(s) you comply with or certify to. Does your program include an asset-based risk assessment and internal audit process?

As stated above, we currently operate in compliance with all existing laws and regulations regarding data protection and security.

36. Do you certify to or comply with a quality framework such as ISO 20252?

As mentioned above.

Metrics

37. Which of the following are you able to provide to buyers, in aggregate and by country and source? Please include a link or attach a file of a sample report for each of the metrics you use.

  1. Average qualifying or completion rate, trended by month
    • 2022-01 30.21%
    • 2022-02 29.75%
    • 2022-03 31.57%
    • 2022-04 25.12%
    • 2022-05 22.10%
    • 2022-06 27.67%
  2. Percent of paid completes rejected per month/project, trended by month
    • 2022-01 0.10%
    • 2022-02 0.08%
    • 2022-03 0.17%
    • 2022-04 0.21%
    • 2022-05 0.16%
    • 2022-06 0.35%
  3. Percent of members/accounts removed/quarantined, trended by month
    • 2022-01 0.09%
    • 2022-02 0.18%
    • 2022-03 0.76%
    • 2022-04 0.42%
    • 2022-05 0.27%
    • 2022-06 0.13%
  4. Percent of paid completes from 0-3 months tenure, trended by month
    • 2022-01 25%
    • 2022-02 22.20%
    • 2022-03 28.66%
    • 2022-04 34.21%
    • 2022-05 26.99%
    • 2022-06 19.43%
  5. Percent of paid completes from smartphones, trended by month
    • 2022-01 77.42%
    • 2022-02 50.67%
    • 2022-03 85.10%
    • 2022-04 57.09%
    • 2022-05 54.23%
    • 2022-06 58.60%
  6. Percent of paid completes from owned/branded member relationships versus intercept participants, trended by month
    • 2022-01 46.10%
    • 2022-02 52.38%
    • 2022-03 61.61%
    • 2022-04 59.30%
    • 2022-05 55.43%
    • 2022-06 41.65%
  7. Average number of dispositions (survey attempts, screenouts, and completes) per member, trended by month (potentially by cohort)
    • 2022-01 10
    • 2022-02 8
    • 2022-03 13
    • 2022-04 16
    • 2022-05 9
    • 2022-06 12
  8. Average number of paid completes per member, trended by month (potentially by cohort)
    • 2022-01 4
    • 2022-02 3
    • 2022-03 5
    • 2022-04 7
    • 2022-05 2
    • 2022-06 5
  9. Active unique participants in the last 30 days

    Unique participants: 2,006,036

  10. Active unique 18-24 male participants in the last 30 days

    Unique 18-24 male participants: 175,683

  11. Maximum feasibility in a specific country with nat rep quotas, seven days in field, 100% incidence, 10-minute interview

    Max Feasibility China, South Korea and India: 1,650,590

  12. Percent of quotas that reached full quota at time of delivery, trended by month
    • 2022-01 9.36%
    • 2022-02 19.06%
    • 2022-03 19.06%
    • 2022-04 11.77%
    • 2022-05 16.80%
    • 2022-06 13.18%