
ESOMAR 37

INTRODUCTION TO THE ESOMAR 37 QUESTIONS TO HELP BUYERS OF ONLINE SAMPLES

ESOMAR is the global organization that strives to enable better quality research of markets, consumers and societies. As part of its guidelines for conducting research online, ESOMAR publishes a series of questions designed to help market researchers when purchasing online samples and conducting online research. 

Essential to the Invito platform are the quality standards and controls built into its processes for conducting research. With Yamada’s marketplace platform, online sample buyers have full control over their fieldwork, whether they choose to do ‘self-service’ (DIY) or retain the use of Invito project managers who have been trained in research quality control. All systems and procedures adhere to the standards set forth in Yamada’s accreditation to ISO 20252, which includes Yamada’s focus on transparency to sample buyers. 

This document demonstrates how Yamada’s unique approach to undertaking online market research meets and often exceeds the requirements set by ESOMAR.

1. What experience does your company have in providing online samples for market research? How long have you been providing this service? Do you also provide similar services for other uses such as direct marketing? If so, what proportion of your work is for market research? 

Ans. Yamada Research is a full-service research agency that has been operating since 2009. Since then, the company has provided online research services and conducted fieldwork in Asia, the UK, Europe, and further afield. 

Yamada currently only supplies samples for market research purposes and does not have any plans to change this position. 

2. Do you have staff with responsibility for developing and monitoring the performance of the sampling algorithms and related automated functions who also have knowledge and experience in this area? What sort of training in sampling techniques do you provide to your frontline staff? 

Ans. Yamada does not employ any sampling algorithms; instead, a team of project managers performs all sampling by hand. A thorough, documented three-month training plan is in place for new employees joining the sample management team to make sure they are familiar with our procedures. 

In addition to the initial training programme, Yamada runs regular re-training and refresher courses to make sure that best practices are upheld. 

3. What other services do you offer? Do you cover sample-only, or do you offer a broad range of data collection and analysis services? 

Ans. In addition to providing samples, Yamada Research offers a field-and-tab service for ongoing projects. It also provides advanced data analysis using various statistical methods, including cluster analysis and segmentation, regression analysis and multi-factor analysis, as well as full analysis of the results with a final report produced and presented. 

4. From what sources of online samples do you derive participants? 

Ans. Where a requirement falls outside the reach of Yamada Research's own capabilities, we periodically obtain samples from dependable partners so that we can still offer a comprehensive solution. For several types of project, including employee surveys and consumer satisfaction polls, we also use data lists provided by the client. 

5. Which of these sources are proprietary or exclusive and what is the percent share of each in the total sample provided to a buyer? 

Ans. Because of the breadth and responsiveness of our membership, Yamada Research can rely almost entirely on its own panel to supply samples for online studies across the world. The Yamada Research member database is our sole and exclusive property. We rarely need to request sample contributions from outside partners, which streamlines the research process and improves the overall integrity of the data. 

6. What recruitment channels are you using for each of the sources you have described? Is the recruitment process ‘open to all’ or by invitation only? Are you using probabilistic methods? Are you using affiliate networks and referral programs and in what proportions? How does your use of these channels vary by geography? 

Ans. To reduce any bias that may result from relying on a particular recruitment source, the specific mix of affiliate channels varies frequently. We do not employ probabilistic methods, and the recruitment process is ‘open to all’. 

7. What form of validation do you use in recruitment to ensure that participants are real, unique, and are who they say they are? 

Ans. Automated checks compare submitted information against the current database and known bad actors, using measures such as cross-referencing IP addresses and digital fingerprinting software. Additionally, thorough checks are performed across a range of different sources, such as data lists and voter registries, to make sure that the physical details provided match real people. 
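For illustration only, a recruitment-time duplicate check of this kind can be sketched as follows; the field names, and the idea of a single device-fingerprint hash, are assumptions rather than a description of Yamada's production systems.

from dataclasses import dataclass

@dataclass(frozen=True)
class Registration:
    email: str
    ip_address: str
    device_fingerprint: str  # hash produced by a fingerprinting library

def is_suspect(new: Registration,
               existing: list[Registration],
               blocked_fingerprints: set[str],
               blocked_ips: set[str]) -> bool:
    """Return True if a registration matches a known bad actor or an existing member."""
    if new.device_fingerprint in blocked_fingerprints or new.ip_address in blocked_ips:
        return True  # matches a known bad actor
    for member in existing:
        # The same fingerprint, or the same email address, suggests a duplicate account.
        if new.device_fingerprint == member.device_fingerprint:
            return True
        if new.email.lower() == member.email.lower():
            return True
    return False

A flagged registration would typically go to manual review rather than being rejected outright, since a shared IP address (a household or an office) is not on its own proof of fraud.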

8. What brand (domain) and/or app are you using with proprietary sources? 

Ans. Yamada Research is accessed via our member websites, www.yamadaresearch.com and www.ymdonline.com, and via email invitations. 

9. Which model(s) do you offer to deliver a sample? Managed service, self-serve, or API integration? 

Ans. We value good customer service, so we deliver our sample solely as a managed service. This enables us to make sure that every survey is sampled with the care and consideration it deserves. 

10. If offering intercepts, or providing access to more than one source, what level of transparency do you offer over the composition of your sample (sample sources, sample providers included in the blend)? Do you let buyers control which sources of samples to include in their projects, and if so how? Do you have any integration mechanisms with third-party sources offered? 

Ans. Source blending occurs only when an external source is purchased to fill a particular gap or specialised need. For studies that include multiple waves or are longitudinal in nature, sample planning at the beginning of the project maps our partner contributions to ensure a consistent sample composition across all waves. Only sample providers who have been verified through the Yamada Research supplier onboarding system are used; we do not allow buyers to choose the sources of their samples. When a third-party sample is added to a project, the respondents are immediately screened using the same quality procedures as our proprietary sample, including scripts specifically designed to find and eliminate source overlap. 

11. Of the sample sources you have available, how would you describe the suitability of each for different research applications? For example, is there a sample suitable for product testing or other recruit/recall situations where the buyer may need to go back again to the same sample? Is the sample suitable for shorter or longer questionnaires? For mobile-only or desktop-only questionnaires? Is it suitable to recruit for communities? For online focus groups? 

Ans. We allow clients to acquire PII from respondents (with their consent) for recall studies, or, if an unforeseen recontact is necessary, our management system can identify respondents who have participated in prior research. Our members are used to participating in a wide range of survey lengths and types, including being recruited for both online and offline qualitative research such as focus groups, communities, and depth interviews. The majority of the surveys we conduct are device-neutral; however, we can detect the device a respondent is using to complete a survey and either provide advice or offer an alternative. 
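As a simple illustration of that device-recognition step (not our production logic), the device type can be classified from the HTTP User-Agent string:

def classify_device(user_agent: str) -> str:
    # Very coarse classification based on common User-Agent markers.
    ua = user_agent.lower()
    if "ipad" in ua or "tablet" in ua:
        return "tablet"
    if "mobile" in ua or "iphone" in ua or "android" in ua:
        return "mobile"
    return "desktop"

# Example: a desktop-only survey could show advice or offer an alternative to mobile respondents.
if classify_device("Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X)") == "mobile":
    print("This survey works best on a desktop browser; an alternative is available.")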

12. Briefly describe your overall process from invitation to survey completion. What steps do you take to achieve a sample that “looks like” the target population? What demographic quota controls, if any, do you recommend? 

Ans. The sample for each survey is selected using the criteria in the survey's profile, along with other considerations such as the fieldwork time available and expected response rates. The most efficient approach to sample selection is determined at the project's inception by establishing feasibility and incidence. Exclusions can be made according to the survey's topic, how frequently people have participated, or any other criterion as needed. 

When selecting a sample, or purposefully excluding part of one, our systems automatically extract at random those members who meet all profiling requirements. Before our systems are permitted to carry out the bulk distribution of survey invitations, this process is subject to quality assurance checks to ensure that the correct sample requirements and expectations are met. 
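A minimal sketch of that selection step is shown below; the member records and field names are hypothetical, and the real systems work against the full panel database.

import random

members = [
    {"id": 1, "country": "TH", "age": 34, "gender": "female", "opted_out_topics": []},
    {"id": 2, "country": "TH", "age": 52, "gender": "male", "opted_out_topics": ["alcohol"]},
    # ... rest of the panel ...
]

def select_sample(panel, criteria, n, excluded_topics=()):
    """Randomly draw n members who meet every profiling requirement."""
    eligible = [
        m for m in panel
        if all(m.get(field) == value for field, value in criteria.items())
        and not any(topic in m["opted_out_topics"] for topic in excluded_topics)
    ]
    random.shuffle(eligible)  # random extraction rather than first-come-first-served
    return eligible[:n]

invitees = select_sample(members, {"country": "TH", "gender": "female"}, n=500)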

Invitations to surveys are usually sent by email and are also dynamically displayed on each invitee's home page on the member website. Additionally, we can send SMS alerts to members who have submitted their mobile phone number and have triple opted in. 

13. What profiling information do you hold on at least 80% of your panel members plus any intercepts known to you through prior contact? How does this differ by the sources you offer? How often is each of those data points updated? Can you supply these data points as appends to the data set? Do you collect this profiling information directly or is it supplied by a third party? 

Ans. Each member has the option of completing more than 800 profiling fields in total, although this is not required. Every member is reminded to complete their profile and to refresh any sections that are more than six months out of date. 

This aids feasibility and sampling while ensuring optimal accuracy at all times. Where no appropriate profiling is in place, we use our MiniPoll™ tool, which enables questions to be released in a matter of seconds and can serve to quantify incidence or act as a pre-screening method for targeting specialised sample profiles. Additionally, we frequently write custom pre-screening programmes to meet the needs of unusual or complicated sample requirements that cannot be adequately met by core profiling alone. We can append non-special-category or PII information to final data sets or integrate it into the client survey URL. 

14. What information do you need about a project in order to provide an estimate of feasibility? What, if anything, do you do to give upper or lower boundaries around these estimates? 

Ans. In order to provide a quote, we need the following information: the number of completes required, the target specification and incidence rate, the intended fieldwork timetable, survey quotas, the length of the survey, and any technical details that might affect fieldwork. Where the incidence rate is unknown, we can use our MiniPoll™ system to collect this information within 24 hours of the initial quote. 

15. What do you do if the project proves impossible for you to complete in field? Do you inform the sample buyer as to who you would use to complete the project? In such circumstances, how do you maintain and certify third-party sources/sub-contractors? 

Ans. In the few instances where Yamada Research is unable to complete a project in the field to the original project specification, third-party providers are engaged to support the completes. While the identity of third-party suppliers is not disclosed, the plan will take into account any preferences or vendors that a client may have. Just as when a third party is included in the initial quote, the suppliers used are automatically filtered through the same quality standards as our proprietary sample, including proprietary scripts to identify and filter potential overlap between sources. 

16. Do you employ a survey router or any yield management techniques? If yes, please describe how you go about allocating participants to surveys. How are potential participants asked to participate in a study? Please specify how this is done for each of the sources you offer. 

Ans. We don’t use survey routers. 

17. Do you set limits on the amount of time a participant can be in the router before they qualify for a survey? 

Ans. Not applicable. 

18. What information about a project is given to potential participants before they choose whether to take the survey or not? How does this differ by the sources you offer? 

Ans. All participants are informed of the survey's length and topic, the reward they will receive for completing it successfully, the time remaining before the survey goes offline, and the recommended equipment for taking part. Any additional instructions specific to a given survey can also be shown, and these are reviewed with the client prior to implementation. In addition to survey-specific material, every email communication we send includes a link to our privacy policy, our terms and conditions, and a way for members to unsubscribe from our panel. 

19. Do you allow participants to choose a survey from a selection of available surveys? If so, what are they told about each survey that helps them to make that choice? 

Ans. Each member is individually selected to participate in a survey based on their eligibility and profiled information, and an invitation is then sent to them by email. Members can also view every survey they have been invited to, along with the contents of each invitation email, in one place on the member website. 

20. What ability do you have to increase (or decrease) incentives being offered to potential participants (or sub-groups of participants) during the course of a survey? If so, can this be flagged at the participant level in the dataset? 

Ans. We can alter the incentives offered while a survey is in field, and we can identify in the final dataset the incentive that each participant saw. This is normally only employed when in-field measurements deviate significantly from the original brief or when raising the incentive is necessary to improve feasibility. 

21. Do you measure participant satisfaction at the individual project level? If so, can you provide normative data for similar projects (by length, by type, by subject, by target group)? 

Ans. Every survey ends with a feedback section, which we use to build a database of average ratings based on variables such as survey length, research design, and subject matter. 

22. Do you provide a debrief report about a project after it has completed? If yes, can you provide an example? 

Ans. A typical debrief includes information on incidence rates, quota-full and screen-out levels, and any anecdotal feedback from respondents, which we receive for most surveys. On some projects the report is provided routinely; on others it is provided on request and sent by email. 

23. How often can the same individual participate in a survey? How does this vary across your sample sources? What is the mean and maximum amount of time a person may have already been taking surveys before they entered this survey? How do you manage this? 

Ans. Although we are careful not to over-invite our members, we do not impose a strict cap on the number of surveys a member can complete in a given time frame. For longitudinal studies we often employ a three-month lock-out to prevent a respondent from participating in multiple waves of the study during that period. Shorter or longer lockout periods can be implemented for particular projects on a case-by-case basis. Lockouts can also be applied based on a topic, or on exposure to specific content, within a predetermined amount of time. 
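A hedged sketch of that lockout check follows; the 90-day default and the field names are illustrative assumptions rather than fixed rules.

from datetime import date, timedelta

DEFAULT_LOCKOUT = timedelta(days=90)  # roughly the three-month lock-out used for longitudinal studies

def is_locked_out(last_participation, today, lockout=DEFAULT_LOCKOUT):
    """True if the member took part too recently and should not be re-invited."""
    if last_participation is None:
        return False
    return (today - last_participation) < lockout

# Example: a member who completed wave 1 on 1 June would be excluded from a wave sent on 15 July.
print(is_locked_out(date(2024, 6, 1), date(2024, 7, 15)))  # True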

24. What data do you maintain on individual participants such as recent participation history, date(s) of entry, source/channel, etc.? Are you able to supply buyers with a project analysis of such individual level data? Are you able to append such data points to your participant records? 

Ans. We keep track of every panel member's involvement in surveys individually. Each email received, each click-through to the survey, each entry into the survey, each exit from the survey, and the source of the panellist are all tracked for quality assurance and reporting purposes. 

We can provide this information upon request if clients need it reported in reference to their projects. 

25. Please describe your procedures for confirmation of participant identity at the project level. Please describe these procedures as they are implemented at the point of entry to a survey or router. 

Ans. Every online survey we host is put through rigorous data integrity procedures, regardless of whether we also supply the sample. Any survey data that does not meet our data integrity standards is excluded from the survey results. Panelists who repeatedly supply poor-quality data have their accounts automatically deleted so that they are not invited to future surveys; such respondents cannot redeem any rewards currently in their account, and no further rewards are paid to them. We also employ several anti-fraud detection technologies at the point of registration, at survey entry, and elsewhere, in order to actively identify potential rogue respondents and remove them from the panel. Where appropriate, profile information is also compared against survey results. 

26. How do you manage source consistency and blend at the project level? With regard to trackers, how do you ensure that the nature and composition of sample sources remain the same over time? Do you have reports on blends and sources that can be provided to buyers? Can source be appended to the participant data records? 

Ans. All additional respondents from third parties are obtained from our approved list of vendors, each of whom has their own exclusive sample and adheres to the same values as we do. 

Each third-party respondent is tagged with their original source, and they go through the same masking and quality control procedures as our own members as they enter our system. This tag can be used to impose stringent quotas and so ensure a precise sample composition from each source. Sources are not normally appended to the final data and their names are not disclosed to clients, although clients can ask that certain sources not be used. 
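For illustration only, per-source quota control using such a tag can be as simple as the sketch below; the quota values and source names are hypothetical.

from collections import Counter

quotas = {"yamada_panel": 800, "partner_a": 150, "partner_b": 50}
counts = Counter()

def admit(respondent_source):
    """Admit a respondent only if their source has not yet filled its quota."""
    if counts[respondent_source] >= quotas.get(respondent_source, 0):
        return False  # quota full, or source not on the approved list: screen out
    counts[respondent_source] += 1
    return True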

27. Please describe your participant/member quality tracking, along with any health metrics you maintain on members/participants, and how those metrics are used to invite, track, quarantine, and block people from entering the platform, router, or a survey. What processes do you have in place to compare profiled and known data to in-survey responses? 

Ans. We use custom scripts to track the quality of each member's responses. The scripts examine the consistency of answers given in surveys hosted internally, comparing them to answers given in prior surveys, to profile information, and to surveys from which the member was removed for poor data quality. A respondent is removed from our panel if we detect too many quality problems. 

28. For work where you program, host, and deliver the survey data, what processes do you have in place to reduce or eliminate undesired in-survey behaviours, such as (a) random responding, (b) illogical or inconsistent responding, (c) overuse of item nonresponse (e.g., “Don’t Know”), (d) inaccurate or inconsistent responding, (e) incomplete responding, or (f) too rapid survey completion? 

Ans. Our data team performs all data checks at the conclusion of fieldwork using a combination of semi-automated and manual procedures. An automated script estimates the likelihood of poor data based on responses to grid or rotated statement questions, survey completion time compared with the norm, responses that are inconsistent both within the survey and against answers given in previous surveys, and profile data. Flagged responses are then manually reviewed alongside additional checks that are harder to automate, such as illogical open-ended comments; a respondent who shows a pattern of poor responses may be flagged for removal from the panel. 
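As a simplified illustration of that automated step, the sketch below flags speeders and grid straight-liners; the thresholds and inputs are assumptions, not our actual rules.

from statistics import median

def flag_poor_quality(completion_seconds, panel_completion_times, grid_answers,
                      speeder_ratio=0.4, straightline_ratio=0.9):
    """Return the list of quality flags raised for one respondent."""
    flags = []
    norm = median(panel_completion_times)
    if completion_seconds < speeder_ratio * norm:
        flags.append("speeder")  # too rapid survey completion
    if grid_answers:
        most_common = max(set(grid_answers), key=grid_answers.count)
        if grid_answers.count(most_common) / len(grid_answers) >= straightline_ratio:
            flags.append("straightliner")  # flat-lining through a grid question
    return flags

# A respondent who finishes in 3 minutes against a roughly 12-minute norm and gives the
# same answer to all 10 grid items would be passed to manual review.
print(flag_poor_quality(180, [700, 720, 760, 690], [3] * 10))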

29. Please provide the link to your participant privacy notice (sometimes referred to as a privacy policy) as well as a summary of the key concepts it addresses. 

Ans. The Yamada Research privacy policy can be found at www.yamadaresearch.com/privacy-policy and is accessible via all pages of the www.yamadaresearch.com membership website. All new registrants to the website must confirm acceptance of this policy as well as our Terms and Conditions before being able to submit their registration. 

30. How do you comply with key data protection laws and regulations that apply in the various jurisdictions in which you operate? How do you address requirements regarding consent or other legal bases for the processing of personal data? How do you address requirements for data breach response, cross-border transfer, and data retention? Have you appointed a data protection officer? 

Ans. Yamada Research works with a data protection officer who provides information and guidance on data protection regulations in the various areas in which we operate and helps ensure that we abide by all of them. Senior staff members and the DPO meet once a month to make sure everything is kept up to date. Any member may email the DPO or the Yamada Research support team to exercise their right to be forgotten, to request access to their personal information, or to have their responses to a specific survey deleted. All such requests are handled under the applicable data protection legislation. 

31. How can participants provide, manage and revise consent for the processing of their personal data? What support channels do you provide for participants? 

Ans. The profile section of our website allows members to manage all of the information they provide to us. Additionally, members can contact our support team via email if they need any help. 

32. How do you track and comply with other applicable laws and regulations, such as those that might impact the incentives paid to participants? 

Ans. We are members of ESOMAR and stay current with any publications or recommendations they produce to ensure that we always abide by the most recent laws and regulations. 

33. What is your approach to collecting and processing the personal data of children and young people? Do you adhere to standards and guidelines provided by ESOMAR or GRBN member associations? How do you comply with applicable data protection laws and regulations? 

Ans. We have used our sub-panel of more than 50,000 children under the age of 16 for hundreds of online and offline projects with children between the ages of 6 and 15. For offline studies, parental oversight of the research process is required and we only use Yamada-checked personnel on such projects. 

Parental permission is also required before conducting online surveys with minors, and we never contact children directly. Children's surveys are conducted in accordance with ICC/ESOMAR best practice. All information submitted by children is handled with the same levels of care and integrity that we apply to protecting the identities of our panelists and their survey responses. 

34. Do you implement “data protection by design” (sometimes referred to as “privacy by design”) in your systems and processes? If so, please describe how. 

Ans. All of Yamada Research's systems implement privacy by design and adhere to the most recent GDPR and data privacy and protection requirements. Yamada Research employees are prevented from accessing member PII and sensitive information, and have access only to the data necessary for their daily tasks. 

To prevent personal information from being linked to survey data or to the surveys a member has participated in, all members are identified anonymously by an ID that is then encrypted and masked for each survey they take part in, both internally and externally. 

When a project is hosted internally, members of the data team who work with the live survey data have access only to the encrypted, survey-specific ID and cannot link it back to the member ID. This ensures that they are unable to connect any survey response to a specific member, or to responses given in any other survey. 
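A minimal sketch of how such a survey-specific pseudonym could be derived is shown below; this is not the actual implementation, and the key handling is purely illustrative.

import hashlib
import hmac

def survey_pseudonym(member_id, survey_id, secret_key):
    """Derive a stable, survey-specific ID that cannot be reversed or linked without the key."""
    message = f"{survey_id}:{member_id}".encode()
    return hmac.new(secret_key, message, hashlib.sha256).hexdigest()

key = b"example-key-held-outside-the-survey-data-store"  # assumption: managed separately from survey data
print(survey_pseudonym("member-001", "survey-2024-117", key))
print(survey_pseudonym("member-001", "survey-2024-118", key))  # same member, different pseudonym in another survey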

35. What are the key elements of your information security compliance program? Please specify the framework(s) or auditing procedure(s) you comply with or certify to. Does your program include an asset-based risk assessment and internal audit process? 

Ans. As a holder of ISO 9001:2015 accreditation, Yamada Research adheres to guidelines for both digital and physical security. All information and project materials submitted by Yamada Research members and clients are stored on secure servers that only authorised personnel can access, and these servers are used solely to manage member accounts and surveys. All data submitted by members via the Yamada Research website is protected using Extended Validation SSL, which encrypts the contents of the browser session and safeguards the integrity of the data exchanged between the user's browser and our systems. 

Yamada Research servers are physically and electronically protected from unauthorized or public access using the most up-to-date security measures, including but not limited to firewalls, data encryption, IP-based permissions, CCTV, and swipe entry access control. The same degrees of physical protection and authorized access apply to data backups. 

36. Do you certify to or comply with a quality framework such as ISO 20252? 

Ans. Our organization has maintained its ISO 9001:2015 certification every year since 2009. Additionally, we are working towards the ISO 20252:2006 Market Research Quality Standard. 

Our ISO-approved quality management systems are built on the principles of efficient data management, storage, and security. Because these systems are continuously reviewed and refined, they remain in conformity with external auditing criteria at all times. 

37. Which of the following are you able to provide to buyers, in aggregate and by country and source? 

Ans. Yamada uses a variety of reporting and business intelligence tools. Data and metrics can be exported for analysis and are available upon request. Metrics can be aggregated by client, supplier, geography, etc. for a range of indicators such as completes, conversion rates, price, and cost. 
