
Darsee Heath

September 9, 2017
Dr. Albers
ENGL 7702
Journal Analysis

Evaluation of Research Methods within Articles Published in IEEE Transactions on Professional Communication

As the field of technical communication (TC) continues to emerge, an analysis of the research being conducted in the field is warranted. This evaluation takes an in-depth look at three consecutive issues of a TC journal, IEEE Transactions on Professional Communication, volume 59, issues 1-3, and compares and contrasts the research methodologies used in the research articles. The analysis aims to summarize the methods used, identify the types of studies performed, and examine them as a whole.

The three issues examined cover a range of subjects prevalent in the TC field, from content strategy to credibility to in-group dynamics. Each article followed the journal's formatting rules and explicitly stated its research problem, methods, literature review, and conclusions. Because only the research articles were examined, quantitative research was heavily represented, particularly surveys used to collect data. A progression was evident: the articles in issue one relied on literature reviews and textual analysis, issue two favored mixed-methods approaches, and the articles in the third issue were heavily quantitative in their methods.

Both articles in issue one depend heavily on textual analysis. Clark's "Content Strategy: An Integrative Literature Review" sought to determine how content strategy is defined, a theme of this particular issue, and used a literature review as his primary means of answering his research question. He constructed explicit parameters and applied them throughout the review. The same can be said of Getto and Labriola's article, the other feature in issue one: they also included a literature review, but they supplemented their findings with a case study conducted through interviews. They built their evidence by adopting a flexible methodology (43), using a content model based on the case study and shaped by the data sources and study participants. Unlike Clark, Getto and Labriola found the inclusion of interviews key to their study, as it provided new understanding.

Issue two's focus was not as singular as the first; its research articles centered more on data-seeking initiatives than on a single topic, though each article still provided an explicit literature review despite being more empirical in its methods. Issue one's use of literature as the answer to its research questions, rather than merely an aid in starting the research, stands in stark contrast to issue two. The first article in issue two, Mackiewicz, Yeats, and Thornton's "The Impact of Review Environment on Review Credibility," sought to determine how product reviews affect potential consumers and the credibility of user ratings. Because they focused not only on hard data points, i.e., positive and negative numbered ratings, but also on consumer opinions, they selected a mixed-methods approach, carried out through the creation and use of an online survey and participants' evaluations of reviews. The researchers' use of Monte Carlo simulation (78) gave them a way to measure the gathered responses. Opinion and perception were major factors in their selection of methods and in how they translated the textual analysis into data points. The same reasoning appears in Ada et al.'s research article, but taken further, with greater focus on empirical methods and the inclusion of interviews alongside a survey-based study. The survey served as their means of collecting quantitative variables, while the interviews were used to validate the survey questions and provide qualitative backing. Ada et al. also tested the survey and modified and refined it based on student feedback (94), a step not taken by all researchers in this issue.

The mixed-methods approach of issue two continues in Cleary's article on technical communicators in Ireland and how they operate as a community in the field. This study differs from the previously mentioned articles in that it is exploratory in nature; as Cleary states, "no previous studies have examined the Irish context for technical communication" (129). The lack of prior research presented the researcher with a unique opportunity to gather a multitude of data and evidence. A questionnaire was drafted and designed for online survey use, a focus group was conducted to help "cross-validate survey data, and see richer attitudinal and behavioral information" (129), and face-to-face interviews were conducted as a follow-up to the online discussions. The opinion-based nature of this study lent itself to a qualitative design, with the added bonus of quantitative data points by way of job titles, gender, and age.

Quantitative, data-heavy research with a virtual focus appears in the last article of the issue, "The Impact of Virtual Customer Community Interactivity on Organizational Innovation: An Absorptive Capacity Perspective." Here Roberts and Dinger introduce hierarchical regression to test the hypotheses of their survey-based research design. They relied heavily on statistical techniques to find the effects of virtual customer communities and relationships. Two surveys were used to collect data points measuring "absorptive capacity, firm performance, environmental dynamism, firm age, firm size, and industry type" (118). These data points extend further than those in other studies, such as Cleary's, which collected only more person-specific information, e.g., age.

As seen in the aforementioned research, surveys are heavily relied upon. Every research article in issue three used a survey as one of its main methods of answering the proposed research questions. Because surveys can be created inexpensively, in many instances for free, and distributed via email, it makes sense that this method is often chosen, especially when connecting with people on the other side of the world. Such was the case for Paul et al., who sought participants in the U.S. and India, making the Internet a key component of their research and of how they would obtain information. This presents an interesting challenge, as it requires all methods to be web-friendly. Their quantitative research relied upon surveys to provide data for evaluating global virtual team performance. The choice of quantitative methods stemmed from the researchers' desire for common elements that could be generalized and for a controlled approach that "provided a rich data set that allowed [them] to generalize [their] findings" (191). The coordination and effectiveness of the study are evident in their selection of methods that could be controlled via the Internet while also yielding data useful for answering the research questions.

Paul et al. were not the only researchers with a global focus; Wong et al. studied participants in China. In their quantitative study, a survey was used not only to collect data but also to create a panel database, with the survey serving as a means of verifying the research model (235). As with Paul et al., this study sought survey results that could be generalized. This desire to generalize is found in many of the quantitative research articles, including Plotnick et al.'s, the third article in the third issue. They explain their decision to make the study quantitative rather than qualitative as a way to reach diverse, larger populations and to avoid the confusion of details specific to certain participant teams, as in a case study (211).

Fuller et al., the last article in the third issue, followed the same pattern of survey-based data collection but also employed qualitative methods, including observation and the coding of interactions between participants. The researchers selected a quantitative quasi-experimental design because it gave them access to the differences in communication process characteristics and perceptions they sought to study, and because it allowed them to "control for other individual characteristics as well as team-level characteristics" (174). This control factor can be compared to the way the other three research articles sought control through generalized data results.

IEEE Transactions on Professional Communication's focus is applied research in professional communication, and that focus is evident in the articles presented in the three issues discussed above. The ten articles all followed a similar structure and fit within the applied research to which the journal is devoted. The articles in each issue all included literature reviews, showing a lean toward textual analysis and opinion-based judgment (as in the parameters selected for the literature reviews). A mix of qualitative, quantitative, empirical, and opinion-based research was present across the research articles. Surprisingly, many were focused solely on quantitative data, with surveys as their main method of obtaining results, an interesting takeaway from this analysis.
Works Cited
Ada, S., Sharman, R., Han, W., & Brennan, J. A. (2016). Factors Impacting the Intention to Use Emergency Notification Services in Campus Emergencies: An Empirical Investigation. IEEE Transactions on Professional Communication, 59(2), 89-109. doi:10.1109/tpc.2016.2527248

Clark, D. (2016). Content Strategy: An Integrative Literature Review. IEEE Transactions on Professional Communication, 59(1), 7-23. doi:10.1109/tpc.2016.2537080

Cleary, Y. (2016). Community of Practice and Professionalization Perspectives on Technical Communication in Ireland. IEEE Transactions on Professional Communication, 59(2), 126-139. doi:10.1109/tpc.2016.2561138

Fuller, R. M., Vician, C. M., & Brown, S. A. (2016). Longitudinal Effects of Computer-Mediated Communication Anxiety on Interaction in Virtual Teams. IEEE Transactions on Professional Communication, 59(3), 166-185. doi:10.1109/tpc.2016.2583318

Getto, G., & Labriola, J. T. (2016). iFixit Myself: User-Generated Content Strategy in "The Free Repair Guide for Everything". IEEE Transactions on Professional Communication, 59(1), 37-55. doi:10.1109/tpc.2016.2527259

Mackiewicz, J., Yeats, D., & Thornton, T. (2016). The Impact of Review Environment on Review Credibility. IEEE Transactions on Professional Communication, 59(2), 71-88. doi:10.1109/tpc.2016.2527249

Paul, R., Drake, J. R., & Liang, H. (2016). Global Virtual Team Performance: The Effect of Coordination Effectiveness, Trust, and Team Cohesion. IEEE Transactions on Professional Communication, 59(3), 186-202. doi:10.1109/tpc.2016.2583319

Plotnick, L., Hiltz, S. R., & Privman, R. (2016). Ingroup Dynamics and Perceived Effectiveness of Partially Distributed Teams. IEEE Transactions on Professional Communication, 59(3), 203-229. doi:10.1109/tpc.2016.2583258

Roberts, N., & Dinger, M. (2016). The Impact of Virtual Customer Community Interactivity on Organizational Innovation: An Absorptive Capacity Perspective. IEEE Transactions on Professional Communication, 59(2), 110-125. doi:10.1109/tpc.2016.2561118

Wong, L. H., Ou, C. X., Davison, R. M., Zhu, H., & Zhang, C. (2016). Web 2.0 and Communication Processes at Work: Evidence From China. IEEE Transactions on Professional Communication, 59(3), 230-244. doi:10.1109/tpc.2016.2594580
