Tuesday, 17 May 2016

What Am I Supposed To Do With This? - Analytics

This week's topic is Social Media Analytics. My reflection is based on the article titled "10 Ways to Turn Social Media Data into Smart Data." In the article, Matejic (2016) gave a condensed overview of why and how you should use social media. She put forward that social media is essentially a "gold mine" of information. You can aggregate and exploit this information in order to increase revenue, at no extra cost (except the cost of your time and effort).

My favourite part of the article was her advice - advice that she is giving to people who are serious enough about business communication to be reading the article in the first place.

She stated:

"4. Likes, followers and raving fans mean nothing if they aren’t converting."

She continued on to explain that there is a difference between a high turnover of clicks and actual engagement that translates to sales. The reason I enjoyed reading her advice was because I realised that very same thing in my diary blog reflection just last week. My realisation was that "likes" are a superficial and unreliable measure when it comes to engagement.

Moving on, since I will miss the tutorial this week, I figured that I would try to play around with Radian6. Disappointingly, I couldn't for the life of me log into it. So I turned to trying to make sense of the Google Analytics data for my Public Relations Planning and Evaluation blogs. What did I learn? Not much! The stats showed me how many page views I'd had and where the views were coming from. Being the kind of person I am, I sadly couldn't appreciate this.

While it's nice to know that 13 page views originated in Mauritius, I tend to appreciate the process more than the outcome. I like to know why and how things happen. Why are people in Mauritius reading my blog? What are they searching? Or maybe they aren't needing to do a search at all. Perhaps they are just clicking 'next blog' and Google is randomly selecting the next blog for them to view. Why Mauritius then, and not Germany?

I am sure that all these questions could be answered, but answering them would require the collection of even more data that could add context. Things like: what kind of people look up PR blogs in Mauritius? Are they students? From what institutions? What are their interests? What else do they search? All of that would then only satisfy one question - why are people in Mauritius reading my blog - but what about the rest of my questions? It's never-ending!

I suddenly realised why search engines are inundating us with cookies and trackers that note down and store everything we do online. It stems from a quest for information. Once you learn a little bit you want to learn a little bit more, but along the way you end up learning a bunch of other things before reaching the original thing that you set out to investigate. Before long, you end up obsessed with wanting to know as much as possible about anything that is even remotely relevant, because it adds depth. Eventually, instead of answering the question you set out to solve, you end up creating several new ones for yourself.


I know this because it has just happened to me. If you'll recall, all I wanted to do was understand how to interpret data. I consulted Google Analytics and that only evoked a greater curiosity in a topic that had well and truly begun to veer away from my original question. Regrettably, I come back to this: I still have no idea what I am supposed to do with Google Analytics. Except now I am potentially worse off, because now I'd like to understand how and why I have a tiny fan base of Mauritian readers (an exaggeration, I know). Anyway, if you are in Mauritius and you are reading this, please comment because I'd really like to know!



Wednesday, 11 May 2016

Measuring and Evaluating Public Relations Efforts


This week's topic was an introduction to evaluating and monitoring Public Relations efforts. The areas of the text (Macnamara 2014) which appealed to me the most were those that discussed opinion, advocacy and engagement. Why? I think it has something to do with my childhood dream of being an incredibly famous actress. I hate money. I need the thing, but I hate it. So the hunger for fame was not based in the desire for fortune, but in the desire for influence.



Opinion and Advocacy 

In terms of opinion and advocacy, I agree with the view posed by Neuendorf (2002), which I have interpreted as a belief that data (an opinion) is rich with symbolism and underlying meanings that have been sourced from other aspects of life. This means that content which could simply be analysed as 'negative' from a PR perspective is actually a complex mixture of influences, consisting of many ideas, values and beliefs which contribute to the on-screen, face-value opinion that one is left with.

Essentially, what this means to me as a future PR officer, is:


"There's always more to what you see, than what you can see."


For that reason - don't underestimate the depth and complexity of an opinion, because even the most educated or informed perspectives are riddled with unintentional biases. It's part of being human and it just can't be helped. Our opinions will always be tainted by indirectly-relevant matters, and negative past experiences can be hard to override. However, I believe that a company's greatest critic has the potential to be its greatest advocate.



Engagement

Now this is where the real reflection happened. The only way to turn people around is through engagement. I agree with (and appreciate) Macnamara's (2014) attention to the facts that:

I. Engagement should not be a buzzword (right on);

II. True engagement is profound and actually requires psychological depth (yes!);

III. It's about commitment, absorption and participation (PREACH IT).


So I read on... and then didn't actually see a suggestion on how to measure engagement.

Luckily, Patel (2016) supplied the "how to engage" aspect. In retrospect, it was both obvious and subtle. Patel outlined three ways to engage your audience:

  • By personally responding to comments - which requires commitment on your part.
  • Joining conversations - which requires your passion, enthusiasm or energy towards the topic.
  • And by mentioning people - which requires participation on your part, and also invites others to participate.


So, how do you engage your audience? By being engaged, yourself! How do you measure it? Well, I figure that someone who couldn't even care enough to argue is completely disengaged, because they don't exhibit any of the elements of engagement: no commitment, participation or energy towards the topic. But comments to and fro - whether in argument or discussion - are a definite sign of engagement. Both parties are committed, both are investing energy towards the topic and both are participating. Therefore, the elements of engagement are: commitment, energy and participation.



How then, would I measure engagement?

1. I would personally choose to measure true engagement through repeat comments - whether "positive", "neutral" or "negative".

2. I would also measure and evaluate engagement through the number of overall comments (evaluates participation and energy).

3. Replies to a topic or comment on a single thread (evaluates commitment and energy).

4. How many people are mentioned/tagged in comments (evaluates participation).
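Out of curiosity, the four measures above could be rolled into a single rough engagement score. The following is only an illustrative sketch in Python - the weights, field names and sample data are all my own invention, not drawn from any of the readings or from any real analytics tool:

```python
# Hypothetical engagement score based on the four measures above.
# Weights and the comment-record shape are invented for illustration.
def engagement_score(comments):
    """comments: list of dicts with 'author', 'thread_id' and 'mentions' keys."""
    total = len(comments)  # measure 2: overall comments (participation, energy)
    authors = [c["author"] for c in comments]
    # Measure 1: repeat commenters, regardless of sentiment
    repeat = sum(1 for a in set(authors) if authors.count(a) > 1)
    # Measure 3: replies within a single thread (commitment, energy)
    threads = {}
    for c in comments:
        threads[c["thread_id"]] = threads.get(c["thread_id"], 0) + 1
    replies = sum(n - 1 for n in threads.values() if n > 1)
    # Measure 4: people mentioned/tagged in comments (participation)
    mentions = sum(len(c["mentions"]) for c in comments)
    return 3 * repeat + 1 * total + 2 * replies + 1 * mentions

sample = [
    {"author": "amy", "thread_id": 1, "mentions": ["ben"]},
    {"author": "ben", "thread_id": 1, "mentions": []},
    {"author": "amy", "thread_id": 2, "mentions": []},
]
print(engagement_score(sample))  # → 9
```

In practice the weights would need calibrating against real campaign data; the point is only that each of the four measures maps onto something countable.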

Friday, 29 April 2016

How to: Make Your Community Feel Valued

The Power of Communities

This week the topic of discussion was based on community engagement. We had a guest speaker from a local council present to our class. She shared with us success stories of community engagement and why it is so important. There were many things that I deeply enjoyed about the presentation - to the extent that it sparked an interest in me, as something I should consider trying my hand at later in my own career. We were given handouts on "Increasing the level of public impact." The handout illustrated the IAP2 Public Participation Spectrum. There are 5 stages of public participation:

1. Inform
2. Consult
3. Involve
4. Collaborate
5. Empower

Now I was learning all these things and starting to feel motivated about making changes in my own community, but then I heard this: "It's not about reaching a consensus." 

When I heard this, I just felt so insulted and irritated. Isn't the entire point of community engagement, to achieve an agreement of some sort? To negotiate, compromise and then reach a common understanding, which at least satisfies the needs of all parties? It's cruel and even offensive to inspire people to waste their breath for the sake of being heard, but not listened to. Hearing is superficial, but can be helpful. However if people have been invited to speak - then do more than hear them, actually listen, or the spectrum quickly turns into this:

1. Inform
2. Consult
3. Ignore
4. Overrule
5. Demoralise

After the presentation, I read this week's reading by Hartz-Karp (2005), which shed some light on the situation and taught me that community engagement is essentially "deliberative democracy". The three key elements of deliberative democracy, which apply also to community engagement, are:
  • Influence
  • Inclusion
  • Deliberation
Influence is the first fundamental step, because opinions must have the capacity to influence the process - or why bother in the first place? Inhibiting an opinion from influencing discussion essentially violates one of the primary purposes of community engagement. Furthermore, the final step in deliberative democracy requires "movement toward [a] consensus" (Hartz-Karp 2005), which means that the intention to justify non-consensus is based in error and self-interest. I say this because while I was reading Hartz-Karp (2005, 1), I stumbled upon this:

"We need to reinforce that we are a democracy, the problems confronting government are the problems of the community and we have to work together to solve them." 

I reflected on this statement and came to understand the problem with government and why consensus is not perceived as a necessity. 

The government is self-interested. If the government's primary interest was honestly the community's primary interest, then the necessity of consensus would be undeniable. Instead, it seems like people are the tools that facilitate things for the government, when it is the government who should facilitate things for the people. That quote should not say that "the problems confronting government are the problems of the community."

Since when does life revolve around the government? Life has always revolved around the community first, and the attitudes should be reversed. "We [the community] need to enforce that we are a democracy, the problems confronting [the] community are the problems of the government", it shouldn't be the other way around. We are more than just tools for the government to use when it suits them, and community engagement IS about consensus. If it is not - then you are doing it wrong.

Saturday, 16 April 2016

Fly or Flop - The Role of the Moderator

Week 6 | Getting to know your audiences: Interviews & focus groups


During this week's tutorial, the class was divided into 3 groups. In our groups, we had to prepare a structure which we would use to manage a focus group. Upon reflection, I considered a number of things. Firstly, what we could have done better; secondly, the role of the moderator and how it influenced the general focus group dynamic; and lastly, I realised that we were almost part of a focus group double-layer. Pre-assigned students played moderator roles, then Veronica re-assumed the moderator's role during group-to-group transitions. It was nearly (or was) a focus group within a focus group!



What we could have done better:

- Stacks and Michaelson (2010) say that questions need to become more specific as time advances. I recognise now that this would have enabled us to draw more relevant answers from the group. In turn, this would have allowed us to accurately fulfil our objective.

- The responses made it apparent to me that our questions were based in our own biases. We had formed questions that assumed price was the factor which influenced student spending habits the most - when it turned out to be quality which most affected their decisions to buy food on campus.

- The seating. I regret that I didn't ask to pause our time so that we could reformat how we were seated. I am well used to circular seating and I understand how fruitful it is because of my history with drama and acting. The difference was significant once we were all seated in a circle.



The Role of the Moderator:

Some moderators addressed the group only briefly, while others remained the prominent moderator figure. In no particular order, I picked up on these things:

- Moderator A was very definitive. Discussion was always kept on topic and within scope. Overall, the process was clear, highly coherent and flowed logically from start to end. It was evident that this was due to the moderator's focus. Austin and Pinkleton (2015) have identified that focus is a key to refining information, which is why of all the moderators - I believe moderator A would be the one to produce the best information.

- Moderator B posed a thought provoking question but it unfortunately stunted discussion. I suspect that the reason for this is because moderator B had been taking notes and analysing responses in depth. Therefore, moderator B was more deeply involved than most participants. This may have caused the question to seem like it came from nowhere. Perhaps a short recap could have helped contextualise the question before asking it. This would have given most, a chance to 'catch up'.

- Moderator C was able to expand on the question swiftly when faced with a barrier. Participants did not know how to respond to a very general question. Moderator C was able to elaborate without giving away the details that would lead to the development of preconceived notions on the topic. Moderator C talked too much, however. It should have been 80% participants and 20% moderator speaking.

- Moderator D was very passive and softly spoken, which enticed people to contribute. Some of the quieter students spoke up the most for moderator D. The moderator was prompt in moving to the next question, but to some extent this impeded conversation. This moderator also carried the least expression while speaking. The combination of tone and promptness made it seem like our responses were, to some extent, unimportant.

- Moderator E had a very conversational tone, which made them approachable. They were very relaxed in posture, and the informality caused their question to seem like one based in interest - when in fact it was fundamental to their objective. This seemed to work in their favour though since we all felt comfortable responding honestly. I would be curious to see moderator E facilitate a controversial topic.

- Moderator F was very expressive in tone and was good at drawing insightful responses from the group. Moderator F conveyed a definite air of direct questioning. Consequently, individuals tended to address their answers specifically to moderator F, rather than engaging with each other. Maybe this is how it is supposed to be? Dialogue is one of the advantages of qualitative research, after all (but I can't remember which reading mentioned it).

In summary, I have seen first-hand how a moderator affects group dynamics. The success or failure of a focus group really depends on the moderator's skills. In future, if I ever find that I do not have the budget for a highly experienced moderator, then I am confident that I can hire one based on what I need and the moderator's personal attributes.



Sunday, 10 April 2016

The Benefit of Qualitative Research

In week 5, the Commissioner of the West Australian Electoral Commission (WAEC) kindly gave up his time to talk about challenges facing the WAEC and to answer questions that would assist us in our next assignment. Appropriately, the week's topic was "The power of observation."

A summary of Stacks' (2010) 'Practitioner's Guide to Public Relations Research' outlines that the intent of qualitative research is to bring forth attitudinal and behavioural insight through detailed responses. The qualitative research methodologies discussed covered in-depth interviews, participant observation and focus groups. Upon reflection, I cannot identify which of the methodologies applied to the visit, except that participant observation is not applicable.

Reasons why the Commissioner's visit does not fall under the category of a focus group:

  • The overall purpose of the discussion was not focused on the exploration of the class' opinions
  • While the class had a common background, it cannot particularly be said that the topic was a shared interest
  • Individuals were not present voluntarily


Reasons why the Commissioner's visit did resemble a focus group:

  • Veronica played the role of the moderator
  • It can generally be said that it was a controlled group discussion
  • The Commissioner posed the question: what could we suggest the WAEC do differently in order to encourage our demographic to enrol to vote? He openly sought our opinions and discussion, which are elements of a focus group

Why the visit does not seem like an in-depth interview

  • An in-depth interview is one-on-one; his visit was not a one-on-one environment
  • Questions had not been sent to him prior to his attendance, so for the most part they may have caught him by surprise
  • The location was student domain. An in-depth interview typically takes place in a location where the interviewee is most comfortable, or at least neutral (but never in the interviewer's company office, for example)
  • The diversity of students led to diverse questions that at some stages also seemed to be off-track

Why aspects of the visit seemed like an in-depth interview

  • The questions drew rich responses
  • Some students were able to use funnel questions

In sum, I greatly benefited from his visit. I had prepared a couple of probe questions as defined by Stacks (2010), which I unfortunately did not get to ask because there was too little time. What I would have liked to ask the Commissioner was why it was so significant to him personally to capture the votes of 18-25 year olds - other than the fact that it was his job to do so.

Part of me tells me that I simply enjoy asking the hard questions that make people uncomfortable, but the other part of me knows that everything about his tone, reaction, body language and expression would have pinpointed his exact attitude towards us, which would have given me a great head start on making recommendations. Not that the words he used weren't telling enough - but that is something you simply can't acquire through a survey.

Well, at least I got to ask how old the social media lady was. Call it what you will; the fact is that nothing draws out the insight of a person's physical cues as much as an unexpected question. In case you were wondering - yes, it was a setup; I knew the answer before I asked it. In fact, what I keep coming back to is that when you're trying to get through to people, just speak how they speak and the task will get a whole lot easier. If anything, it was the Commissioner's visit that proved it to me.



Tuesday, 29 March 2016

Now Let's Talk About Me.

Week 4 was all about surveys, polls and questionnaires. Admittedly, I wasn't looking forward to this week's activities. As soon as anyone says the word 'compulsory' I immediately seem to harbour resistance. Nonetheless, I filled my quota. First, I went through the list of surveys in the thread and scanned all the titles. I wasn't sure where to start, so I decided that I'd start with the surveys that looked MOST interesting/relevant to me. Doing this highlighted the prime importance of having an appealing survey title; otherwise you simply don't get the respondent numbers that you might hope for. In all honesty, I didn't even bother with survey topics that I wasn't interested in - EXCEPT one! Why? Reflecting revealed to me that it was the catchy title that sucked me in. Therefore, lesson learned.


In regards to all of the surveys, areas that were generally done well:

  • Most topics were highly relevant to Bentley students that study internally.
  • Surveys were based on 'hot topics' or themes that are currently trending
  • Well-formatted surveys, clear structures (most, if not all, used SurveyMonkey)
  • Most people used more than one style (multiple choice, rating, net promoter score)

Areas that could generally be improved:

  • Few titles were enticing for less popular topics
  • Logic was unclear, it felt like the survey was asking questions for the sake of asking questions
  • General survey feedback was positive but not very critical or constructive
  • Some survey titles needed to better reflect the subject of the survey
  • More opportunities for participants to freely express an opinion

Areas that were generally absent:

  • A survey description where the titles were not very specific
  • Expectation-setting at the start of the survey (I looked at, and did, 6 of 7 surveys and I think only 2 had it)



'Let's Talk About You' 
(A Reflection On My Survey)


I created mine - remembering what I liked and didn't like from the other surveys. The strategy for my survey was:

1. To attract a high number of respondents
2. To raise awareness
3. To inform 

Objectives were:
1. To create a title that appealed to the audience ('generation me')
2. To understand existing awareness 
3. To convey specific information 


I met these objectives by thinking from an organisational perspective. I chose the Student Wellbeing Advisory Service as the program I pretend to work for. The tactic was to get participants talking about their feelings first, then seamlessly introduce them to what I am trying to raise awareness of. In only one day I have had 5 responses. That is a good indicator that students are doing my survey because they are interested in the topic, rather than because it's compulsory. This means I am more likely to get accurate information from respondents.

When I designed the survey, I intentionally used a variety of the formats at my disposal, because I wanted to gain a more in-depth understanding of my audience (it helps create a holistic picture). However, I only used those which were obviously beneficial. For example, the multiple-choice question about study load would help me benchmark which student type is more likely to be highly stressed or at risk of compromising their wellbeing (overloaders?). This would help me target that group specifically when attracting students to the use of the service later on. Some designs were not suitable, however. For example, a net promoter score is not something you would use when gauging awareness, because it would be inaccurate information coming from a person who didn't know about or has not used the service. The insight from such a question would be irrelevant and misleading. It would disrupt my results.

Overall, I am satisfied with my survey and the results. My only concern now is that of 4 participants, only 1 has left feedback on my actual survey. While I love positive feedback (which it was), there was nothing highlighted for me to improve - which becomes problematic now as I write my blog, because it is harder for me to identify what I could have done better.



Friday, 11 March 2016

A View From The Receiving End

What Makes a Strong Business Case for a Public Relations Program? This week's unit materials covered a range of details in relation to creating a business case for a Public Relations program. The emphasis (especially in the reading) was on why it is so important to have a strong business case - and also what the characteristics of one actually are. I have worked in the retail end of a major telecommunications corporation for four years, so I already understand the basics of why it is so important to have a strong business case; it's practically obvious. In addition, I am usually on the receiving end of internal Public Relations (PR) programs. Even as I write, there is an organisational incentive movement at my workplace that is aimed at achieving three things:

1. Saving the company thousands by reducing employee errors to 0% and emphasising organisational resources that will help us perform our tasks correctly.



2. Changing employee attitudes and behaviours through empowerment (training) and accountability (KPIs) - encouraging a 'get it right the first time' approach to everything we do and measuring us on it.



3. Improving the organisation's image with consumers through the seamless experience of having services with us.



I understood that I was subject to a PR program, but I had only identified the goals that I outlined above; I didn't understand how the rest had come to be. Austin and Pinkleton (2015) clarified for me, however, that the first thing my company must have done was conduct quantitative research - sourced from internal data recorded and kept by us. They would then have discovered the key issues that were contributing to unnecessary expenses.

"My understanding of 'researching', is that it is
a process of conducting a thorough investigative inquiry
into a situation, the factors that make up
or influence said situation, and the
influential or relevant entities that
are present within the situation."


Research revealed an area of opportunity that was impeding organisational success. Once the relevant information had been collated, they defined the target audience and formulated goals. So how did our PR team develop a strong business case for the employee PR program? Strong research was their foundation. The goals, execution and (ongoing) evaluation naturally followed in chronological order. In the case of my company, it is highly unlikely that executives would have approved a nation-wide campaign without knowing that it would yield a return on investment. Now that I know how to analyse our own programs, I am confident that our organisation's performance will lift and that I may even have a very good chance at being a part of that PR team some day.



The reading, and the recognition of its real-life application within my organisation, have made me appreciate the value of quantitative research more than I did before. Quantitative research is difficult to contradict and critical to an organisation. The only problem (I feel) is that it can be shallow, in that it does not understand or reflect motive. Qualitative data, however, can go beyond face value, causing the why and how behind an action to surface. Personally, I employ both techniques when I am trying to get my own way in anything I do.

I am such an advocate for qualitative research, however, because everything to do with people is subjective. People are not 'fixed', but ever-changing. I also happen to be doing a degree that is designed to specialise in just that - people. Thus I have reached the conclusion that qualitative research is the one for me, because it gets to the root of things. I know, however, that the method I employ will be influenced by the nature of the issue and industry, and that I will undoubtedly have to employ and value quantitative research should I secure a position in the PR department of my organisation.