Friday, December 24, 2010
Here's Ben "Bad Science" Goldacre presenting on how non-evidence-based medicine can kill you and thousands of your friends.
Ben's book, titled Bad Science, is great as well, and it comes highly recommended.
He had to remove a chapter to avoid litigation from Matthias Rath, who travelled to South Africa and widely publicised claims that vitamins can cure AIDS, claims that aligned neatly with the South African AIDS denialists.
Ben posted that chapter on his website, and it may be one of the most important things ever written in the area of critical thinking. Lack of responsible treatment for AIDS kills hundreds of thousands of people in Africa alone.
When people like Ben win, lives are saved. The more people who know about him, the better. He's a true hero of skepticism.
Wednesday, December 22, 2010
Researchers from the University of Birmingham in England collected health data for 1,300 people over a 14-year period. They also gave them each a stress test, which determines how a person's blood pressure and heart rate respond to stressful situations.
The more a person's vital signs remained unchanged during the stress test, the more likely they were to develop depression and obesity over the next five years -- and the more likely they were to report they were in poor overall health.
So, the saying shouldn't be "serenity now, insanity later"; rather, it's more like "serenity now, total physiological malfunction later."
Take some time to develop acceptable ways to release your emotions on a regular basis and stop carrying this added burden of emotional baggage.
Let it go! But don't aim it at anyone!
Saturday, December 18, 2010
Thursday, December 16, 2010
More information here at www.google.com/postini
Online attacks can be multimodal, in the sense of targeting multiple systems for maximum impact, such as the financial system (the stock exchange), physical plant (the control systems of a chemical, nuclear or electric plant), or mobile communications (mobile-phone message routers).
The trend toward supporting corporate applications on employee-owned notebooks and smartphones is already under way in many organisations and will become commonplace within four years.
The Apple iPad is the first of what promises to be a huge wave of media tablets focused largely on content consumption, and to some extent communications, rather than content creation, with fewer features and less processing power than traditional PCs and notebooks or pen-centric tablet PCs.
Friday, December 10, 2010
With the continual emphasis on freemium business models and the 'give it away' culture, journalist and broadcaster Guy Clapperton delivers a digital reality check in which he asks us the famous question: show me the money!
He has been a freelance journalist in the technology and business arenas since 1989. You might have seen him on the BBC, read him in the Guardian, the Times, the Sunday Telegraph or most of the broadsheet national newspapers.
He has been freelance since 1993 and this has taught him, no matter what the superficial appeal of a business or technology idea, to sanity-check it for potential profits every time. Read more on his Alumni profile.
Watch Guy's Insight in HD, along with other videos and his speaker bio.
Thursday, December 9, 2010
In the latest edition of the Lab Matters video series, Ryan Naraine talks with Eugene Kaspersky about the state of the malicious Web and the evolution of malware from:
- intrusion: viruses and worms, through
- cybercrime: botnets, to
- cyberwarfare: Stuxnet and beyond.
Ernst & Young figures that something like two-thirds of major global enterprise businesses (the so-called Fortune 500) now publish a corporate sustainability or corporate responsibility report of some type. But what information should be included in or excluded from these disclosures? How often should they be published? Who should contribute information? Should financial analysts be explicitly briefed, especially since more of them apparently include these considerations in company valuations?
Those are among the broader questions explored in an Ernst & Young report called “7 Questions CEOs and Boards Should Ask About ‘Triple Bottom Line’ Reporting.”
The thing is, even though the Securities and Exchange Commission is asking public companies to be more forthcoming about environmental risks and the Federal Trade Commission is cracking down on greenwashing, most of these reports are released voluntarily.
According to the report, here’s the risk of keeping triple bottom line reporting — information about how a company’s environmental and social activities impact the planet, people and profits — to yourself:
“Companies that do not release sustainability information may appear less transparent than competitors that do, coming across as laggards even if they aren’t. And those that report incompletely, or with insufficient rigor, may find that if reporting becomes mandatory and standards are tightened, glaring discrepancies may appear between past reports and newer ones. All of these factors have created momentum in the direction of more openness and more reporting.”
After reading the report and the questions it poses, I have these seven observations on best practices for crafting corporate sustainability or responsibility reports. I encourage you to download the entire Ernst & Young analysis, though, because it will really help your team start asking the right questions — regardless of whether or not you are already disclosing this information.
- Study your industry sector to understand which of your competitors are doing this.
- Understand the viewpoint of your major shareholders or institutional investors. According to Ernst & Young, one big resource to consult is the United Nations Principles for Responsible Investment.
- Probably the most widely used framework for triple bottom line reporting today is the Global Reporting Initiative (GRI) Reporting Framework, which suggests information that should be included and how it should be presented. The GRI reporting framework is relatively mature: it currently is undergoing its third revision, and some of the latest updates are expected in early 2011.
- Make the collection of information you need for corporate sustainability reporting a part of core processes — and job mandates. Otherwise the data could be cumbersome to gather or it simply will not be a priority for your executive team.
- Consider getting a third party to verify the information you are reporting. Even though this isn’t necessarily required today, it will demonstrate a higher level of transparency plus you may get some valuable feedback on things your company can do better.
- Be careful about how you disclose data from division to division or business unit to business unit. The people reading this report will naturally be inclined to make comparisons and if the information is reported differently, your message could appear disjointed. This also harkens back to the first point on this list: You need to understand how your competitors are talking about similar information — otherwise the wrong conclusions could be drawn.
- Don’t forget to use the information you gather for this reporting exercise as real key performance indicators that can help your company become more efficient overall.
Finally, here are four reports that Ernst & Young suggests consulting for great ideas in corporate sustainability reporting:
Tuesday, December 7, 2010
This "silver tsunami" has received a mixed response in the workplace. On the one hand, many employers have been slow to adapt to the changing needs of older workers and perceive them as costly and troublesome to hire and retain.
Data shows that people over the age of 55 find it harder to land jobs than their younger counterparts, even though age discrimination is illegal in many countries.
On the other hand, some enlightened companies are working to recruit, retrain and otherwise engage these 'older' workers.
These workers bring a lifetime of skills to their jobs and can be highly motivated and productive members of the workplace.
Many of the stereotypes that prevent employers from hiring and making good use of older workers are mere myths and bigotry.
Here are some frequent myths:
Myth. Older workers cost more than younger ones and are less productive on the job.
Reality. Both concerns are untrue. While older workers may take longer to recover from injuries, studies show that they use fewer sick days on the whole than their younger counterparts.
Private health-care costs are actually lower for older workers because they no longer have small children as dependents.
When it comes to job performance, older workers frequently outdo their younger colleagues. Older workers have lower absenteeism, lower turnover, superior interpersonal skills, and deal better with customers. The evidence is overwhelming: older workers perform better on just about every level.
Myth. People at or near retirement age tend to lose interest in their jobs.
Reality. Studies find the opposite to be true. In a report titled "Working in Retirement: A 21st Century Phenomenon," the Sloan Center on Aging & Work at Boston College found that those who worked past retirement age became more, rather than less, engaged and satisfied with their jobs.
Au contraire: contrary to the belief that older workers resist learning new things, older workers ranked "job challenge and learning" as a top source of satisfaction with their work.
Myth. Older workers in the workforce keep younger ones from getting jobs.
Reality. While it may be "a widespread belief that you have to get older people to retire to open up the career ladder and jobs for young people," the opposite again is true.
Ignorance of this fact caused many French college students to join the massive street protests last fall against raising the retirement age from 60 to 62.
Policies in countries that encourage workers to retire early actually have a damaging impact on youth employment.
This is because the growing number of retirees forces governments to finance their rising pension costs by raising taxes, which causes employers to scale back hiring or pay workers less.
In such cases, employers don't want to hire more young employees. The old notion of a fixed sum of jobs is just absolutely wrong.
Many myths about older workers reflect 20th century views of retirement that have proved to be short-lived.
Historically, the idea of people working full-time and then stopping completely is an anomaly.
The notion of retiring at age 65 came in with the Social Security system and employer-based pensions. But full retirement was never what most employees wanted. What they want is to keep working in some fashion. They want to change the way they work, but not stop altogether.
Monday, December 6, 2010
Games like World of Warcraft give players the means to save worlds, and incentive to learn the habits of heroes. What if we could harness this gamer power to solve real-world problems? Jane McGonigal says we can, and explains how.
Reality is broken, says Jane McGonigal, and we need to make it work more like a game. Her work shows us how.
Sunday, December 5, 2010
Below are links to several s+b articles which suggest that the pressure on retailers during the next few years, stemming from changes in technology and customer habits, will be even tougher to deal with than the financial pressures of the recent past.
Perhaps the biggest change is the rise of mobile commerce [http://www.strategy-business.com/article/00053?gko=9bcc1]: a world where most shoppers have smartphones and the store boundaries have eroded.
New academic studies [http://www.strategy-business.com/article/re00122?gko=ab138] confirm that online commerce will indeed permanently cannibalize bricks-and-mortar sales.
Big changes are also coming in luxury retail [http://www.strategy-business.com/article/00033?gko=82498] and in a general movement towards minimising the provision range to create simpler, less overwhelming choices [http://www.strategy-business.com/article/00046?gko=13ead].
Consumer habits are going through a "spend shift" [http://www.strategy-business.com/article/00054?gko=340d6] toward value and frugality. How can retailers cope with all this?
Possibly by reinventing themselves with a distinctive edge. There's much guidance on this subject in the new consumer and retail electronic newsletter [http://www.booz.com/global/home/what_we_do/industries/consumer_products/CR-Foresight/crfs_signup] from our parent firm, Booz & Company.
Each reinvention strategy will be different, but every retail enterprise will need one.
Thank you Art Kleiner
Better assessments, taxonomies, decision support systems and integrated research efforts will enable the field to mature and integrate into mainstream care a new generation of digital tools to assess, enhance, and treat cognition.
Currently, there are no magic pills or general solutions to encourage brain health and flexibility, but there is a toolkit of growing value when used appropriately.
We continue to predict that between now and 2015 brain fitness will become a mainstream concept, hopefully supported by a brain-based remedial framework; that consumers and institutions will have access to much-enhanced digital toolkits and platforms; and that a growing ecosystem will enable this growth.
Unfortunately, all the groundbreaking research and innovation has been occurring without a parallel growth of quality consumer education and professional development.
Cognition remains an elusive concept in popular culture, which limits the ability of consumers and professionals to make informed decisions. This may well be the major bottleneck limiting the field’s potential to deliver real-world benefits and move up the remedial benefits curve.
Unfortunately, only informed demand will ultimately ensure the development of a rational and structured marketplace.
Innovative partnerships will be required to channel the growing amount of interest, research, tools and, yes, controversy, into a better structured and sustainable marketplace.
It is forecast that the worldwide market will range between $2 billion and $8 billion by 2015, depending on how well the important category bottlenecks are addressed.
This presents significant opportunities for innovation, investment, business development and, ultimately, enhanced brain health and fitness of an aging society.
Executive Summary | SharpBrains
A condition which I am reluctant to call a disease until it has been clearly identified as such or better defined.
Terry Pratchett is also determined to write about his experiences with or within the condition, for as long as he can, and as always, he has been very lucid and open about how the condition manifests itself and affects his life.
He is also not alone in raging against the debilitating condition and the difficulty in finding an effective treatment for it or an approach to avoiding the onset of the condition in the first place, if possible.
…it is strange that a disease that attracts so much attention, awe, fear and superstition is so underfunded in treatment and research. We don't know what causes it, and as far as we know the only way to be sure of not developing it is to die young. Regular exercise and eating sensibly are a good idea, but they don't come with any guarantees. There is no cure. Researchers are talking about the possibility of a whole palette of treatments or regimes to help those people with dementia to live active and satisfying lives, with the disease kept in reasonably permanent check in very much the same way as treatments now exist for HIV. Not so much a cure therefore as - we hope - a permanent reprieve. We hope it will come quickly, and be affordable.
When my father was in his terminal year, I discussed death with him. I recall very clearly his relief that the cancer that was taking him was at least allowing him "all his marbles". Dementia in its varied forms is not like cancer. Dad saw the cancer in his pancreas as an invader but Alzheimer's is a slow unwinding of me, losing trust in myself, a butt of my own jokes and on bad days capable of playing hunt the slipper by myself and losing.
Another appropriate quote, taken from Pratchett's Unseen Academicals, has Havelock Vetinari speaking Terry's own words about natural evil.
I have told this to few people, gentlemen, and I suspect never will again, but one day when I was a young boy on holiday in Uberwald I was walking along the bank of a stream when I saw a mother otter with her cubs. A very endearing sight, I'm sure you will agree, and even as I watched, the mother otter dived into the water and came up with a plump salmon, which she subdued and dragged on to a half-submerged log. As she ate it, while of course it was still alive, the body split and I remember to this day the sweet pinkness of its roes as they spilled out, much to the delight of the baby otters who scrambled over themselves to feed on the delicacy. One of nature's wonders, gentlemen: mother and children dining upon mother and children. And that's when I first learned about evil. It is built in to the very nature of the universe. Every world spins in pain. If there is any kind of supreme being, I told myself, it is up to all of us to become his moral superior.
A call to rise above the casual cruelty of nature. Terry Pratchett has been afflicted with a disease which currently has no known cause or cure. It is a condition that is slowly destroying his 'mind' and/or his current view of himself.
Perhaps the greatest fear of all is that we lose the very intimate recollection of who we are, because our perception of 'self' is all we have to identify ourselves as individuals and thus separate ourselves from a collective entity.
Saturday, December 4, 2010
Tom Scott's Ignite London talk "Flash Mob Gone Wrong" is a fictional account of just how badly a flash mob could go.
It's got an eerie ring of plausibility, largely because each of the steps leading up to the disastrous ending actually happened, just not all together. It's a freaky way to spend five minutes.
Saturday, November 27, 2010
It all depends on who the leaders are managing, according to Grant and co-authors Francesca Gino of Harvard Business School and David Hofmann of the University of North Carolina's Kenan-Flagler Business School.
Their paper, forthcoming in the Academy of Management Journal, is titled "Reversing the Extraverted Leadership Advantage: The Role of Employee Proactivity."
Extraverted leadership involves commanding the centre of attention: being outgoing, assertive, bold, talkative and dominant. This offers the advantages of providing a clear authority structure and direction.
However, pairing extraverted leaders with employees who take initiative and speak out can lead to friction, while pairing the same group of employees with an introverted leader can be a pathway to success, the researchers note.
This has implications for leaders and managers at all levels who want to improve their own leadership styles.
"If you look at existing leadership research, extraversion stands out as the most consistent and robust predictor of who becomes a leader and who is rated as an effective leader," Grant says. "But I thought this was simplistic and incomplete. It tells us very little about the situations in which introverted leaders can be more effective than extraverted leaders."
Read more at www.knowledge.wharton
Wednesday, November 24, 2010
RFID tags have made their way into document management, allowing documents to be physically tracked as they move through an organisation.
An interesting way to know just where every document is at any given time and who’s holding it and who's working on it.
The video describes the process and shows the requirement for an RFID reader in the paper tray.
Tuesday, November 23, 2010
It’s claimed the software, Risk Manager from Xactium, will encourage a “collaborative approach to risk management through its organisation-wide visibility and remote accessibility”.
There are hopes the implementation at Business Stream, which provides water and waste services to over 96,000 businesses and the public sector in Scotland, will also promote better risk culture across the organisation, together with improvements to business efficiency, and “demonstrate Business Stream’s risk management commitment to customers and industry regulators”.
Risk Manager is built on Force.com, the Cloud application platform from Cloud Computing vendor, Salesforce.com.
Paula Louchart, IT solutions manager at Business Stream, said “Xactium’s Risk Manager stood out for us because of the speed and ease of implementation.
We were already familiar with Salesforce, so we know that the platform offers impressive business advantages, such as better visibility and effortless reporting”.
“We are delighted that we have been able to provide Business Stream with a risk solution that matches their forward-thinking ethos,” added Andy Evans, managing director of Xactium.
“It is a very positive sign for us that businesses are now recognising the value of cloud-based risk management. The flexibility and customizability it offers hasn’t been available with traditional systems.”
Monday, November 22, 2010
Don't let the picture put you off!
What do you get when you cross sociology, cognitive psychology, neuroscience and an outrageous purple suit? The result is Robin Wight’s insight at Like Minds - a nonstop ride of case studies, quips, and powerful revelations that brings Robin to one inexorable conclusion: the future is indeed social.
Robin Wight is the president of both WCRS and Engine, where he has handled the BMW account for 31 years and been the creative driving force behind their prominence.
Watch Robin’s Insight in HD, along with other videos and speaker bios.
Tuesday, November 16, 2010
Monday, November 15, 2010
Thursday, November 11, 2010
The political landscape in Washington and around the country shifted considerably as a result of the midterm elections, with Republicans taking control of the House, gaining ground in the Senate and claiming several high-profile state offices against incumbent Democrats.
What are the elections' implications for the economy and the stock market, health care reform, the Obama administration's leadership strategy and the future of both parties going forward?
Knowledge@Wharton spoke with Wharton finance professor Jeremy Siegel and insurance and risk management professor Kent Smetters about these and other issues.
Wednesday, November 10, 2010
Monday, November 8, 2010
Give the people you’re working with some credit and believe that they genuinely want to do a good job. When things go wrong, listen to what they are saying and examine all the factors. Take the appropriate actions with a positive attitude and an encouraging tone.
You must nurture your people, but you don't have to baby them. You do have to remember that they are thinking, co-operative human beings who deserve the respect of being treated as such.
If problems persist and you feel that work is actively being avoided or approached halfheartedly, then it may be time to adopt a more assertive approach.
However, this does not mean resorting to belittling comments and ad hominem attacks! It simply means that you shouldn’t live in denial of ineffective workers and must take steps to either push them into action or replace them.
There are many people who may respond much better to a direct and assertive management approach. Sometimes a considerate, forgiving boss simply isn't respected the way a strong and powerful leader is.
The key to good management is being able to take on either style effectively when a situation calls for it. The adaptive manager will always be more effective than either the stern or the understanding manager locked into a fixed approach.
Sir Ken Robinson - Changing Paradigms
Sunday, November 7, 2010
Today we present Jennifer Aaker and Andy Smith’s book The Dragonfly Effect: Quick, Effective, and Powerful Ways To Use Social Media to Drive Social Change.
It’s not only a book about social change, but it’s a book about the presentation and understanding of a formula, showing how to use web tools to build marketing value.
The Dragonfly Effect presents a simple formula that you may want to absorb and apply to your own projects. Check out the review by Chris Brogan, the social media guru.
Wednesday, November 3, 2010
Consider some of the meanings expressed by this short cartoon:
- Complicated plans don’t work. If you can’t understand the plan, then be prepared to die (metaphorically speaking). Far better to break large projects into a program, or portfolio, of smaller ones. If you can’t wrap your mind around the scope of a project, then it’s too big and almost certainly doomed to fail.
- “Spraying energy into the vortex of failure” doesn’t work. Neither wishful thinking nor the vain imaginings of an enthusiastic team are sufficient to solve the complication problem in the previous bullet. Oh yes, if only wishful thinking worked, the world would be a better place.
- Your boss really doesn’t care. Sure, it’s a stereotype, and I beg mercy from all the great managers out there but, the fact remains, the myopia of people not directly connected to solving the problems can be strong. Which means their project is your problem.
At the very least, learn to recognise signs of potential catastrophe well in advance. If you know a problem exists, then there’s at least some hope to fix it before failure strikes.
Social media is a buzz word for all web tools that enable user interaction and networking. These include Twitter, Facebook, blogs and multimedia sharing. A social media strategy, when executed properly, can help companies stay relevant, create brand associations and conversations, interact with customers, and reach previously untapped, prospective customers.
A downside of social media is that it's hard to measure marketing efforts but that doesn't mean you shouldn't join the conversation. Whether you're on social networks or not, consumers are already talking about your brand; it's important to moderate and play an active role in those discussions.
For this reason, social media is making its way into more and more marketing plans. But since it's a relatively new concept, many companies are still experimenting with it. Don't let the unknown factor deter you from diving in though. Even at this early stage, you need to set your company up in the right direction and start participating.
Companies that are already well established may find their different departments, such as IT, digital, and sales, currently compete with one another instead of working together. A good social media strategy will effectively harmonize the disparate departments, increasing the overall productivity of the company.
When used properly, social media can increase positive brand interaction within a community of workers and outside customers through feedback, comments and participation. It also keeps a company’s Internet presence under control by monitoring online mentions of the brand.
Companies, then, are able to keep track of both positive and negative feedback. In addition, social media is viral by nature. When links are shared by the community, traffic increases to a company's website, improving the presence of the brand across social channels.
Beware: a poorly executed strategy can have negative effects. If a company tries to hold the reins on social media too tightly, it'll end up censoring and dictating its online presence, killing all of the fun and creativity the platforms can offer. On the flip side, if a company allows its social media to get out of hand, its brand can become tainted with fallacies and unproductive interactions.
So, how do you create an effective social media strategy? Try following these guidelines to get started:
1. Be realistic. Don’t set impossible goals.
If the company has been around for a long time, catching up on all the social media hoopla might take a while. So do it step by step, little by little. Start with something basic, yet essential, such as turning the company newsletter into a blog. Or, to go even more basic, make sure the company has a Twitter feed and a Facebook page.
Then, move on to bigger and more ambitious things, such as establishing an official social media policy in the company. Message boards and collaborative forums are just a few tools you should keep in mind.
2. All members of the company should embrace the social media strategy and become an active part of it.
Top management needs to be as up to date with what’s going on as the other members of the community. Social media shouldn’t be left exclusively to the PR department or outside consultants. Even the community, audience and consumers need to be active participants. In this way, the company will be able to respond and adapt quickly to needs, concerns and questions.
Also, each person who participates in your company's social media conversation is a representative of it. They are an advocate for your company on the web, and their contributions can reflect heavily on your brand.
3. Look at the long-term picture. A good social media strategy will truly last, even though the nature of social media is ever-changing. Not only does the company need to keep up with these changes, it needs to always be one step ahead.
Monday, November 1, 2010
When applied to disruptive technology adoption by organizations, the “Five Stages” framework provides clear insight into how ready an organization is to embrace change.
Recent conversations with line of business operations managers about Social CRM identify both lack of awareness and high levels of internal resistance towards adoption.
In a recent phone and in-person survey of 31 front office operations owners (i.e. sales executives, support executives, and COOs) about their attitudes on Social CRM, 67.7% (21/31) expressed denial, 16.1% (5/31) felt anger, 9.7% (3/31) experienced bargaining, 3.2% (1/31) encountered depression, and 3.2% (1/31) achieved acceptance (see Figure 1).
Figure 1. Most Front Office Executives Live In Denial About SCRM
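As a quick sanity check on the percentages above, they can be recomputed from the raw respondent counts (21, 5, 3, 1 and 1 out of 31); note that 3/31 actually rounds to 9.7%, and the five shares sum to roughly 100% once rounding is allowed for. A minimal sketch in Python:

```python
# Recompute the Kubler-Ross survey shares from the raw counts
# reported in the post (21, 5, 3, 1, 1 respondents out of 31).
counts = {
    "denial": 21,
    "anger": 5,
    "bargaining": 3,
    "depression": 1,
    "acceptance": 1,
}
total = sum(counts.values())  # 31 respondents

# Percentage share per stage, rounded to one decimal place.
shares = {stage: round(100 * n / total, 1) for stage, n in counts.items()}
for stage, pct in shares.items():
    print(f"{stage}: {pct}%")
```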
The Bottom Line For Buyers (Users) – The Kübler-Ross Model Provides Techniques To Expedite The Internal Acceptance Of SCRM
The five stages of SCRM adoption describe each phase, discuss the typical reactions, and address how to move forward. Organizations can expect stakeholders to progress through the phases at their own pace. Expect organizations to take between 10 and 21 months overall.
However, proponents can accelerate the process through both qualitative understanding and quantitative support. Here are the five stages:
- Stage 1: Denial (Average duration 3 to 6 months). Many executives will put up defenses and excuses when initially broached about the need for SCRM. They may have a point should no quantitative data exist, or may feel as strongly as my fellow Enterprise Advocate, Dennis Howlett, does about SCRM. Typical responses include, “Social CRM is just another XLA fad”; “None of our customers will ever use this stuff, just look at how much we invested in CRM”; “Is there really any return in these social things?”
Approach: Proponents should share adoption trends based on SCRM Use Case #F1 – Social Customer Insights. If the analytical data provides quantitative support on social media trends, then expect both a heightened awareness of the market realities and a rapid progression towards the second phase. In many consumer and classic B2C industries, the data will show that some significant population group is having a conversation in a social channel. The next questions to answer: “Can they be influenced, and will they buy?” If, however, the data shows that customers, for example in classic B2B industries such as aircraft maintenance, repair, and overhaul (MRO), have barely adopted any social tools, then it will make sense to wait until social media adoption has hit critical mass.
- Stage 2: Anger (Average duration 1 to 3 months). As data flows in about where customers are having conversations about an organization's products, the individual recognizes that denial cannot continue. "Outside" conversations happening without the supervision of the firm will most likely enrage the management team. Executives will often ask, "Can these customers really do this without us?"; "Why don't our existing efforts have the same effect?"; "Do we have to deal with another channel?"; "How come we have to waste all this time on SCRM?"; "Whose fault is this?"
Approach: At this point, stakeholders will express their rage at anyone and anything they can. Proponents should let the individuals vent their frustration. From there, help the stakeholders visualize a timetable and project plan to support the SCRM project. Show them how to engage and influence the customer. Let them know they no longer control the conversation.
- Stage 3: Bargaining (Average duration 3 to 6 months). Despite all logical arguments, stakeholders will begin to rationalize the situation. Excuses to postpone taking action balance out the recognition of the urgency to adopt SCRM. Quotes from the survey include, "None of our competitors are doing this; why should we?"; "If we can hold out for a few more years, we'll be okay and the market will be stable"; "Can we just do one part and not the rest?"
Approach: Begin the discussion on the impact inaction could have. Highlight the elements of a successful approach and their dependencies. Lay out the holistic point of view. Encourage outside advisors to provide an independent, alternative point of view.
- Stage 4: Depression (Average duration 2 to 3 months). The realization that a lack of financial and labor resources will hamper adoption forces depression upon front office executives. Stakeholders express thoughts such as "We don't have the funding; this will never go through"; "Social CRM is inevitable, but no one is trained on this stuff"; "By the time we put up the current version, the world will be two generations ahead."
Approach: Walk through the timing of cost and benefits with an emphasis on ROI. Apply good project management discipline to identify resource requirements and milestones. Identify skill gaps among the team. Highlight the road map and project plan again and show where phases could be accelerated. Identify success factors from previous projects.
- Stage 5: Acceptance (Average duration 1 to 3 months). With the facts in hand, a plan in place, and change management in effect, stakeholders may have turned the corner: a realization that customers in social media channels are here to stay, and that this "social thing" is cultural, not a fad. The tenor of hallway conversations shifts from stages 1 and 2 to "It's going to be okay; we can make this work."; "I can't fight the social trend; we may as well prepare for it and win."; "Let's put some money behind this but continue to monitor and test"; "Get that team ready to go."
Approach: Don’t celebrate yet! Put the plan in place. Apply continuous monitoring and testing. Fail fast. Put those learnings back into the next iteration. Coordinate the ecosystem for success.
Saturday, October 30, 2010
Thursday, October 28, 2010
Weeks after the Wall Street Journal blew the whistle on lax data privacy standards at Facebook, a string of class action suits attempts to hold the social networking giant, as well as game company Zynga and Google, liable for what the suits contend are 'lax practices that allow advertisers to harvest personal information on Web users.'
The suits are seeking monetary damages on behalf of potentially millions of users of Facebook, Google and game company Zynga. The suits allege that the users' personal information has been leaked to advertisers and other unauthorized individuals, in violation of the companies' privacy policies and a number of state and federal statutes protecting the confidentiality of electronic communications.
This does put a handy slogan on a view about moral responsibility. On the face of it, the sayings are accurate: while a gun can be used to kill a person, guns are not themselves moral agents.
As such, a gun or any weapon, bears no moral responsibility for any deaths that it might be used to bring about.
Today, we will be looking at applying this argument to the use of hacking programs, in particular one called Firesheep (not to be confused with the user-friendly browser Firefox or the emulator SheepShaver).
Firesheep was written by Eric Butler and brings easy-to-use 'hacking' functionality to the Firefox web browser. The 'add-on' allows users to view information in internet cookies at sites such as Twitter, Facebook, Flickr, Tumblr and Yelp.
Fortunately, Firesheep is limited in what it can do. It can allow a user to capture usernames and session IDs, but it cannot be used to get passwords. In effect, it allows users to view information in, for example, a person's Facebook or Amazon account, but does not let users do anything that would require a password.
It is also limited to hacking on the same network. However, this means that if you are reading this blog over public Wi-Fi, then someone with Firesheep could be reading through your darkest Facebook secrets. It is well suited to the 'man-in-the-middle' interceptions used in cafes and other public sites. So remember that the creepy fellow sitting two tables down may be reading your pages and tweets too.
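To see why an open network makes this possible, consider a hypothetical plaintext HTTP request of the kind Firesheep listens for. This is not Firesheep's actual code; it is a minimal sketch (with an invented host and cookie value) showing that a session cookie sent without SSL is readable by anyone who can see the traffic:

```python
# A hypothetical plaintext HTTP request, exactly as it would cross an open
# Wi-Fi network. No decryption is needed to read the session cookie out of it.
raw_request = (
    "GET /home HTTP/1.1\r\n"
    "Host: www.example.com\r\n"
    "Cookie: session_id=abc123def456; lang=en\r\n"
    "\r\n"
)

def extract_cookies(request: str) -> dict:
    """Pull the Cookie header from a raw HTTP request and split it into pairs."""
    for line in request.split("\r\n"):
        if line.lower().startswith("cookie:"):
            pairs = line.split(":", 1)[1].split(";")
            return dict(p.strip().split("=", 1) for p in pairs)
    return {}

print(extract_cookies(raw_request))
# {'session_id': 'abc123def456', 'lang': 'en'}
```

Because the session ID alone is enough to impersonate the logged-in user for the duration of the session, sniffing it is nearly as good as having the password, which is precisely the hole Firesheep exposes.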
The creator, Eric Butler makes it very clear that he sees himself as a white hat: he is hacking to expose vulnerabilities so that they will be fixed. Interestingly, he does directly address the moral issue at hand: “The attack that Firesheep demonstrates is easy to do using tools that have been available for years. Criminals already knew this, and I reject the notion that something like Firesheep turns otherwise innocent people evil.” (Discuss!)
Firefox's response to the topic of Firesheep and hacking on their browser
On the face of it, Butler may be correct. Firesheep, like other tools, is not some sort of cursed weapon that can possess the mind of potential victims and compel them to do evil, unlike television and other media. Clearly, the same is true of other potentially harmful pieces of technology, such as guns and junk food.
Therefore, Butler and the other folks who make such tools openly available are not directly accountable for what people do with them. They can use the same argument as arms dealers: "I just provide the weapons; the customer does the actual killing."
Clearly, Butler has no malign intent in creating and releasing Firesheep. Rather, he seems to be like Dr. Gatling: hoping (albeit naively) that his creation will do good rather than generate further evil.
There is another, deeper concern: namely, that providing tools that make misdeeds easier makes a person accountable to a degree. While the person who invents or distributes such tools or weapons does not make people evil or make them commit misdeeds, the person does make such misdeeds easier.
Check out what Network World are saying about the Firefox and Firesheep threat
Therefore, the person providing the tool does play an indirect causal role in the misdeeds, especially if the tool or weapon serves as a "but for" cause, e.g. if someone would have been unable to track down the whereabouts of, and start stalking, their ex-girlfriend without using Firesheep. The assumption is that the ex would not have been stalked but for the intervention of Firesheep. Therefore, making misdeeds easier does appear to bring with it a degree of moral accountability.
Butler answers this sort of criticism by stating that other tools already exist to do just what Firesheep does. Firesheep is just a better known and easier to use tool. So, to use an analogy, Butler is not inventing the gun, he is merely making the gun easier to use.
“Firesheep doesn’t hack. People hack with Firesheep.” You decide!
The suspected virus affects mostly children and older people, who suffer from a high fever, vomiting and headaches before succumbing, officials said Thursday.
"We are not able to identify the virus that is causing the deaths. It could be a mutant form of dengue or malaria, but we are not sure," said S.P. Ram, the state's top medical official. "Microbiologists are trying to pinpoint the exact cause."
In the state capital, Lucknow, about 340 people have been sickened and at least 51 have died, said Manish Mishra, a government spokesman.
Blood samples have been sent to the National Institute of Communicable Diseases in New Delhi to identify the disease, Mishra said.
Health authorities blamed unhygienic conditions for the spread of the disease, which has particularly hit Lucknow's Khadra neighborhood.
"We cannot give the exact reason for the deaths, but it could be due to unhygienic living conditions in Khadra," said A.K. Shukla, Lucknow's chief medical officer.
Heaps of garbage, open drains filled with fetid water and clogged sewers mark the entrance to Khadra, home to around 250,000 people. The community tap, located next to an open drain, supplies darkish brown water, which people use for drinking and cooking.
"We are living in hell. We drink muddy water and live in a neighbourhood full of filth and dirt," said Kamla Maheshwari, a housewife, as she waited for her turn at the community tap.
Monday, October 25, 2010
Communication between Wi-Fi devices isn’t specifically new. The Nintendo DS, for instance, has had device-to-device Wi-Fi interaction for some time, but the technology is proprietary.
The Wi-Fi Alliance differentiates Wi-Fi Direct by certifying the standard, ensuring interoperability. Devices stamped with the Wi-Fi Direct certification don’t need wireless networks, as they essentially become micro-hotspots.
This technology will conceivably allow devices like an Eye-Fi memory card to directly beam an image to a wireless printer. Since Wi-Fi Direct is largely software based, many recent devices should be upgradeable.
Speeds for Wi-Fi Direct are based on 802.11b/g/n channels, so we're looking at inter-device throughput at rates upward of 300Mbps. Range will also be a major selling point, and it's reasonable to expect that future Wi-Fi Direct devices will eventually achieve distances similar to our home wireless networks.
Bluetooth will undoubtedly be the first technology to suffer as a result of Wi-Fi Direct. Although Bluetooth is aimed, almost universally, at close connections like headsets, it will be hard to trump the speed of Wi-Fi Direct. Additionally, Wi-Fi Direct uses the same radios as other Wi-Fi functions, so device manufacturers will likely be quick to cut redundant technologies.
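The practical gap is easy to quantify. Here is a rough back-of-the-envelope comparison, using nominal (assumed, best-case) link rates of roughly 3Mbps for Bluetooth 2.1 and 300Mbps for an 802.11n Wi-Fi Direct link:

```python
# Time to move a 5 MB photo over Bluetooth 2.1 (~3 Mbit/s nominal) versus an
# 802.11n Wi-Fi Direct link (~300 Mbit/s nominal). Nominal rates are assumed
# best cases; real-world throughput is well below these for both.
def transfer_seconds(size_mb: float, rate_mbps: float) -> float:
    bits = size_mb * 8 * 1_000_000        # megabytes -> bits
    return bits / (rate_mbps * 1_000_000)  # bits / (bits per second)

photo_mb = 5
print(f"Bluetooth 2.1: {transfer_seconds(photo_mb, 3):.1f} s")   # 13.3 s
print(f"Wi-Fi Direct:  {transfer_seconds(photo_mb, 300):.2f} s")  # 0.13 s
```

Real-world throughput is lower for both technologies, but the two-orders-of-magnitude gap is the point.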
Here’s a quick animation that illustrates the functionality of Wi-Fi Direct:
How To Improve Memory And Concentration
If you're looking to improve your memory and concentration, the first thing is, you have to make sure that what you're trying to remember is important to you.
A colleague once said to his wife, in an unwise moment, 'I have real problem remembering people's names'. She said, 'When they start to matter to you, you'll remember their names.' So you've got to make sure that what you're memorising has significance to you.
For years observers have predicted a coming wave of e-textbooks. But so far it just hasn't happened. One explanation for the delay is that while music fans were eager to try a new, more portable form of entertainment, students tend to be more conservative when choosing required materials for their studies. For a real disruption in the textbook market, students may have to be forced to change.
That's exactly what some companies and college leaders are now proposing. They're saying that e-textbooks should be required reading and that colleges should be the ones charging for them. It is the best way to control skyrocketing costs and may actually save the textbook industry from digital piracy, they claim. Major players like the McGraw-Hill Companies, Pearson, and John Wiley & Sons are getting involved.
To understand what a radical shift that would be, think about the current textbook model. Every professor expects students to have ready access to required texts, but technically, purchasing them is optional. So over the years students have improvised a range of ways to dodge buying a new copy—picking up a used textbook, borrowing a copy from the library, sharing with a roommate, renting one, downloading an illegal version, or simply going without. Publishers collect a fee only when students buy new books, giving the companies a financial impetus to crank out updated editions whether the content needs refreshing or not.
The new plan: Colleges require students to pay a course-materials fee, which would be used to buy e-books for all of them (whatever text the professor recommends, just as in the old model).
Information technology departments have often been accused of slowing down change or innovation, since systems can take time to adapt to new processes. However, a new survey reveals CEOs view their IT departments as the best thing they have going when it comes to innovation.
These are among the findings of Olympus Corporation of the Americas' recently released Harris Interactive survey of the attitudes of 304 Fortune 1000 executives toward enterprise innovation. The study had some other interesting findings as well. For example, most CEOs want an innovation culture as a way to attract and retain employees, and most say there's too much short-term thinking to focus on innovation. An executive summary of the survey is available here.
IT is viewed as having been the most innovative function within executives’ own companies during the past 10 years (44 percent), and by far the most likely focal point for investment (60 percent) and continued innovation (63 percent) over the next two years.
Many of the innovations that companies are depending on to compete in a hyper-competitive global economy — analytics, e-business, automation, and mobile to name a few — are all about IT.
The survey finds that executives see a culture of innovation as crucial to not only growing their businesses (95 percent) and profitability (94 percent), but also for attracting and keeping talent (86 percent). However, more than half of executives (53 percent) say their company does not focus enough on enterprise innovation, citing the following obstacles to innovation:
- Pressure to meet short-term goals and achieve quick results (64 percent);
- Other business goals or objectives taking priority (61 percent);
- Lack of incentives to inspire or reward enterprise innovation (36 percent);
- Lack of systems or tools for fostering enterprise innovation (31 percent);
- Insufficient resources to enable high-quality human capital to focus on innovation (29 percent); and,
- Lack of support from senior leadership (19 percent).
While IT is seen as the main proponent of innovation, executives and managers in this area of the business may also be stymied by short-term priorities. IT departments are often so busy fighting fires and trying to keep the lights on with overstretched staffs that long-term efforts end up on back burners. Here's where a close partnership with the business side can make a difference, and keep exciting new innovations at the top of the priority list.
Sunday, October 24, 2010
Unlogo is a web service that eliminates logos and other corporate signage from videos.
On a practical level, it takes back your personal media from the corporations and advertisers.
On a technical level, it is a really cool combination of some brand new OpenCV and FFMPEG functionality.
On a poetic level, it is a tool for focusing on what is important in the record of your life rather than the ubiquitous messages that advertisers want you to focus on.
For more information visit the Unlogo website here
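Unlogo's actual pipeline is built on OpenCV and FFMPEG. As a rough illustration of the underlying idea, here is a minimal, hypothetical sketch that locates a known logo patch in a frame by naive sum-of-squared-differences matching; OpenCV's feature-based detection is far more robust to scale, rotation and lighting, but the goal is the same:

```python
import numpy as np

# Toy sketch of logo localisation: slide a known "logo" patch over a frame and
# keep the position with the smallest sum of squared differences (SSD).
def find_patch(frame: np.ndarray, patch: np.ndarray) -> tuple:
    """Return the (row, col) offset of the best match of `patch` in `frame`."""
    ph, pw = patch.shape
    fh, fw = frame.shape
    best, best_pos = float("inf"), (0, 0)
    for r in range(fh - ph + 1):
        for c in range(fw - pw + 1):
            ssd = np.sum((frame[r:r + ph, c:c + pw] - patch) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# Plant a distinctive patch in a synthetic frame and recover its position.
rng = np.random.default_rng(0)
frame = rng.random((40, 40))
logo = frame[10:15, 20:25].copy()
print(find_patch(frame, logo))  # -> (10, 20)
```

Once the region is located in each frame, a tool like Unlogo can blur or overwrite it before FFMPEG re-encodes the video.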
Packed full of good economic sense, HR insight, philosophical wisdom and unique problem-solving strategies.
Saturday, October 23, 2010
However, creativity comes in much simpler forms such as formulating a solution to an everyday problem; if someone runs out of fuel on the highway, the person must think of a way to get to his/her destination, and this requires creativity even if it is in its simplest form.
Creativity can be observed in the unusual as well. For instance, Craig Wallace, now a college freshman, developed a nuclear fusion reactor out of junkyard parts and cheap finds. Creativity is not just the writings of Descartes or the oil paintings of Klimt, so what is it?
What Is Creativity?
After exhaustive research, Morgan (1953) identified the universal factor in creativity to be novelty (Cropley, 1999). Novelty requires originality and newness: there must be something fresh to the idea.
Sternberg and Lubert (1995) proposed that novelty must be coupled with appropriateness for something to be considered creative. Novelty can be the coalescence of any two or more different things or thoughts. For instance,
Damien Hirst is a controversial artist who has sliced animals into fragments, but many people do not consider this creative even though it is novel and original. Many people do not recognize the factor of appropriateness in his work and consider it to be feckless.
Although creativity can be seen in the products, it can also be considered in terms of the process. Weisberg (1986) proposes that creativity can be defined by the novel use of tools to solve problems or novel problem solving. Dr. Gunther von Hagens has in the past few years started exhibiting the dissected and transfigured bodies of people.
Professor von Hagens is a medical professor at the University of Heidelberg who perfected plastination, the injection of plastic into bodily tissue. This is a novel use of tools to solve the problem of decay and distortion from older methods of preserving human tissue. The end product is creative because of the creative use of tools.
Ward, Finke, and Smith (1995) defined creativity in the products made, the differences in people, the pressures that motivate, and the processes behind creativity. The products made are new and fresh which is the clearest example of creativity.
However, there are defining subtleties in people; for example, some people are considered to be more creative than others, and in addition to inherent differences in people, there are different motivations for creativity (e.g., some people are driven to create).
Finally, the process for creativity can be different. Some people seclude themselves while others seek guidance and dialogue.
While there is debate over the guidelines for judging creativity, two things remain: novelty and appropriateness. These two things may be viewed in the product, the tools, the people, the motivation, and/or the processes, but these are the two necessary ingredients.
Conclusion
Once considered the result of insanity or divine intervention, the mystery behind creativity is now slowly being revealed. There has been much debate over what exactly creativity is, and it is now believed to be characterised by novel and appropriate ideas, products, and/or use of tools.
It was once thought that creativity was caused by psychoticism, but it is now considered to be a series of cognitions following some sort of process. The process is not precisely known, but there are thoughtful speculations which remove the mystery from creativity and the stigma that it is possessed only by geniuses.
With all this new information comes a great deal of application. AI is now considered more lifelike if it possesses creativity, and theories are quickly being developed as to how to program creativity.
Education is attempting to encompass creativity in addition to the acquisition of hard facts and other skills, and business is noticing the importance of creativity in furthering growth of individual companies and departments.
To read more, visit: A Brief Review of Creativity
Friday, October 22, 2010
Wednesday, October 20, 2010
Tuesday, October 19, 2010
Granovetter's paper was later popularised by the international bestselling book, The Tipping Point: How Little Things Can Make a Big Difference by the esteemed Malcolm Gladwell.
In his book, Gladwell teaches us how Paul Revere and this "weak-tie" phenomenon contributed to the success of The American Revolution.
Paul Revere had a broad network, a fast communication system (a horse), and a catchy phrase of far fewer than one hundred and forty characters: "The British are coming!"
In "Small Change," Mr. Gladwell admits that social media activism is "a wonderful thing" empowering citizens with "marvelous efficiency."
The American Revolution and Civil Rights Movement were not tweeted, but to suggest that emerging tools like Twitter have no part to play in the future of meaningful change is absurd.
Little things can make a big difference and social networks are the carriers of change.
"Viva la revolución."
"Small Change" dismisses leaderless, self-organising systems as viable agents of change. A flock of birds flying around an object has no leader, yet this beautiful, seemingly choreographed movement is the very embodiment of change.
Rudimentary communication among individuals in real time allows many to move together as one, suddenly uniting everyone in a common goal. Lowering the barrier to activism doesn't weaken humanity; it brings us together and makes us stronger.
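The flocking image can be made concrete. The sketch below is an illustrative toy (all rule weights are invented parameters, not measured values), but it shows how three purely local rules, with no designated leader, produce coordinated group motion:

```python
import numpy as np

def step(pos, vel, dt=0.1):
    """One update of a leaderless flock: each bird reacts only to the others."""
    center = pos.mean(axis=0)               # cohesion: drift toward the group
    avg_vel = vel.mean(axis=0)              # alignment: match the average heading
    vel = vel + 0.05 * (center - pos) + 0.05 * (avg_vel - vel)
    for i in range(len(pos)):               # separation: don't crowd neighbours
        d = pos[i] - pos
        dist = np.linalg.norm(d, axis=1)
        near = (dist > 0) & (dist < 0.5)
        if near.any():
            vel[i] = vel[i] + 0.1 * d[near].sum(axis=0)
    return pos + vel * dt, vel

rng = np.random.default_rng(1)
pos = rng.random((30, 2)) * 10              # 30 birds scattered over a 10x10 area
vel = rng.standard_normal((30, 2))          # random initial headings
for _ in range(100):
    pos, vel = step(pos, vel)
print("flock centre:", np.round(pos.mean(axis=0), 2))
```

No bird in this simulation knows the state of the whole flock, yet the group stays together and moves as one unit, which is exactly the point about self-organising systems.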
Given any set of circumstances (A) and the laws of nature (L), then (on the assumption that the laws of nature are, in this universe at least, inviolable) A plus L will inevitably lead to their consequent B.
It goes without saying that A will itself be the consequence of a set of antecedent circumstances in conjunction with L.
Determinism has been taken by many philosophers to be incompatible with Free Will on the grounds that our actions are the product of “choices” both of which are part of the natural world and are therefore subject to L.
Choices are also “events” and are therefore the inevitable consequence of some set of antecedent circumstances acted upon by L; as are the expression of those choices in action.
Determinism may or may not be true but if it is true then there is no room (so the incompatibilist argues) for free will.
Free will is an illusion: occasionally comforting, occasionally not.
Compatibilism, on the other hand, is the belief that free will and determinism are compatible ideas, and that it is possible to hold both without being logically inconsistent. Its proponents argue that if we allow that our choices are uncaused (call this indeterminism), then this makes them random and therefore not choices at all: the very concept of free will seems inimical to randomness.
There must, therefore, be an account of free will that rescues it from determinism. Other philosophers (most notably Peter van Inwagen) have suggested that it might instead be the case that the concept of free will is incoherent since it seems inconsistent with all logically available positions regarding the truth or otherwise of determinism.
At best free will is mysterious on this view.
The paradox of the human condition is that we are at one and the same time objects in a world of other objects, governed by the same physical laws as those objects, and simultaneously freely choosing subjects with an apparent perspective on that world of objects, from which it follows that we stand apart from it.
Freedom, again, is mysterious on this view and to set up free will in competition with determinism is misconceived.
We are already well aware that there are professional and national gangs of cyberthieves crafting malicious code that can target and steal bank passwords and other sensitive corporate data online.
However, that’s not the subject under discussion today; it’s corporate carelessness, incompetence and stupidity, combined with very mobile devices whose capabilities are drawing corporate IT players out of their comfort zones.
Three out of five workers believe they do not need to be in the office to be productive, according to Cisco. They feel so strongly about being mobile that they'd sacrifice (a small) part of their valuable salary to retain this freedom and flexibility, even though it generally means putting in longer hours.
This is especially true outside of the Westernised world: India, China, Brazil and Spain.
Two-thirds of workers make demands on IT resources and services to enable them to use any device they believe suitable and appropriate, whether personal or corporate, to access the company network, regardless of date, time or location.
Nearly half of the IT people surveyed said they were not ready to allow or enable this, citing security as their biggest concern. Constrained budgets and limited staff skills are also deterrents. However, users continue to press IT services for this capability, and if it is not forthcoming they see IT as the obstacle.
You know that IT departments have good reasons to be concerned:
- 1 in 5 workers said they’ve noticed strangers looking at their computer screens in public – and another 1 in 5 said they don’t bother to check who’s looking at their screens.
- About 1 in 5 workers have left their computing devices unattended in public.
- Nearly 3 in 5 workers lend their devices to people they don’t work with — and then don’t supervise them.
- As for the IT people, 1 in 4 said a quarter of the devices they've issued to employees in the last 12 months have already been lost or stolen.
- And we are well aware of the Man-in-the-Middle tactics of public access WiFi sites and Hotspots in cafes and libraries.
This survey was actually two surveys – one of employees, the other of IT professionals, 2600 people in all – in 13 countries: the U.S., Mexico, Brazil, the U.K., France, Spain, Germany, Italy, Russia, India, China, Japan and Australia. Cisco sponsored the survey, but it was conducted by a third party.
Here’s a parting remark from the chief technologist (and futurist) of Cisco’s Internet Business Solutions Group — Dave Evans: “Work is not a place anymore. It’s a lifestyle…”
A LUCID dream has three phases. First you experience the dream as reality. Then you recognise it as a product of your mind. Finally, you gain the power of control.
Morality is proceeding along similar lines. We have long thought of moral laws as fixed points of reality, self-evident truths rooted in divine command or in some Platonic realm of absolute rights and wrongs. However, new research is offering an alternative, explaining moral attitudes in the context of evolution, culture and the neural architecture of our brains.
This apparent reduction of morality to a scientific specimen can seem threatening, but it need not be. By unmasking our minds as the authors of our morality, we may be better able to bend the narrative arc towards a happy (or happier) ending.
One way to do this is to recognise the ways in which evolution has shaped morality. Social psychologist Jonathan Haidt asked students at the University of Virginia in Charlottesville to imagine a brother and sister engaging in secret, consensual, protected sex. Would that be wrong, he asked? Most thought so.
However, when asked why, the students floundered. Protection meant no threat of disabled children, and secrecy brought no possibility of disclosure or embarrassment (in the short term). The pair felt no guilt or regret because everything was consensual and by mutual agreement. So how is it wrong?
Perhaps incest is simply an arbitrary taboo, passed on through religion, law, parents and peers. Or is it a societal taboo, instilled in less enlightened times to limit the genetically weakening effects of inbreeding?
Debra Lieberman, an evolutionary psychologist at the University of Miami in Florida, tested these rival hypotheses with an ingenious experiment (Proceedings of the Royal Society of London B, vol 270, p 819). She considered the ways in which evolution could have built in a "sibling detector".
For older siblings, it is easy: just watch who your mother gives birth to and who she raises as her own. For younger siblings a more subtle strategy is needed: note how many years you live in the same household as other children.
Lieberman asked over 1000 people how much the thought of incest disgusted them, and the results were clear as day: older siblings were uniformly disgusted by the thought, while younger siblings' disgust was a linear function of years of co-habitation.
Then Lieberman showed that unrelated children reared together in Israeli kibbutzim develop sexual aversions according to the same factors, even though there is no cultural taboo against relationships between them.
Finally, she showed that people's moral outrage when contemplating others engaging in incest was predicted by the level of aversion they would feel towards intercourse with their own siblings, again based on those two factors. In short, it seems that the moral injunction against incest is a product of a specifically evolved mechanism to prevent sibling sex.
Theories about the biological evolution of morality have been around for some time, but a very recent area of research is into the cultural evolution of morality. Just as we inherit genes from our parents, we inherit values from cultural sources, and just as genes adapt to environments, values evolve to match the structure of social life.