During the keynote of SMX Advanced, Matt Cutts confirmed that Google plans to release another tweak to its Panda algorithm that deals specifically with content that is scraped and reposted on other sites. One of the most common problems in search over the last few years has been sites reposting content and outranking the original source, and after the Panda update failed to solve the problem, complaints grew even more vocal.
Google has approved the update as part of its quality control process but has not yet made it live. I'm not sure which components are targeted in this change, but if you post good original content, make sure to share, promote, and ping it as soon as it's live so that you have signals showing you are the content originator. It's easy to manipulate post dates in WordPress and other CMS platforms, but manipulating third-party signals is much more difficult. I would also suggest looking at the new rel=author markup as a tool to claim ownership of your content.
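As a rough sketch of what that markup looks like: Google's authorship scheme has an article byline link to an author page on the same site using rel="author", and the author page in turn links out to the author's profiles elsewhere using rel="me". The names and URLs below are hypothetical placeholders.

```html
<!-- In the article byline: point to the author's page on the same site -->
<a href="/about/jane-doe" rel="author">Jane Doe</a>

<!-- On the author page (/about/jane-doe): link out to profiles the
     author controls, so the identity can be verified across sites -->
<a href="http://twitter.com/janedoe" rel="me">Jane Doe on Twitter</a>
```

Check Google's own documentation for the current details before implementing, as the feature is new and the requirements may change.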
In other Panda news, an informal survey by Search Engine Roundtable found that 85 percent of webmasters affected by Panda had not seen any recovery toward pre-Panda traffic levels. During the SMX keynote, Matt Cutts stated there have been no manual exceptions for sites that were wrongly affected, but that sites that have returned to their pre-Panda traffic levels likely benefited from a recalculation.
If your site has seen a sharp drop in traffic following the Panda update, we may be able to help. We provide SEO consulting services and will be happy to review your site, identify any problems that may be causing you to be penalized, and develop a plan to fix them.