The original email to customers, along with clarifications posted to Evernote's discussion forums and FAQ pages, indicated that customers would be able to opt out of the new automated technologies, but that they would "not be able to fully take advantage of future advanced features," without specifying what those features would be.
It didn't take long for customers to respond to Evernote's proposed changes, some with hesitation and questions, others with declarations of their intent to leave the service once and for all. This prompted Evernote's CEO, Chris O'Neill, to post a clarifying letter indicating that any personal information surfaced by the machine learning service would be "masked" before employees could see it, and that the employees entrusted with this task would be personally vetted by O'Neill. These statements did little to quell customers' concerns about privacy and encryption.
On December 19th, O'Neill posted an "Action Plan for Privacy," reiterating that Evernote would withdraw the previously announced changes and adding that Evernote would do three things to meet, if not exceed, customer expectations:
Heighten already strict controls on employee access across the company, a process O'Neill would manage himself starting on the 19th;
Seek further data and privacy guidance from top experts around the world, including John Verdi of the Future of Privacy Forum (FPF) and Emily Hancock, a leading privacy expert who is also Evernote's Vice President of Legal and co-chair of the Enterprise Cloud Privacy Group (ECPG), to help create a new policy in early 2017;
Establish a new Evernote Customer and Community Advisory Board that will meet quarterly, starting in February in San Francisco.
What's Not Changing
As indicated above, not a whole lot is changing yet. Evernote may still look for a way to allay people's privacy and security concerns while implementing machine learning and other automated technologies, but doing so would require not only making such services opt-in (rather than opt-out) but also giving people reason to believe that further personalization (or access to "advanced features") is truly worth the risk of exposing their data to the employees O'Neill assures us he has vetted.
What You Should Do