With increasing global access to technology, there has never been more content in the world. Recent research estimates that more than 2.5 quintillion bytes of data are created every day.

In the past, MT was primarily a culling tool used to get the "gist" of meaning from a large volume of documents, typically as part of the discovery process in large cross-border litigation. Relevant documents would then be professionally translated "from scratch" by humans. Until recently, MT offered little value in a corporate environment, where a single English source document is often translated into upwards of 40 languages for product documentation, training, or sales and marketing collateral. Now, with the improved output from "smart machines," we are starting to see MT used for the initial translation, with human professionals taking on the role of "post-MT" editors.

These developments portend big changes both for companies that buy translation services and for language service providers (LSPs). For the former, they offer the potential to translate a higher volume of content faster and for less money. For LSPs, they mean transforming a database of professional translators into professional "post-editors," which requires a much different skill set.

The white paper also explores the role of MT engines from industry giants like Google and Microsoft. In what circumstances is it acceptable to use a free, public translation engine? What is the impact on the protection of confidential information? What are the alternatives to Google Translate?

About the authors: the white paper was produced by PTC member Trustpoint.One. Based downtown in Three Gateway Center, Trustpoint.One offers human and machine translation solutions to leading corporations and law firms in Pittsburgh and across the country.

Listen to Trustpoint.One discuss this on TechVibe Radio here.