Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them; it does not prevent your server from delivering those pages to any browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, with no title or snippet) if there happen to be links to those URLs somewhere on the Internet (for example, in referrer logs). Also, non-compliant or rogue crawlers that don't honor the Robots Exclusion Standard may simply ignore your robots.txt. Finally, a curious user could examine the directories and subdirectories listed in your robots.txt file and guess the URLs of the content you don't want seen.
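The gap between "a compliant crawler skips this path" and "the path is actually protected" can be seen with a short sketch using Python's standard-library `urllib.robotparser`. The domain and paths below are hypothetical, chosen only for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for example.com. Note that the "secret"
# directory names are spelled out in plain text for anyone to read.
robots_txt = """\
User-agent: *
Disallow: /private-reports/
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved crawler consults the rules and skips the path...
allowed = rp.can_fetch("*", "https://example.com/private-reports/q3.html")
print(allowed)  # False — a compliant crawler will not fetch this URL

# ...but robots.txt itself is publicly served at /robots.txt, so a
# person (or a rogue crawler) can read the disallowed paths above and
# request them directly; nothing here stops the server from responding.
```

In other words, a Disallow rule is a request, not an access control. Content that must stay private needs server-side protection such as authentication, not an entry in robots.txt.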
You can also re-edit your video footage. If your view-through rates are low, your viewers are losing interest quickly. Try creating a shorter cut of your video that'll be more engaging to your audience, or add graphics to liven up the content. While you don't want to replace your original video entirely, creating different versions of it may bring better results.