'videodigest' is not recognized as an internal or external command - Printable Version

+- Python Forum (https://python-forum.io)
+-- Forum: Python Coding (https://python-forum.io/forum-7.html)
+--- Forum: General Coding Help (https://python-forum.io/forum-8.html)
+--- Thread: 'videodigest' is not recognized as an internal or external command (/thread-13308.html)
'videodigest' is not recognized as an internal or external command - MM2018 - Oct-09-2018

https://github.com/agermanidis/videodigest installed successfully, and I also added all the paths to the environment variables, but I get this error:

C:\Python37\Scripts>videodigest
'videodigest' is not recognized as an internal or external command,
operable program or batch file.

Kindly help me!

RE: 'videodigest' is not recognized as an internal or external command - snippsat - Oct-09-2018

It's not written for Windows, and it will only work with Python 2. In setup.py:

scripts=['videodigest'],

This places a videodigest script (with no extension) in the Scripts folder, which works on Linux. On Windows you can try renaming it in the Scripts folder to videodigest.py and running it from the command line (it must be Python 2). But I think there will be other problems, like the codec used:

codec="libx264",

You can use a Zeranoe FFmpeg build for Windows and link to it, but now we are into difficult stuff. The easiest option is to use VirtualBox with a Linux distro and run it with Python 2.

RE: 'videodigest' is not recognized as an internal or external command - MM2018 - Oct-12-2018

yl@yl-VirtualBox:~$ videodigest -i /media/sf_I/xm/xm.mp4 -s /media/sf_I/xm/xm.srt -L English -o rip.mp4
Traceback (most recent call last):
  File "/home/yl/.local/bin/videodigest", line 108, in <module>
    language=args.language)
  File "/home/yl/.local/bin/videodigest", line 75, in find_summary_regions
    summary = summarize(srt_file, summarizer, n_sentences, language)
  File "/home/yl/.local/bin/videodigest", line 60, in summarize
    parser = PlaintextParser.from_string(srt_to_doc(srt_file), Tokenizer(language))
  File "/home/yl/.local/lib/python2.7/site-packages/sumy/nlp/tokenizers.py", line 67, in __init__
    self._sentence_tokenizer = self._get_sentence_tokenizer(tokenizer_language)
  File "/home/yl/.local/lib/python2.7/site-packages/sumy/nlp/tokenizers.py", line 82, in _get_sentence_tokenizer
    "NLTK tokenizers are missing. Download them by following command: "
LookupError: NLTK tokenizers are missing.
Download them by the following command: python -c "import nltk; nltk.download('punkt')"