The best graphics cards aren't just for gaming, especially not when AI-based algorithms are all the rage. Besides ChatGPT, Bard, and Bing Chat (aka Sydney), which all run on data center hardware, you can run your own local version of Stable Diffusion, Text Generation, and various other tools, like OpenAI's Whisper.

The last one is our subject today, and it can provide substantially faster than real-time transcription of audio via your GPU, with the entire process running locally for free. You can also run it on your CPU, though the speed drops precipitously.

Note also that Whisper can be used in real time to do speech recognition, similar to what you can get through Windows or Dragon NaturallySpeaking. We did not attempt to use it in that fashion, as we were more interested in checking performance. Real-time speech recognition only needs to keep up with maybe 100–150 words per minute (a bit more if someone is a fast talker). We wanted to let the various GPUs stretch their legs a bit and show just how fast they can go.

There are a few options for running Whisper, on Windows or otherwise. Of course there's the OpenAI GitHub (instructions and details below). There's also the Const-Me project, WhisperDesktop, a Windows executable written in C++. It uses DirectCompute rather than PyTorch, which means it will run on any DirectX 11 compatible GPU - yes, including things like Intel integrated graphics. It also means it's not using special hardware like Nvidia's Tensor cores or Intel's XMX cores.

Getting WhisperDesktop running proved very easy, assuming you're willing to download and run someone's unsigned executable. (I was, though you can also try to compile the code yourself if you want.) Just grab WhisperDesktop.zip and extract it somewhere. Besides the EXE and DLL, you'll need one or more of the OpenAI models, which you can grab via the links in the application window. You'll need the GGML versions - we used ggml-medium.en.bin (1.42GiB) and ggml-large.bin (2.88GiB) for our testing.
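If you'd rather fetch the GGML model files directly instead of clicking the in-app links, they can be pulled from ggerganov's whisper.cpp model repository on Hugging Face. Here's a minimal sketch; note that the repository path and file-naming scheme are assumptions based on that project, not something WhisperDesktop itself documents:

```python
from urllib.request import urlretrieve

# Assumed Hugging Face repo hosting GGML conversions of the Whisper models.
BASE = "https://huggingface.co/ggerganov/whisper.cpp/resolve/main/"

def ggml_url(model_name: str) -> str:
    """Build the download URL for a GGML model, e.g. 'medium.en' or 'large'."""
    return f"{BASE}ggml-{model_name}.bin"

# Uncomment to download (large files: medium.en is ~1.42GiB, large is ~2.88GiB):
# urlretrieve(ggml_url("medium.en"), "ggml-medium.en.bin")
```

Whichever route you take, drop the .bin files somewhere WhisperDesktop can find them and point the model path at the one you want to use.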
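To put "faster than real-time" in numbers: the usual metric is the real-time factor, the ratio of audio length to processing time. A quick illustrative calculation (the timings below are made-up placeholders, not our benchmark results):

```python
def real_time_factor(audio_seconds: float, processing_seconds: float) -> float:
    """How many seconds of audio get transcribed per second of wall-clock time."""
    return audio_seconds / processing_seconds

# Hypothetical example: a 60-second clip transcribed in 5 seconds
# runs at 12x real time.
print(real_time_factor(60.0, 5.0))  # 12.0
```

Anything above 1.0 keeps up with live speech; the GPUs we tested land well beyond that.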