About 51 results
  1. Highest resolution image I've taken. You'd need 16 8k TVs to see the whole thing

    Feb 5, 2023 · Highest resolution image I've taken. You'd need 16 8k TVs to see the whole thing. 32k (30720 x 17280)
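
    The numbers check out: a standard 8K UHD panel is 7680 x 4320, so a 4 x 4 grid of panels tiles 30720 x 17280 exactly. A quick sanity check in Python (the panel resolution is the standard 8K figure, not stated in the thread):

        # How many 8K UHD panels (7680 x 4320) tile a 30720 x 17280 image?
        image_w, image_h = 30720, 17280
        panel_w, panel_h = 7680, 4320  # standard 8K UHD; assumed
        cols, rows = image_w // panel_w, image_h // panel_h
        print(cols, rows, cols * rows)  # 4 4 16 -- the "16 8k TVs"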

  2. Level 32k Enchants : r/Minecraft - Reddit

    Oct 28, 2020 · I know to get a 32k enchant on something it is like /give @p diamond_sword{Unbreakable:1,Enchantments:[{id:"minecraft:sharpness",lvl:32767}]} but does anyone know how to put …

  3. 32K Video Possible In 2020/21? Filming and Editing?

    Nov 20, 2020 · With the new releases of Blackmagic products we can make 32K HDR VIDEO!! How? With the release of DaVinci Resolve 17, the studio version supports up to 32K and 120 …

  4. How to make a 32k in 1.20.2 java? : r/Minecraft - Reddit

    I want to make a 32k in my friend's server to give out as a prize to whoever wins a series of games. Yes, I have op. Edit: by '32k' I mean over-enchanted items like swords and stuff, where …

  5. Training long context (32K+) 70B Llama : r/LocalLLaMA - Reddit

    32K (my goal): Needs 224GB on 4xA100 for a single batch (rank 8). Some day, perhaps I will get more A6000s to do a single batch at home (5xA6000 or 11x3090/4090 should work in theory, …

  6. solo cox with 32k points avg, what's the chance of seeing any

    The wiki has everything. So you have around a 4%-5% chance of a drop that raid, or around a 1/30 to 1/25 drop rate depending on raid and points. So realistically points don’t matter that much. It’s more about …
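
    For the question in the title ("what's the chance of seeing any"), the usual model is at least one success in n independent raids at per-raid rate p, i.e. 1 - (1 - p)^n. A sketch assuming the snippet's ~1/25 figure (the rate and independence are both assumptions):

        # Chance of at least one unique over n raids at per-raid rate p
        p = 1 / 25  # the snippet's ~4% figure at 32k points; assumed constant
        for n in (1, 10, 50, 100):
            print(n, round(1 - (1 - p) ** n, 3))  # 100 raids -> ~0.983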

  7. GPT-4 Will Probably Have 32K Tokens Context Length - Reddit

    Feb 22, 2023 · I remember when 32K was a huuuge amount of RAM for a computer. Now, for the first time in their lives, many younger people are going to feel something similar. …

  8. GPT-4 has 32,000 token limit or 64,000 words and still this ... - Reddit

    Mar 15, 2023 · A token is roughly half a word, so 32K tokens is more like 16K words. And standard GPT-4 (which ChatGPT is presumably using for efficiency reasons) has an 8,000-token limit.
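
    The title and the snippet apply the token-to-word ratio in opposite directions: two words per token gives the title's 64,000, while the snippet's half a word per token gives 16,000. A sketch of the conversion (the 2.0 tokens-per-word ratio is the snippet's estimate; the commonly cited figure for English prose is closer to ~0.75 words per token):

        # Convert a token budget to an approximate word count.
        def tokens_to_words(tokens: int, tokens_per_word: float) -> int:
            return int(tokens / tokens_per_word)

        print(tokens_to_words(32_000, 2.0))   # 16000 -- the snippet's estimate
        print(tokens_to_words(32_000, 1.33))  # 24060 -- at ~0.75 words/token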

  9. How much RAM is needed for llama-2 70b + 32k context?

    Jul 24, 2023 · I was testing llama-2 70b (q3_K_S) at 32k context, with the following arguments: -c 32384 --rope-freq-base 80000 --rope-freq-scale 0.5. These seem to be settings for 16k. Since …
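
    For the RAM question in the title, the part that grows with context is the KV cache, which is linear in context length. A rough sketch using Llama-2-70B's published shape (80 layers, grouped-query attention with 8 KV heads of head dimension 128) and an fp16 cache; the quantized (q3_K_S) weights are a separate, roughly fixed cost on top:

        # Rough KV-cache size for Llama-2-70B at a given context length.
        n_layers, n_kv_heads, head_dim = 80, 8, 128  # from the Llama 2 paper
        bytes_per_elem = 2  # fp16 cache assumed

        def kv_cache_gib(ctx_len: int) -> float:
            # the leading factor of 2 covers keys and values
            return 2 * n_layers * n_kv_heads * head_dim * ctx_len * bytes_per_elem / 2**30

        print(kv_cache_gib(4096))   # 1.25 GiB at the native 4k context
        print(kv_cache_gib(32768))  # 10.0 GiB at 32k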

  10. How much should I save as a 23m on 32k - Reddit

    Jul 25, 2023 · How much should I save as a 23m on 32k? Hello, I am 23m and I have a salary of £32,000 a year. After pension, student loan, taxable benefits (company car) etc. I take home …