Showing results for tags 'virtualization'.

Found 185 results

  1. Hi, I have a little experience with Windows virtualization (Hyper-V), and I want to learn more. I'm from India and planning to build a home lab, so can you guys suggest a good, cheap way to build one, with a configuration that's enough for testing and further study in virtualization and automation? Can somebody also suggest online sites for buying lab parts (like eBay.in), and a way to learn deeply enough to become a good professional? Refurbished or used parts are fine with me; they just need to be functional. I've heard that a dual Intel Xeon Six-Core X5650 setup can make a good lab environment. Is it good enough? If it is, can anybody suggest the other parts that should go with it, like which motherboard, fans, etc.?
  2. Some VMware help

    Hey guys, quick question: at my company we use the VMware Horizon Client and awful thin clients to access our network, but the performance is appalling because the server is on another continent. All our work is web-based, but we need the VMs to access a shared drive. Is there a way to access the drive directly from an external device, for example by mapping the drive to a location on a laptop, without having to work in the VM?
  3. I'm going to build a PC based on the Ryzen 7 2700X, and I want to run VMware virtualization software on it, but that needs AMD-V, and I can't find AMD-V listed on the official AMD page. Please answer!
  4. Hello everyone, I'm here to ask for some advice on what I should put together to start building a virtualized environment for gaming on a Windows 10 VM. First of all, my PC has a Ryzen 5 1600, a single GPU (RX 580 8GB), 16GB RAM, a 250GB SSD and a 1TB HDD. With that said, my main questions are:
     - Which OS would be best as a host? Should I use a server OS, or is a desktop one good enough? I saw that Red Hat-based Linux systems are a popular choice in enterprise, but I'm not that familiar with Linux, and Ubuntu-based distros seem more popular, with more troubleshooting material available;
     - Should I work with QEMU or VirtualBox, or is there other software I should use?
     - Do I need additional software to permanently assign ports to VMs?
     - Can you assign CPU power dynamically instead of just reserving cores for each VM?
     Thanks for the help in advance
  5. Server upgrade

    So I'm looking to upgrade the server that I've been using for the last few months, because of dead hardware. I've never really tried to build one from scratch, so I was looking for recommendations. I mostly use it to run VMs for Plex and file shares. I'm looking to build a new one, but if someone can link an old eBay server from a good seller, I've heard that may be a better choice. This is the spec I came up with: https://pcpartpicker.com/user/Benidict/saved/p73THx Edit: I've also been looking at something like this, but the price is really similar to what I specced out and it seems like it would have less power: https://amazon.com/dp/B075MZTQBT/?coliid=I3VDOQCJWCZS32&colid=2XU8BLRD2D5XG&psc=1&ref_=lv_ov_lig_dp_it I'm still really new to the server atmosphere, so please don't be afraid to call me out for being stupid (only if you offer a proper solution). Thanks!
  6. 4 Gamers 1 CPU Success

    It's story time, folks! I work for a non-profit child care center where education is focused around computers. We run Minecraft servers, CodeAcademy, etc. The kids learn how to code, write mods, and are even working on a full-length LEGO stop-motion film. The place is called ArtsROC (artsroc.net). Our computers are pretty old and nasty, not to mention that having over 30 computers takes up a LOT of space. We wanted to start replacing systems while trying to bring down the amount of space the towers take up. Being an LTT fan, I knew there was only one solution: virtualization. To start, we decided on doing a 4 gamers, 1 CPU build. After some Amazon shenanigans, all the parts arrived. We decided on the following parts:
     Core i7 7820X
     MSI X299 Pro Gaming Carbon AC
     Corsair H100i v2
     4x 8GB DDR4 G.Skill Trident Z RGB @ 3000MHz
     5x Kingston A400 120GB SSDs
     Corsair RMx 1000W
     2x GTX 1050
     2x GT 710
     Corsair Crystal Series 570X (love this case)
     The original plan was to basically copy the tutorial from LTT's "2 Gamers 1 CPU" build from a few years ago by using unRAID. The system POSTed fine, and all hardware was detected by unRAID. After following the guide exactly (using Windows 7 Professional ISOs), we could only get 1 VM working. Frustrated, we scrapped unRAID and decided to install Windows 10 Pro and use some janky "multi keyboard/mouse" software instead... but it sucked. Then the weekend came and went. This morning (Monday), I was back at work and determined to get this stupid thing working... my job was on the line. First, I tried Windows MultiPoint Server 2012, a virtualization tool based on Windows Server 2012 designed for this exact type of use. Unfortunately, it refused to create VM templates. Scrapped that idea. I went back to unRAID and tried using Windows 10 ISOs... this time, we couldn't get a single VM working, not to mention that the system kept auto-restarting. It was clear that unRAID wasn't the way to go.
Finally, I installed Windows Server 2016 and enabled MultiPoint Services. @LinusTech himself said "it doesn't look like this will work very well". But I had to try. A couple of hours later... IT WORKED. 4 VMs CREATED, NVIDIA DRIVERS INSTALLED, AND WE ARE GOOD TO GO! I was surprised at how scalable MultiPoint Services is; it should be relatively easy to expand. We need to replace all our systems by the summer, so we're thinking of doing a 16 or 20 gamers, 1 tower build... TAKE THAT, LINUS. But for now, I am calm and satisfied. It works, and I am beside myself. I have no experience in server management or IT, so I was kind of winging it this whole time. Bottom line: it works, and I still have my job. Here are some pics for your enjoyment... Hey Linus, if you wanna partner with ArtsROC and help roll out a 20 gamers, 1 tower build, hit me up
  7. For people who have experience with unRAID virtualization: I've posted this on the unRAID support forums and contacted support, but received no response via either method. Hello! This is my first time attempting any sort of virtualization. The goal is to create an LTT-style 4 gamers on 1 tower. Below are my system specs:
     Core i7 7820X
     MSI X299 Gaming Pro Carbon AC
     32GB DDR4 3000 non-ECC
     2x GTX 1050
     2x GT 710
     5x Kingston A400 120GB SSDs
     The system POSTs just fine and all hardware is detected in the BIOS and by unRAID. First question: since the 7820X doesn't have an onboard GPU, must I use one of the NVIDIA cards for unRAID itself, or can I disable unRAID's video output so I can pass that GPU through to a VM? I don't use the command line for setup, I use the web GUI. I'd like to be able to pass through each GPU so I can have 4 VMs. Second question: in the VMs tool in the web GUI, I've created 4 Windows 7 Pro VMs, but only one will actually turn on. Would switching to Windows 10 VMs make any difference? Third question: we'd like to use 4 of the same mouse and keyboard, one set for each VM. Can we pass a different USB hub through to each VM, then plug our peripherals into those? I don't have access to the system until Monday (it's at work), so I can't send any info. But are these questions/problems common and easy to fix? Thanks!
  8. I'm installing a new Linux system and am planning on running hardware virtualization for a Windows guest. System specs:
     Threadripper 1920X
     64GB RAM
     GTX 1080 Ti
     10TB SSD RAID 0
     I need some recommendations for a Linux distribution that works well with virtualization, specifically hardware passthrough. I know there will be some performance degradation due to the relay, but I'm tired of Windows updates breaking things. Suggestions please.
  9. Hi everyone, I want to repurpose an old Fujitsu Celsius workstation into a cheap Type 1 hypervisor host. It's a single-socket LGA 1366 board. I need some recommendations for a CPU: six or eight cores at 2.5-3.0 GHz. Thanks in advance
  10. Hello there. Next year I'm going to college, and I'll need a laptop that can handle virtualization. I was thinking of an HP Envy x360 15-bq100nd. Does virtualization work on that laptop/CPU?
  11. Hey, quick question (I'm new to posting on the forum): does anyone know if it's possible, on the Asus X299 Deluxe motherboard running unRAID, to pass the included Thunderbolt 3 controller through to a Windows VM?
  12. Hi! I've run into a problem that's making me sick... I'm trying to avoid a classic dual boot in favor of a better solution: a virtual Windows machine. But I can't get past the problem that my GPU won't send any signal to its HDMI or DVI-D port. First, my PC specs:
     CPU: Intel Core i7 6700K
     GPU: GeForce GTX 1070
     MOBO: ASRock Fatal1ty K6 Z170
     RAM: G.Skill Ripjaws V (16 GB)
     I've built linux-vfio from the AUR (thanks to yaourt) and I'm running it right now. I've edited mkinitcpio.conf and added the needed modules. I'm also using systemd-boot instead of GRUB, so I've edited loader.conf and entries/linux-vfio.conf with the proper settings. My vfio.conf and blacklist.conf are in place, and QEMU can use UEFI to boot my virtual machine. I'm using virt-manager, which is configured and ready to work. Here are my configs:
     /etc/mkinitcpio.conf: https://pastebin.com/Thqa3KJ1
     /etc/modprobe.d/vfio.conf: https://pastebin.com/QUac89Wj
     /etc/modprobe.d/blacklist.conf: https://pastebin.com/JPXsUC1Z
     /boot/loader/loader.conf: https://pastebin.com/kPDV428p
     /boot/loader/entries/linux-vfio.conf: https://pastebin.com/XdRmzbwS
     /etc/libvirt/qemu.conf: https://pastebin.com/2p04UXN1
     And some outputs (see the screenshots). I'm sure that I've done everything correctly, but I can't get any output from my GPU under the virtual machine... Any ideas? Help? Please, I've been dealing with this problem for 3 days ;-; And sorry for any mistakes, English isn't my mother tongue.
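For anyone hitting the same wall, the usual checklist for this kind of VFIO passthrough setup looks roughly like the following. This is a sketch only: the 10de:1b81 / 10de:10f0 IDs are the typical GTX 1070 VGA/audio pair, so verify them against your own `lspci -nn` output before using them.

```shell
# 1) Find the GPU's vendor:device IDs (both the VGA and the audio function):
#      lspci -nn | grep -i nvidia
#    A GTX 1070 usually shows up as 10de:1b81 (VGA) and 10de:10f0 (audio).

# 2) /etc/modprobe.d/vfio.conf -- claim BOTH functions for vfio-pci:
#      options vfio-pci ids=10de:1b81,10de:10f0

# 3) The systemd-boot entry (/boot/loader/entries/linux-vfio.conf) must
#    enable the IOMMU on the kernel command line:
#      options root=... rw intel_iommu=on iommu=pt

# 4) After a reboot, verify the kernel driver in use is vfio-pci, not
#    nouveau/nvidia:
#      lspci -nnk | grep -B2 vfio-pci
```

A common cause of "no output" even when all of the above is in place is booting the guest with SeaBIOS while the card's ROM expects UEFI (OVMF), or the host console still grabbing the card before vfio-pci binds, so those are worth double-checking too.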
  13. Dear community, I'm writing on the forum for the first time to get some advice regarding an upgrade to my video setup. I own a system with quite interesting specs: it's a testbench for a bigger project I want to promote, and it serves as a terminal server (or, better said, a VM host) for my family's computing needs. In addition, this computer can be used as a multi-seat gaming machine, both locally and remotely (it's equipped with a Teradici adapter from EVGA). This computer was built with expandability in mind, using an LGA 2011-3 dual-CPU motherboard with more than enough RAM capacity, a lot of PCIe slots, and so on. To keep the overall price under control I kept my old and valuable GTX Titans (yes, the original ones: 6GB of GDDR5, and the first consumer card to have DP units). Now I would like to upgrade the video section of the machine, given that I'm about to buy the still-missing monitor. My choice is the LG 43UD79, for its quality, built-in KVM switch, wonderful PbP (picture-by-picture) modes and more than positive reviews. Given that this 4K screen can be treated as four side-by-side Full HD monitors, and given the multi-seat vocation of the machine, I was leaning toward getting four single-slot GPUs. My GTX Titans are dual-slot because of the DVI port (which I don't want to chop off), and they already fill the physical expansion options of both my case and my motherboard. Moving from 2 dual-slot GPUs to 4 single-slot ones gives me two advantages, especially combined with the splitting capabilities of the monitor: Easy peasy lemon squeezy: with four GPUs I will be able to virtualize four gaming/workstation machines. Combined with the display I'm going to buy, this will allow a console-like experience with as many as 4 players in front of the same split screen. Combined with Teradici tech, I'll be able to stream up to 4 VMs across the LAN (WAN).
SLI without SLI: a single (virtual or physical) machine can use the 4K display as if it were an array of 2x2 Full HD displays. This might allow good-yet-not-perfect scalability of 3D applications independently of SLI support in the guest OS (think about Unix... and I say Unix, and not only Linux, for a specific reason). For this reason I first aimed at the Galax GTX 1070 Katana cards. Then I thought I could push it harder by using something like a Tesla K1, K2 or M10, but that second option was killed by a quick search on the pricing of such server cards, plus their complete lack of local video outputs... In addition, led by some LTT videos about playing on super-high-res monitor arrays, I started thinking about the "jelly effect" caused by the lack of sync between adjacent displays, and I dug a little deeper. At the moment I'm quite impressed with the performance per watt and the characteristics of the NVIDIA Quadro P4000. It seems to be the perfect fit for my project: single slot, ~100W power consumption, sync-able, NVIDIA (I work with GPU programming and I want to be able to run both OpenCL and CUDA) and, even for a Quadro, quite impressive in 3D performance compared to my current setup; last but not least, there's the virtualization support (read: enablement) of the Quadro cards. I have, however, one big complaint. My system is built for GPU water cooling, which seems to be a no-go with Quadros. Using 4 single-slot GPUs in 4 adjacent slots could easily lead to overheating. I would like to have some thermal performance data, but I cannot find any. At the moment I'm thinking of mounting the GPUs in a Y-N-Y pattern, putting a shorter PCIe card (NVMe SSD, audio card, Teradici adapter) between each pair of cards. So here are my questions: how do you feel about the situation? Do you have other suggestions for single-slot GPUs that can handle this workload? Have you ever played on a Quadro P4000?
Have you ever dealt with the Sync technology? Just share your thoughts and help me make this tough decision! Thanks in advance, Slid
  14. Hi all, I would like to ask for some assistance: how can I delete or merge a snapshot I created in Oracle VirtualBox? It's taking up a huge amount of space on my drive. Thanks
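Not an official answer, but VirtualBox's own command-line tool handles this. A minimal sketch, assuming a VM named "MyVM" and a snapshot named "Snapshot 1" (both placeholders for your own names):

```shell
# list the VM's snapshots to find the exact name or UUID
VBoxManage snapshot "MyVM" list

# "delete" merges the snapshot's differencing disk into its parent,
# which is what actually frees the space on the host drive
VBoxManage snapshot "MyVM" delete "Snapshot 1"

# optionally shrink the remaining dynamically-allocated disk afterwards
VBoxManage modifymedium disk "/path/to/MyVM.vdi" --compact
```

The GUI equivalent is right-clicking the snapshot in the Snapshots pane and choosing Delete; despite the name, that merges rather than discards the data.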
  15. Most Intel and AMD processors have a virtualization feature; you can check it under Task Manager > Performance. I'm wondering whether to leave that feature enabled or disabled. Is there any impact on performance? As far as I know, virtualization is commonly used when a user wants to create a virtual PC on the single processor the original PC already has. So what about you guys? Any advice on this feature?
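That BIOS toggle mostly just controls whether hypervisors (Hyper-V, VirtualBox, VMware) can use the CPU's hardware assist; leaving it enabled generally has no meaningful performance impact until a VM is actually running. For anyone checking from Linux instead of Task Manager, the CPU flags tell the same story (a small sketch):

```shell
# vmx = Intel VT-x, svm = AMD-V; a match means the CPU supports hardware
# virtualization (though it can still be disabled in the BIOS/UEFI)
if grep -qE 'vmx|svm' /proc/cpuinfo; then
    echo "hardware virtualization: supported"
else
    echo "hardware virtualization: not reported (check BIOS/UEFI)"
fi
```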
  16. Guys help me

    I don't know if the motherboard I'm planning to buy supports virtualization. I'm planning to buy an Asus B350-Plus paired with an AMD Ryzen 5 1600. I'm planning to use it as a NAS (used from time to time) and as a gaming rig (GTX 1060, 16GB of RAM). I'm planning to have four 1TB HDDs and one 120GB Samsung 850 EVO; the four HDDs will be in a RAID 5 array. How powerful a PSU do I need, and how much RAM should I give to the NAS versus Windows? Thanks for all your help! Jack
  17. I'm constructing a 19-card mining rig using:
     Asus B250 Mining Expert MOBO
     6x Vega 56
     1x Vega 64
     12x GTX 1060
     Pentium G4560
     8GB RAM
     1TB HDD
     There's an 8-card limit in the drivers, so I was wondering about running multiple VMs simultaneously, each with its own set of cards. Preferably I would like to use Windows. Virtualization is where my computer knowledge falls off, so any pointers are welcome!
  18. Long Story Short: hello there! I need a machine that can handle some intense coding sessions with two monitors and various programs open at all times. There will be minimal to no gaming on this machine, at least until I add a graphics card at a later date. I would also like to be able to run two VMs, so two people can code at the same time on separate monitors from the same machine. I live in Cyprus and I shop across the EU in euros. My budget is 1000 euros, and it's for the tower only. The build must support at least two monitors. Current parts: I took the liberty of selecting a few parts for starters. You can improve on this list or completely ignore it and suggest something else. All links are from a Greek site that delivers to Cyprus.
     CPU: Intel i5 8600K @ 279e
     MB: Gigabyte Z370 HD3P @ 164,99e
     PSU: Corsair SF450 @ 94,90e
     RAM: Corsair Vengeance LPX 16GB DDR4 @ 2400MHz @ 205e
     SSD: Samsung 850 EVO 250GB @ 93,90e
     CASE: Corsair Carbide 100R @ 53,50e
     TOTAL: 891,29e plus around 27e shipping.
     Concerns: will this power supply be enough if I add a 1060/70/80 after a few months? Will said card fit in this case? Should I go with a Ryzen build instead? What about graphics in that case? Original, much more detailed post
  19. Hey all, I'd like to spend some money on a new gaming/video-editing system for me and my wife. I saw Linus's older "2 Gaming Rigs - 1 Tower" video and loved the idea of having only 1 case sitting around while still having enough power to run at least 2 operating systems at full load without affecting each other's experience. Before I go into detail about the budget, the exact specs and so on, I'd like to know if there is a significant difference in the power of each virtualized rig compared to separate systems. And another, more specific question: let's take the case of a virtualized 16-core system (for simplicity: 8 cores per system). Will power consumption be, more or less, halved if I only boot one of the systems compared to starting both? Thanks in advance, Simson
  20. I may be in way over my head here: I want to split my computer into 2 virtual computers, and I have a couple of questions about the hardware and software behind it. I have a 5800K APU and an R7 260X with 8 gigabytes of DDR3. Is my hardware good enough for virtualization? I'm not sure if you can split integrated graphics alongside dedicated graphics. Do I need 2 separate operating systems on different drives, or can 1 drive with some virtual machines work? How does one split USB ports between the different OSes? Yes, you have my permission to call me lacking in knowledge about these things.
  21. Previously, it was cheaper to just buy a few modestly spec'ed PCs instead of getting a high-core-count PC and setting up a multi-user environment. But now, with enormous multicore CPUs being affordable, we think it makes more sense than ever for a single PC to serve multiple users at the same time. We can plug in multiple monitors, keyboards and mice and just use them. My teammates and I think this is the future, and we want to make this concept easily accessible to everyone. I want to hear what you all think of it. While there are many guides on how to set this up, almost all of them are overwhelmingly complicated, involving carefully matched hardware and advanced Linux techniques. As a person who was interested in this tech years before 6- and 8-core mainstream CPUs were a thing, I can comfortably say that the UX was, and is, horrible. We had to use multiple graphics cards since QEMU's integrated graphics stack is not performant, and USB devices were a nightmare, especially when used headlessly. We wanted to avoid all of these major issues and let non-techies use this advanced tech as well. People can simply avoid these kinds of solutions and get cheap alternatives such as a Raspberry Pi or a Chromebook. But still, there's nothing like running a full, proper desktop OS such as Windows, and that's what we're targeting. Using multi-session RDP is also not comparable, imo, as it still needs some kind of client device to log in remotely. I know many people will wonder how gaming will be handled. While the majority of people here are gamers, we're currently targeting value over performance: people who are fine with iGPU-level performance. This means we want to enable multiple OSes to share a single graphics card, which requires passing a virtualized graphics device to each OS, which in turn means worse graphics performance.
We have been able to achieve 1080p60-ish performance with 4 concurrent users on a single RX 460, while QXL, the stock graphics driver in QEMU, doesn't even come close. We want to wait for SR-IOV-capable graphics cards before experimenting with gaming. I'm sorry if this sounds way too much like an advertisement, but I'm genuinely curious: what do you think? We think this is the future. If there were a simple and elegant solution for this concept, we think the entire world could save a serious amount of resources (hardware, power, waste, etc.).
  22. I do a fair bit of virtualization, a lot less gaming than I used to, more and more scraping, and VoIP... sometimes all at the same time. I know... YOLO. I had 16 GB of RAM but was starting to feel it drag. It isn't dragging anymore, but now I want to squeeze every last bit of performance out of this system. Where do I start? Third-party software? Power profiles? Advanced system settings? Is there a knob somewhere I can crank up to the max? In VirtualBox I can literally increase the allocated memory with a slider. What about my native system, Windows 10? Other key specs:
     i7-7700HQ @ 2.8 GHz
     250 GB SSD to boot from
     1 TB HDD storage
     GTX 1050 Ti
     Something else cool that I can't think of here
     Oh, by the way, I'm on a 32-bit operating system. JK, it's 64. All suggestions are super welcome, eh!
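On the VirtualBox side at least, that slider has a scriptable equivalent, for what it's worth. A sketch, with "MyVM" as a placeholder name:

```shell
# the VM must be powered off first; --memory is in MB
VBoxManage modifyvm "MyVM" --memory 8192 --cpus 4

# confirm the new values took effect
VBoxManage showvminfo "MyVM" | grep -E 'Memory size|Number of CPUs'
```

The Windows 10 host itself has no comparable single knob; the usual levers are the High Performance power plan, paging-file settings, and trimming startup programs.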
  23. I need some advice to know if my PC can run 2 virtual machines for gaming purposes, basically the same thing Linus built in this video. A few games I want to play simultaneously or run 2 instances of: Resident Evil 7, Monster Hunter: World (unreleased), Dark Souls 1/2/3, Assetto Corsa. My specs are:
     Intel 4930K (with ~5-10% OC potential, lol)
     Gigabyte GA-X79S-UP5-WIFI
     24GB DDR3 RAM
     512GB SSD + 1TB SSHD, plus a 2TB NAS
     GTX 780 Ti
     Seasonic 650W PSU (individually sleeved!)
     Custom water cooling
     Can my CPU handle 2 VMs running AAA games? I'm aware I'll need to get another GPU, or two new ones. Also, is there a better way to build what Linus built, since the video is 2 years old now?
  24. I work in my college's IT department, and for the last couple of days I've been helping distribute GPU power between different virtual machines. What we're trying to use is two 10-core Xeons and two GTX 1080s, running Windows Server. Any idea of the best way to do this? The purpose is running programs that require more graphical power than the NUCs in the college's labs can deliver.
  25. Aim: I'm putting together a computer for flashing firmware via USB onto some custom hardware I'm manufacturing. Unfortunately, the flashing software will only run on Windows XP, and I can only flash one board at a time. The flashing process itself isn't memory- or processing-intensive in any way, though, and it will run fine in a virtual machine; I've tested it with unRAID and it ran fine. However, each instance needs a separate PCIe USB controller passed through to it, both because I need to be able to easily hot-swap the boards as they finish, and because of a bug in libusb that causes the boards to detach whenever I've tried to program them. Budget: however much it costs. Note that I'm interested more in the per-VM cost. If it's cheaper to just build a bunch of separate low-end computers or use a blade-server type deal, those are also valid options. Monitors: none; everything gets controlled remotely. Peripherals: only the board being flashed. One possible build I'm considering:
     SuperMicro SYS-5019A-FTN4, $900 : https://newegg.com/Product/Product.aspx?Item=9SIA5EM67J7117
     1x Hynix 8GB DDR4-2400 ECC UDIMM, $150 : https://newegg.com/Product/Product.aspx?Item=9SIA5EM62K8176 (validated)
     1x Samsung SM961 128GB NVMe SSD, $130 : https://newegg.com/Product/Product.aspx?Item=9SIA5EM5Z40944 (validated)
     1x SuperMicro PCIe x8 riser, $15 : https://newegg.com/Product/Product.aspx?Item=9SIA5EM2JS8113
     1x StarTech 4-port Quad Bus PCIe USB 3.0 Adapter, $90 : https://newegg.com/Product/Product.aspx?Item=9SIAB274849872
     unRAID, $60
     Total = $1345 for four VMs, or about $335 each
     The hope is that it will be possible to split up the quad-bus USB card and pass the separate controllers through to 4 different VMs, but I'm not 100% sure that will actually work. If there's some relatively cheap server board with a lot of PCIe x1 slots, I might consider using that with just a whole lot of separate USB adapters, but I haven't researched that proposition yet.
Any other possible sorts of builds or options I should consider here?
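On the "not 100% sure" point about splitting the quad-bus card: that hinges on each of its four controllers appearing as a separate PCI function in its own IOMMU group, which can be checked from any Linux live USB before committing to the build. A generic sketch, not unRAID-specific:

```shell
# each passthrough-able USB controller should be its own PCI function
lspci -nn | grep -i usb || echo "(no discrete USB controllers listed)"

# a controller can only be passed through cleanly if its IOMMU group
# contains nothing else the host needs to keep
if [ -d /sys/kernel/iommu_groups ] && [ -n "$(ls -A /sys/kernel/iommu_groups)" ]; then
    for g in /sys/kernel/iommu_groups/*; do
        echo "IOMMU group ${g##*/}:"
        for d in "$g"/devices/*; do
            echo "  ${d##*/}"
        done
    done
else
    echo "no IOMMU groups exposed (IOMMU disabled or unsupported here)"
fi
```

If all four controllers land in one group, ACS override might still split them on unRAID, but that's best treated as a fallback rather than a plan.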