<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom"><title>Ed Parcell's Blog</title><link href="https://www.edparcell.com/" rel="alternate"/><link href="https://www.edparcell.com/feeds/all.atom.xml" rel="self"/><id>https://www.edparcell.com/</id><updated>2023-09-02T17:00:00-06:00</updated><entry><title>Remote Desktop for Pair Programming Interview</title><link href="https://www.edparcell.com/remote-desktop.html" rel="alternate"/><published>2023-09-02T17:00:00-06:00</published><updated>2023-09-02T17:00:00-06:00</updated><author><name>Ed Parcell</name></author><id>tag:www.edparcell.com,2023-09-02:/remote-desktop.html</id><summary type="html">&lt;p&gt;A seamless, cost-effective shared cloud desktop setup for Python pair programming interviews that outperforms traditional screen-sharing platforms in usability and latency.&lt;/p&gt;</summary><content type="html">&lt;p&gt;Some of the best interview processes I've been through have required me to write code - the job will require coding, so it is only natural to check that people can actually code before making a hire. I've found that companies that do this typically have competent colleagues and, as a result, tend to be highly productive environments. To adopt a similar process with remote candidates, I've created a shared cloud desktop setup that we use to evaluate interview candidates' Python skills. I thought it might be useful to others, so I am sharing it here.&lt;/p&gt;
&lt;h2&gt;About This Setup&lt;/h2&gt;
&lt;p&gt;We use Apache Guacamole running on a standard Linux virtual machine on Digital Ocean. Guacamole serves an X desktop session, hosted by a VNC server, over the web. On top of that we set up a Python environment with a suitable set of packages, then install VS Code and JupyterLab. There are a few driving forces behind these choices:&lt;/p&gt;
&lt;p&gt;Concurrency: Two or more people can be remoted into the cloud desktop at the same time, which is critical for pair-programming interviews.&lt;/p&gt;
&lt;p&gt;Usability: The low latency means that both interviewer and interviewee can interact with the code almost as if they were sitting side by side. &lt;/p&gt;
&lt;p&gt;Cheap: Our setup can be run on standard cloud VMs for well under $1/hour.&lt;/p&gt;
&lt;p&gt;Browser Accessibility: One of the key goals was to minimize the setup required for the candidate. Providing access through the browser means no cumbersome software installation, allowing us to focus on the task at hand.&lt;/p&gt;
&lt;p&gt;Security: Because we never share our own desktops, there is no risk of a candidate seeing sensitive information in an email notification, for example. And because a new environment is quick to create, we can tear the whole thing down after each interview.&lt;/p&gt;
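&lt;p&gt;For what it's worth, the create-and-destroy cycle can be scripted with Digital Ocean's &lt;code&gt;doctl&lt;/code&gt; CLI. The sketch below is illustrative rather than exact - the droplet name, region, size slug and SSH key fingerprint are placeholders to adapt to your own account:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;code&gt;# Create a fresh interview droplet (Ubuntu 22.04 x64) and wait for it to boot
doctl compute droplet create interview-sandbox \
    --image ubuntu-22-04-x64 --size s-4vcpu-16gb --region nyc1 \
    --ssh-keys YOUR_KEY_FINGERPRINT --wait

# Find its public IP for the Guacamole URL
doctl compute droplet list --format Name,PublicIPv4

# Tear it down after the interview
doctl compute droplet delete -f interview-sandbox
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;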
&lt;p&gt;Compared to alternatives like Zoom and Teams, we like the low latency of our approach - the remote desktop would be quite usable as an everyday environment. We also looked at using a VDI provider such as AWS WorkSpaces, together with remote connectivity software such as TeamViewer, but disliked the subscription cost for solutions that would sit idle most of the time. Let me just say that I think those are all great solutions, but they are great solutions for other problems - we wanted a usable but disposable sandbox development environment.&lt;/p&gt;
&lt;h2&gt;My Interviewing Philosophy&lt;/h2&gt;
&lt;p&gt;When I interview for Python-related roles, my objective isn’t to set up a series of hoops for candidates to jump through. Instead, it's about witnessing firsthand how a potential team member thinks, codes, and collaborates. Here's my approach:&lt;/p&gt;
&lt;p&gt;Hands-on Collaborative Exercises: Different positions require varied skill sets. Depending on the role, candidates might find themselves cleaning up a dataset using pandas, diving deep into data analysis, or designing an algorithm. We aim to make the tasks representative - we want to assess skills that will actually be used on the job.&lt;/p&gt;
&lt;p&gt;Open Book Policy: The real world isn't an examination hall. Developers routinely consult Google, StackOverflow, or tap the collective wisdom of their peers. Recognizing this, our exercises are open book. Candidates can freely seek out resources or even ask me questions directly.&lt;/p&gt;
&lt;p&gt;Supportive Environment: I'm acutely aware of the stress of coding while being observed. That's why, if I spot errors that are going to eat time, I'll step in, offering fixes and guidance. The goal isn't to trip up the candidate but to put them in the best light and see how they respond, adapt, and move forward.&lt;/p&gt;
&lt;h2&gt;How to create the shared cloud desktop&lt;/h2&gt;
&lt;p&gt;To create a shared cloud desktop:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Modify the shell script below. Specifically, replace &lt;code&gt;GUACAMOLE_USER&lt;/code&gt;, &lt;code&gt;GUACAMOLE_PASSWORD&lt;/code&gt;, &lt;code&gt;VNC_PASSWORD&lt;/code&gt; and &lt;code&gt;UNIX_USER_PASSWORD&lt;/code&gt;. Remember them - you will need them later. Use temporary passwords - you will have to provide &lt;code&gt;GUACAMOLE_USER&lt;/code&gt; and &lt;code&gt;GUACAMOLE_PASSWORD&lt;/code&gt; to the candidate.&lt;/li&gt;
&lt;li&gt;Create a Digital Ocean droplet. Use an Ubuntu 22.04 LTS x64 image. Make sure you provide an appropriate authentication method. This recipe has been tested on a Digital Ocean droplet with 4 CPUs and 16GB RAM. It takes about 7 minutes for the script to run. Other VMs or OS versions may work, but you will probably need to adapt the script.&lt;/li&gt;
&lt;li&gt;Connect to your VM with ssh. Log in as root.&lt;/li&gt;
&lt;li&gt;Copy the shell script to your virtual machine. The simplest way to do this is to run &lt;code&gt;nano b.sh&lt;/code&gt;, copy the text in, then ctrl-O to save and ctrl-X to exit.&lt;/li&gt;
&lt;li&gt;Run &lt;code&gt;chmod 700 b.sh&lt;/code&gt; and then &lt;code&gt;./b.sh&lt;/code&gt; to execute the script. It will install packages, build Guacamole, configure Tomcat, install Chrome and VS Code, create a user called "ed", and create a Python virtual environment as "ed".&lt;/li&gt;
&lt;li&gt;Switch to user "ed" (&lt;code&gt;su - ed&lt;/code&gt;) and run VNC with &lt;code&gt;vncserver :1 -geometry 1920x1080&lt;/code&gt;. Guacamole will connect to this session and serve it over the web. The first time you run vncserver, it will prompt you to choose a password - use the &lt;code&gt;VNC_PASSWORD&lt;/code&gt; from step 1; if the two do not match, Guacamole will not be able to connect to VNC and expose the desktop to the web.&lt;/li&gt;
&lt;li&gt;In your browser, go to &lt;code&gt;http://[DROPLET IP]:8080/guacamole&lt;/code&gt; and log in with the &lt;code&gt;GUACAMOLE_USER&lt;/code&gt; and &lt;code&gt;GUACAMOLE_PASSWORD&lt;/code&gt; you set in step 1. You should now see the desktop environment in your browser.&lt;/li&gt;
&lt;li&gt;Start a terminal in the desktop and do &lt;code&gt;source ~/jupyter_env/bin/activate&lt;/code&gt; to activate the Python virtual environment, then run &lt;code&gt;jupyter-lab --browser=google-chrome&lt;/code&gt; to start Jupyter.&lt;/li&gt;
&lt;li&gt;You can run VS Code from the start menu at the bottom-left of the screen.&lt;/li&gt;
&lt;li&gt;Give the candidate the URL &lt;code&gt;http://[DROPLET IP]:8080/guacamole&lt;/code&gt; and the &lt;code&gt;GUACAMOLE_USER&lt;/code&gt; and &lt;code&gt;GUACAMOLE_PASSWORD&lt;/code&gt; so they can connect too.&lt;/li&gt;
&lt;li&gt;Enjoy your shared session!&lt;/li&gt;
&lt;/ol&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="ch"&gt;#!/bin/bash&lt;/span&gt;

&lt;span class="nb"&gt;set&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;-euxo&lt;span class="w"&gt; &lt;/span&gt;pipefail
&lt;span class="nb"&gt;trap&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;echo &amp;quot;Error: Command failed. Exiting.&amp;quot; &amp;gt;&amp;amp;2&amp;#39;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;ERR

&lt;span class="k"&gt;if&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;[[&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;$EUID&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;-ne&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="m"&gt;0&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;]]&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;then&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;This script must be run as root&amp;quot;&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="nb"&gt;exit&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="m"&gt;1&lt;/span&gt;
&lt;span class="k"&gt;fi&lt;/span&gt;

&lt;span class="nb"&gt;export&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nv"&gt;DEBIAN_FRONTEND&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;noninteractive
apt&lt;span class="w"&gt; &lt;/span&gt;update
apt&lt;span class="w"&gt; &lt;/span&gt;upgrade&lt;span class="w"&gt; &lt;/span&gt;-y&lt;span class="w"&gt; &lt;/span&gt;--no-install-recommends
apt&lt;span class="w"&gt; &lt;/span&gt;install&lt;span class="w"&gt; &lt;/span&gt;-y&lt;span class="w"&gt; &lt;/span&gt;build-essential&lt;span class="w"&gt; &lt;/span&gt;libcairo2-dev&lt;span class="w"&gt; &lt;/span&gt;libjpeg-turbo8-dev&lt;span class="w"&gt; &lt;/span&gt;libpng-dev&lt;span class="w"&gt; &lt;/span&gt;libtool-bin&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;libossp-uuid-dev&lt;span class="w"&gt; &lt;/span&gt;libvncserver-dev&lt;span class="w"&gt; &lt;/span&gt;freerdp2-dev&lt;span class="w"&gt; &lt;/span&gt;libssh2-1-dev&lt;span class="w"&gt; &lt;/span&gt;libtelnet-dev&lt;span class="w"&gt; &lt;/span&gt;libwebsockets-dev&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;libpulse-dev&lt;span class="w"&gt; &lt;/span&gt;libvorbis-dev&lt;span class="w"&gt; &lt;/span&gt;libwebp-dev&lt;span class="w"&gt; &lt;/span&gt;tomcat9&lt;span class="w"&gt; &lt;/span&gt;tomcat9-admin&lt;span class="w"&gt; &lt;/span&gt;tomcat9-common&lt;span class="w"&gt; &lt;/span&gt;tomcat9-user&lt;span class="w"&gt; &lt;/span&gt;nginx&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;libavcodec-dev&lt;span class="w"&gt; &lt;/span&gt;libavutil-dev&lt;span class="w"&gt; &lt;/span&gt;libswscale-dev&lt;span class="w"&gt; &lt;/span&gt;libfreerdp-client2-2&lt;span class="w"&gt; &lt;/span&gt;libpango1.0-dev&lt;span class="w"&gt; &lt;/span&gt;libssh-dev&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;libssl-dev&lt;span class="w"&gt; &lt;/span&gt;libvorbis-dev&lt;span class="w"&gt; &lt;/span&gt;libwebp-dev&lt;span class="w"&gt; &lt;/span&gt;python3-pip&lt;span class="w"&gt; &lt;/span&gt;xfce4&lt;span class="w"&gt; &lt;/span&gt;xfce4-goodies&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;tightvncserver&lt;span class="w"&gt; &lt;/span&gt;lxde&lt;span class="w"&gt; &lt;/span&gt;tmux&lt;span class="w"&gt; &lt;/span&gt;python3.10-venv&lt;span class="w"&gt; &lt;/span&gt;software-properties-common&lt;span class="w"&gt; &lt;/span&gt;apt-transport-https&lt;span class="w"&gt; &lt;/span&gt;wget

wget&lt;span class="w"&gt; &lt;/span&gt;https://apache.org/dyn/closer.lua/guacamole/1.5.3/source/guacamole-server-1.5.3.tar.gz?action&lt;span class="o"&gt;=&lt;/span&gt;download&lt;span class="w"&gt; &lt;/span&gt;-O&lt;span class="w"&gt; &lt;/span&gt;guacamole-server-1.5.3.tar.gz
tar&lt;span class="w"&gt; &lt;/span&gt;-xvf&lt;span class="w"&gt; &lt;/span&gt;guacamole-server-1.5.3.tar.gz
&lt;span class="nb"&gt;pushd&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;guacamole-server-1.5.3/
./configure&lt;span class="w"&gt; &lt;/span&gt;--with-init-dir&lt;span class="o"&gt;=&lt;/span&gt;/etc/init.d
make
make&lt;span class="w"&gt; &lt;/span&gt;install
&lt;span class="nb"&gt;popd&lt;/span&gt;
ldconfig
systemctl&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;enable&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;guacd
systemctl&lt;span class="w"&gt; &lt;/span&gt;start&lt;span class="w"&gt; &lt;/span&gt;guacd

wget&lt;span class="w"&gt; &lt;/span&gt;https://apache.org/dyn/closer.lua/guacamole/1.5.3/binary/guacamole-1.5.3.war?action&lt;span class="o"&gt;=&lt;/span&gt;download&lt;span class="w"&gt; &lt;/span&gt;-O&lt;span class="w"&gt; &lt;/span&gt;guacamole-1.5.3.war
mkdir&lt;span class="w"&gt; &lt;/span&gt;-p&lt;span class="w"&gt; &lt;/span&gt;/etc/guacamole
mv&lt;span class="w"&gt; &lt;/span&gt;guacamole-1.5.3.war&lt;span class="w"&gt; &lt;/span&gt;/etc/guacamole/guacamole.war
ln&lt;span class="w"&gt; &lt;/span&gt;-s&lt;span class="w"&gt; &lt;/span&gt;/etc/guacamole/guacamole.war&lt;span class="w"&gt; &lt;/span&gt;/var/lib/tomcat9/webapps/
mkdir&lt;span class="w"&gt; &lt;/span&gt;-p&lt;span class="w"&gt; &lt;/span&gt;/etc/guacamole/&lt;span class="o"&gt;{&lt;/span&gt;extensions,lib&lt;span class="o"&gt;}&lt;/span&gt;

&lt;span class="nb"&gt;echo&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;guacd-hostname: ::1&lt;/span&gt;
&lt;span class="s2"&gt;guacd-port: 4822&lt;/span&gt;
&lt;span class="s2"&gt;user-mapping: /etc/guacamole/user-mapping.xml&amp;quot;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&amp;gt;&lt;span class="w"&gt; &lt;/span&gt;/etc/guacamole/guacamole.properties

mkdir&lt;span class="w"&gt; &lt;/span&gt;-p&lt;span class="w"&gt; &lt;/span&gt;/usr/share/tomcat9/.guacamole/
ln&lt;span class="w"&gt; &lt;/span&gt;-s&lt;span class="w"&gt; &lt;/span&gt;/etc/guacamole/guacamole.properties&lt;span class="w"&gt; &lt;/span&gt;/usr/share/tomcat9/.guacamole/

cat&lt;span class="w"&gt; &lt;/span&gt;&amp;gt;&lt;span class="w"&gt; &lt;/span&gt;/etc/guacamole/user-mapping.xml&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s"&gt;&amp;lt;&amp;lt;EOL&lt;/span&gt;
&lt;span class="s"&gt;&amp;lt;user-mapping&amp;gt;&lt;/span&gt;
&lt;span class="s"&gt;    &amp;lt;authorize username=&amp;quot;GUACAMOLE_USER&amp;quot; password=&amp;quot;GUACAMOLE_PASSWORD&amp;quot;&amp;gt;&lt;/span&gt;
&lt;span class="s"&gt;        &amp;lt;connection name=&amp;quot;Ubuntu Desktop&amp;quot;&amp;gt;&lt;/span&gt;
&lt;span class="s"&gt;            &amp;lt;protocol&amp;gt;vnc&amp;lt;/protocol&amp;gt;&lt;/span&gt;
&lt;span class="s"&gt;            &amp;lt;param name=&amp;quot;hostname&amp;quot;&amp;gt;localhost&amp;lt;/param&amp;gt;&lt;/span&gt;
&lt;span class="s"&gt;            &amp;lt;param name=&amp;quot;port&amp;quot;&amp;gt;5901&amp;lt;/param&amp;gt;&lt;/span&gt;
&lt;span class="s"&gt;            &amp;lt;param name=&amp;quot;password&amp;quot;&amp;gt;VNC_PASSWORD&amp;lt;/param&amp;gt;&lt;/span&gt;
&lt;span class="s"&gt;        &amp;lt;/connection&amp;gt;&lt;/span&gt;
&lt;span class="s"&gt;    &amp;lt;/authorize&amp;gt;&lt;/span&gt;
&lt;span class="s"&gt;&amp;lt;/user-mapping&amp;gt;&lt;/span&gt;
&lt;span class="s"&gt;EOL&lt;/span&gt;

ufw&lt;span class="w"&gt; &lt;/span&gt;allow&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="m"&gt;4822&lt;/span&gt;,8080,8888/tcp

wget&lt;span class="w"&gt; &lt;/span&gt;https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
apt&lt;span class="w"&gt; &lt;/span&gt;install&lt;span class="w"&gt; &lt;/span&gt;./google-chrome-stable_current_amd64.deb&lt;span class="w"&gt; &lt;/span&gt;-y

wget&lt;span class="w"&gt; &lt;/span&gt;-q&lt;span class="w"&gt; &lt;/span&gt;https://packages.microsoft.com/keys/microsoft.asc&lt;span class="w"&gt; &lt;/span&gt;-O-&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;apt-key&lt;span class="w"&gt; &lt;/span&gt;add&lt;span class="w"&gt; &lt;/span&gt;-
add-apt-repository&lt;span class="w"&gt; &lt;/span&gt;-y&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;deb [arch=amd64] https://packages.microsoft.com/repos/vscode stable main&amp;quot;&lt;/span&gt;
apt&lt;span class="w"&gt; &lt;/span&gt;install&lt;span class="w"&gt; &lt;/span&gt;-y&lt;span class="w"&gt; &lt;/span&gt;code


systemctl&lt;span class="w"&gt; &lt;/span&gt;restart&lt;span class="w"&gt; &lt;/span&gt;tomcat9

&lt;span class="k"&gt;if&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;id&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;ed&amp;quot;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;&amp;amp;&lt;/span&gt;&amp;gt;/dev/null&lt;span class="p"&gt;;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="k"&gt;then&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;User ed already exists. Skipping user creation.&amp;quot;&lt;/span&gt;
&lt;span class="k"&gt;else&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;useradd&lt;span class="w"&gt; &lt;/span&gt;-m&lt;span class="w"&gt; &lt;/span&gt;ed&lt;span class="w"&gt; &lt;/span&gt;-s&lt;span class="w"&gt; &lt;/span&gt;/bin/bash
&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;ed:UNIX_USER_PASSWORD&amp;quot;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;chpasswd
&lt;span class="k"&gt;fi&lt;/span&gt;

su&lt;span class="w"&gt; &lt;/span&gt;-&lt;span class="w"&gt; &lt;/span&gt;ed&lt;span class="w"&gt; &lt;/span&gt;-c&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;mkdir -p ~/.vnc&amp;quot;&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;#!/bin/sh&lt;/span&gt;
&lt;span class="s1"&gt;xrdb $HOME/.Xresources&lt;/span&gt;
&lt;span class="s1"&gt;startlxde &amp;amp;&amp;#39;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&amp;gt;&lt;span class="w"&gt; &lt;/span&gt;/home/ed/.vnc/xstartup
chown&lt;span class="w"&gt; &lt;/span&gt;ed:ed&lt;span class="w"&gt; &lt;/span&gt;/home/ed/.vnc/xstartup
chmod&lt;span class="w"&gt; &lt;/span&gt;+x&lt;span class="w"&gt; &lt;/span&gt;/home/ed/.vnc/xstartup

cat&lt;span class="w"&gt; &lt;/span&gt;&amp;gt;&lt;span class="w"&gt; &lt;/span&gt;/home/ed/requirements.txt&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s"&gt;&amp;lt;&amp;lt;EOL&lt;/span&gt;
&lt;span class="s"&gt;jupyterlab&lt;/span&gt;
&lt;span class="s"&gt;pandas&lt;/span&gt;
&lt;span class="s"&gt;matplotlib&lt;/span&gt;
&lt;span class="s"&gt;numpy&lt;/span&gt;
&lt;span class="s"&gt;seaborn&lt;/span&gt;
&lt;span class="s"&gt;attrs&lt;/span&gt;
&lt;span class="s"&gt;EOL&lt;/span&gt;
chown&lt;span class="w"&gt; &lt;/span&gt;ed:ed&lt;span class="w"&gt; &lt;/span&gt;/home/ed/requirements.txt

su&lt;span class="w"&gt; &lt;/span&gt;-&lt;span class="w"&gt; &lt;/span&gt;ed&lt;span class="w"&gt; &lt;/span&gt;-c&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;python3 -m venv ~/jupyter_env&amp;quot;&lt;/span&gt;
su&lt;span class="w"&gt; &lt;/span&gt;-&lt;span class="w"&gt; &lt;/span&gt;ed&lt;span class="w"&gt; &lt;/span&gt;-c&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;source ~/jupyter_env/bin/activate &amp;amp;&amp;amp; pip install -r ~/requirements.txt&amp;quot;&lt;/span&gt;
su&lt;span class="w"&gt; &lt;/span&gt;-&lt;span class="w"&gt; &lt;/span&gt;ed&lt;span class="w"&gt; &lt;/span&gt;-c&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;mkdir ~/notebooks&amp;quot;&lt;/span&gt;

&lt;span class="nb"&gt;echo&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s"&gt;&amp;lt;&amp;lt; EOL&lt;/span&gt;
&lt;span class="s"&gt;    Run as ed: vncserver :1 -geometry 1920x1080&lt;/span&gt;
&lt;span class="s"&gt;    Then connect to http://[DROPLET IP]:8080/guacamole and run the following in a terminal:&lt;/span&gt;
&lt;span class="s"&gt;    source ~/jupyter_env/bin/activate&lt;/span&gt;
&lt;span class="s"&gt;    jupyter-lab --browser=google-chrome&lt;/span&gt;
&lt;span class="s"&gt;EOL&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</content><category term="Howtos"/></entry><entry><title>Real-time Object Detection with OpenCV and YOLO v7</title><link href="https://www.edparcell.com/pgw-yolov7.html" rel="alternate"/><published>2023-04-30T14:10:00-06:00</published><updated>2023-04-30T14:10:00-06:00</updated><author><name>Ed Parcell</name></author><id>tag:www.edparcell.com,2023-04-30:/pgw-yolov7.html</id><summary type="html">&lt;p&gt;Discover how to create a simple notebook for real-time object detection using OpenCV and YOLO v7 in this beginner-friendly Python Quick Wins tutorial.&lt;/p&gt;</summary><content type="html">&lt;p&gt;&lt;em&gt;Please note that some of the text and code for this blog post was generated by ChatGPT, under my guidance. While ChatGPT was instrumental in the process, I exercised direction and judgment, wrote and adapted code, and carefully considered how to simplify the content for the user. The collaboration between ChatGPT and myself has hopefully produced a better outcome than either of us could have achieved in isolation.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;Welcome to the first article in the Python Quick Wins (PQW) blog series! The goal of this series is to deliver straightforward and digestible tutorials on loading and utilizing a wide range of Python libraries, with a focus on machine learning and computer vision. We aim to bridge the gap for beginners or those unfamiliar with specific libraries by offering clear, concise introductions without overwhelming the reader with complex examples.&lt;/p&gt;
&lt;p&gt;In this inaugural tutorial, we'll be focusing on creating a simple notebook that captures video from a webcam, identifies objects in each frame using YOLO v7, and adds labels to the live video. This tutorial serves as a gentle introduction to object detection with YOLO v7 and OpenCV, demonstrating how to integrate these powerful tools for real-time object detection in a live video stream.&lt;/p&gt;
&lt;p&gt;To support this tutorial and the entire PQW series, we've created an accompanying &lt;a href="https://github.com/edparcell/python-quick-wins"&gt;GitHub repository&lt;/a&gt; containing working Jupyter Notebooks for each tutorial. For this YOLO v7 blog post, you can find the corresponding &lt;a href="https://github.com/edparcell/python-quick-wins/tree/main/yolov7"&gt;repository directory&lt;/a&gt; and the &lt;a href="https://github.com/edparcell/python-quick-wins/blob/main/yolov7/YOLO%20v7%20Hello%20World.ipynb"&gt;Jupyter Notebook&lt;/a&gt; containing all the code from this blog post.&lt;/p&gt;
&lt;p&gt;The code in this blog post is derived from the YOLO v7 &lt;code&gt;detect.py&lt;/code&gt; example, which is released under the GPL-3 license. As a result, all code snippets in this blog post and the accompanying notebook are also covered by the GPL-3 license. Now, let's dive in and explore the world of Python Quick Wins!&lt;/p&gt;
&lt;h2&gt;1. Setting Up the Environment&lt;/h2&gt;
&lt;p&gt;Before diving into the implementation, we need to set up a suitable environment for the project. To do so, follow these steps:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Clone the YOLO v7 repository&lt;/strong&gt;: Clone the YOLO v7 repository from &lt;a href="https://github.com/WongKinYiu/yolov7"&gt;https://github.com/WongKinYiu/yolov7&lt;/a&gt; to your local machine.
&lt;code&gt;git clone https://github.com/WongKinYiu/yolov7.git&lt;/code&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Download the &lt;code&gt;yolov7.pt&lt;/code&gt; file&lt;/strong&gt;: Download the &lt;code&gt;yolov7.pt&lt;/code&gt; file from &lt;a href="https://github.com/WongKinYiu/yolov7/releases"&gt;https://github.com/WongKinYiu/yolov7/releases&lt;/a&gt; and place it in the base directory of the cloned YOLO v7 repository.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Create a new environment using the &lt;code&gt;environment.yml&lt;/code&gt; file&lt;/strong&gt;: Use the &lt;code&gt;environment.yml&lt;/code&gt; file from &lt;a href="https://github.com/edparcell/python-quick-wins/blob/main/yolov7/environment.yml"&gt;https://github.com/edparcell/python-quick-wins/blob/main/yolov7/environment.yml&lt;/a&gt; to create a new environment. This is my personal environment, and contains more dependencies than are needed for this project.
&lt;code&gt;conda env create -f environment.yml&lt;/code&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Activate the new environment&lt;/strong&gt;: Activate the newly created environment using the following command:
&lt;code&gt;conda activate pqw-20230430&lt;/code&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
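&lt;p&gt;Before running the detection scripts, a quick smoke test can confirm that the key libraries import and that PyTorch can see a GPU. This is just a sanity check, assuming the environment from the previous steps is active:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;code&gt;python -c "import torch; print('CUDA available:', torch.cuda.is_available())"
python -c "import cv2; print('OpenCV version:', cv2.__version__)"
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;p&gt;If the first command prints &lt;code&gt;False&lt;/code&gt;, YOLO v7 will still run, just on the CPU and more slowly.&lt;/p&gt;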
&lt;h3&gt;Testing Your Environment&lt;/h3&gt;
&lt;p&gt;To ensure your environment is set up correctly, perform the following tests:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Run YOLO v7 on sample images&lt;/strong&gt;: Run the &lt;code&gt;detect.py&lt;/code&gt; script in the base directory of the YOLO v7 repository. This will run YOLO v7 on sample images.
&lt;code&gt;python detect.py&lt;/code&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Test YOLO v7 with your webcam&lt;/strong&gt;: Run the &lt;code&gt;detect.py&lt;/code&gt; script with the &lt;code&gt;--source&lt;/code&gt; flag set to &lt;code&gt;0&lt;/code&gt;. This will display a video capture window with labeled regions from YOLO v7. If you have multiple webcams or virtual webcam devices, you may need to experiment with other numbers for the source.
&lt;code&gt;python detect.py --source 0&lt;/code&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;If the above tests work as expected, you're ready to proceed with the tutorial. In the next section, we'll show you how to create a simple OpenCV capture and display loop.&lt;/p&gt;
&lt;h2&gt;2. Capturing Video with OpenCV&lt;/h2&gt;
&lt;p&gt;&lt;a href="https://opencv.org/"&gt;OpenCV&lt;/a&gt; (Open Source Computer Vision Library) is an open-source computer vision and machine learning software library. It provides a wide range of tools for image and video processing, including capturing video from cameras, reading and writing video files, and displaying images.&lt;/p&gt;
&lt;p&gt;In this section, we'll guide you through creating a simple OpenCV capture and display loop to obtain video from your webcam.&lt;/p&gt;
&lt;h3&gt;Creating a Simple OpenCV Capture and Display Loop&lt;/h3&gt;
&lt;p&gt;Below is sample code to create a basic OpenCV capture and display loop:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;cv2&lt;/span&gt;

&lt;span class="c1"&gt;# Open the default camera for capturing video.&lt;/span&gt;
&lt;span class="n"&gt;cap&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;VideoCapture&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Loop until the camera is closed.&lt;/span&gt;
&lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="n"&gt;cap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;isOpened&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="c1"&gt;# Read a frame from the camera and ensure successfully read&lt;/span&gt;
        &lt;span class="n"&gt;ret&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;img&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;read&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="k"&gt;assert&lt;/span&gt; &lt;span class="n"&gt;ret&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;&amp;quot;Failed to read&amp;quot;&lt;/span&gt;

        &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;imshow&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;YOLO v7 Demo&amp;quot;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Exit the loop if the user presses the &amp;#39;q&amp;#39; key.&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;waitKey&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt; &lt;span class="mh"&gt;0xFF&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="nb"&gt;ord&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;q&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="k"&gt;break&lt;/span&gt;
&lt;span class="k"&gt;finally&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="c1"&gt;# Release the camera and close all windows.&lt;/span&gt;
    &lt;span class="n"&gt;cap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;release&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;destroyAllWindows&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;In this example, we follow a common template for approaching computer vision tasks using OpenCV. We begin by opening a video capture device, then run a loop to capture a frame, process it, and display the result. During the loop, we also check if the user wants to quit the application. Finally, when the loop ends, we clean up resources by releasing the capture device and closing any open windows. This template can be easily adapted for various computer vision tasks by modifying the processing step.&lt;/p&gt;
&lt;p&gt;Now that we can capture and display video using OpenCV, let's move on to integrating YOLO v7 for object detection.&lt;/p&gt;
&lt;h2&gt;3. Importing YOLO v7 and Loading Models&lt;/h2&gt;
&lt;p&gt;YOLO v7 (You Only Look Once version 7) is a state-of-the-art object detection model that can quickly and accurately detect objects in images. In this section, we'll explain how to import the YOLO v7 code and load a pre-trained model.&lt;/p&gt;
&lt;h3&gt;Adding the YOLO v7 Repo to the Python Path&lt;/h3&gt;
&lt;p&gt;First, we need to add the YOLO v7 repository to the Python path to make it accessible for import. Replace &lt;code&gt;[Path to yolo v7 repo]&lt;/code&gt; with the actual path to the cloned YOLO v7 repository:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;pathlib&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;sys&lt;/span&gt;

&lt;span class="n"&gt;pth_yolov7&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pathlib&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Path&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;r&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;[Path to yolo v7 repo]&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;init_file&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pth_yolov7&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="s2"&gt;&amp;quot;__init__.py&amp;quot;&lt;/span&gt;
&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;init_file&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;exists&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="n"&gt;init_file&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;touch&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pth_yolov7&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;sys&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;path&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;sys&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;path&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pth_yolov7&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Creating an empty &lt;code&gt;__init__.py&lt;/code&gt; file marks the repository as a Python package so we can import its modules. It also ensures that when we load the model checkpoint, PyTorch can unpickle objects using the classes defined in the YOLO v7 repo.&lt;/p&gt;
&lt;h3&gt;Loading the YOLO v7 Model&lt;/h3&gt;
&lt;p&gt;Next, we'll create a CUDA device object and load the YOLO v7 model onto it:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;cv2&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;torch&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;torch.nn&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nn"&gt;nn&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;torch.backends.cudnn&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nn"&gt;cudnn&lt;/span&gt;

&lt;span class="n"&gt;cudnn&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;benchmark&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;True&lt;/span&gt;

&lt;span class="n"&gt;device&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;device&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;cuda:0&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;model_path&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pth_yolov7&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="s1"&gt;&amp;#39;yolov7.pt&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;ckpt&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;load&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;map_location&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;device&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ckpt&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;model&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;float&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;fuse&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;eval&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;m&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;modules&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="nb"&gt;type&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;m&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Hardswish&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LeakyReLU&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ReLU&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ReLU6&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SiLU&lt;/span&gt;&lt;span class="p"&gt;]:&lt;/span&gt;
        &lt;span class="n"&gt;m&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;inplace&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;True&lt;/span&gt;
    &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="nb"&gt;type&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;m&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="ow"&gt;is&lt;/span&gt; &lt;span class="n"&gt;nn&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Upsample&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;m&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;recompute_scale_factor&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;None&lt;/span&gt;
&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;half&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;We're using a CUDA device for faster processing, but you can also use a CPU device, albeit with slower performance. Because we're running on a CUDA device, we convert the model to half-precision with &lt;code&gt;model.half()&lt;/code&gt;, which reduces memory use and speeds up inference. The loop over the modules, which adjusts attributes on some of them, is needed so that objects saved by older versions of the YOLO v7 code remain valid in more recent versions of PyTorch. Setting &lt;code&gt;cudnn.benchmark&lt;/code&gt; to &lt;code&gt;True&lt;/code&gt; lets cuDNN pick the fastest convolution algorithms for our fixed input size, which enhances performance.&lt;/p&gt;
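&lt;p&gt;If you want the script to also run on machines without a GPU, you can select the device at startup. This is a minimal sketch, not part of the tutorial code, assuming you then skip the &lt;code&gt;model.half()&lt;/code&gt; call when falling back to the CPU:&lt;/p&gt;

```python
import torch

# Prefer the first CUDA device, fall back to CPU otherwise.
device = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu')

# Half precision is only worthwhile (and reliably supported) on the GPU,
# so remember whether to convert the model later.
use_half = device.type == 'cuda'
```

&lt;p&gt;With this in place, guard the conversion with &lt;code&gt;if use_half: model.half()&lt;/code&gt; and move tensors with &lt;code&gt;.to(device)&lt;/code&gt; as before.&lt;/p&gt;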
&lt;p&gt;We also extract the stride of the model, which is used to ensure our input images are appropriately sized as a multiple of the stride:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="n"&gt;stride&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;stride&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;max&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Now that we've created a CUDA device and loaded the YOLO v7 model into our project, we'll move on to processing the image data for the model and displaying the results.&lt;/p&gt;
&lt;h2&gt;4. Preparing the Image for Object Detection&lt;/h2&gt;
&lt;p&gt;Before feeding an image into the YOLO v7 model for object detection, it's crucial to resize and crop the image to ensure optimal performance and accurate results. Resizing the image to a smaller size reduces computational complexity, while cropping ensures each dimension is a multiple of the model's stride.&lt;/p&gt;
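&lt;p&gt;To make the stride requirement concrete, here is a small worked example (the frame size is illustrative; stride 32 is the value used by the standard YOLO v7 model):&lt;/p&gt;

```python
# Worked example of the stride arithmetic for a hypothetical 1280x720 frame,
# resized to a width of 640 with a model stride of 32.
w, h = 1280, 720
new_width, stride = 640, 32

r = new_width / w                         # scaling ratio: 0.5
scaled_height = int(r * h)                # 360, not a multiple of 32
trim_rows = scaled_height % stride        # 8 rows must be removed
final_height = scaled_height - trim_rows  # 352 = 11 * 32

print(final_height)  # 352
```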
&lt;p&gt;The following function, &lt;code&gt;letterbox&lt;/code&gt;, resizes the input image while maintaining its aspect ratio, and then trims it to ensure its height is a multiple of the model's stride:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;letterbox&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;im&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;new_width&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;stride&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="sd"&gt;&amp;quot;&amp;quot;&amp;quot;Resizes image to new width while maintaining aspect ratio, and trims to ensure height is a multiple of stride.&amp;quot;&amp;quot;&amp;quot;&lt;/span&gt;
    &lt;span class="n"&gt;new_width&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;new_width&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;h&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;w&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;im&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;shape&lt;/span&gt;&lt;span class="p"&gt;[:&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="n"&gt;r&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new_width&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="n"&gt;w&lt;/span&gt;
    &lt;span class="n"&gt;scaled_height&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;r&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;h&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;im&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;resize&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;im&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;new_width&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;scaled_height&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;interpolation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;INTER_LINEAR&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;trim_rows&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;scaled_height&lt;/span&gt; &lt;span class="o"&gt;%&lt;/span&gt; &lt;span class="n"&gt;stride&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;trim_rows&lt;/span&gt; &lt;span class="o"&gt;!=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;final_height&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;scaled_height&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;trim_rows&lt;/span&gt;
        &lt;span class="n"&gt;offset&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;trim_rows&lt;/span&gt; &lt;span class="o"&gt;//&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;
        &lt;span class="n"&gt;im&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;im&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;offset&lt;/span&gt;&lt;span class="p"&gt;:(&lt;/span&gt;&lt;span class="n"&gt;offset&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;final_height&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;im&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;This function first computes the scaling ratio from the requested width and resizes the image accordingly. It then trims the image so its height is a multiple of the stride, removing a balanced number of rows from the top and bottom if necessary.&lt;/p&gt;
&lt;p&gt;With the &lt;code&gt;letterbox&lt;/code&gt; function ready, we can now preprocess our images before passing them to the YOLO v7 model for object detection, by adding the following line to our OpenCV capture loop:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="n"&gt;img&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;letterbox&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;im0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;640&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;stride&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;h2&gt;5. Processing Image Data and Model Outputs&lt;/h2&gt;
&lt;p&gt;In this section, we will explain how to send the preprocessed image data to the YOLO v7 model for object detection and how to interpret and process the model outputs for display (e.g., object bounding boxes, labels, confidence scores).&lt;/p&gt;
&lt;h3&gt;Running the Model on Preprocessed Image Data&lt;/h3&gt;
&lt;p&gt;After preprocessing the image, we can use the following function &lt;code&gt;run_model&lt;/code&gt; to run the YOLO v7 model on the input image tensor:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;run_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;device&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="sd"&gt;&amp;quot;&amp;quot;&amp;quot;Runs a PyTorch model on the input image tensor after preprocessing it.&amp;quot;&amp;quot;&amp;quot;&lt;/span&gt;
    &lt;span class="n"&gt;img&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;expand_dims&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;img&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;[:,&lt;/span&gt; &lt;span class="p"&gt;:,&lt;/span&gt; &lt;span class="p"&gt;:,&lt;/span&gt; &lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;transpose&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;img&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ascontiguousarray&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;img&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;from_numpy&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;to&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;device&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;half&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;img&lt;/span&gt; &lt;span class="o"&gt;/=&lt;/span&gt; &lt;span class="mf"&gt;255.0&lt;/span&gt;

    &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;no_grad&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;)[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;This function handles additional preprocessing and loads the image data onto the CUDA device. We use &lt;code&gt;torch.no_grad()&lt;/code&gt; to avoid building the autograd graph during evaluation; gradients are only needed for training, and retaining the graph for every frame would steadily consume GPU memory.&lt;/p&gt;
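&lt;p&gt;A quick sketch (not part of the tutorial code) showing the effect of &lt;code&gt;torch.no_grad()&lt;/code&gt;: tensors computed inside the block carry no autograd history, so nothing accumulates between frames:&lt;/p&gt;

```python
import torch

x = torch.ones(3, requires_grad=True)

y = x * 2               # normal mode: y is attached to the autograd graph
with torch.no_grad():
    z = x * 2           # inside no_grad: z has no gradient history

print(y.requires_grad, z.requires_grad)  # True False
```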
&lt;p&gt;Add the following line to the OpenCV capture loop to run the model on the image:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="n"&gt;pred&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;run_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;device&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;h3&gt;Interpreting and Processing Model Outputs&lt;/h3&gt;
&lt;p&gt;Now, we'll apply Non-Maximum Suppression (NMS) to the model output to remove overlapping boxes and filter out low-confidence detections:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;utils.general&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;non_max_suppression&lt;/span&gt;

&lt;span class="n"&gt;pred&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;non_max_suppression&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pred&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Add this line to the OpenCV capture loop as well.&lt;/p&gt;
&lt;p&gt;The &lt;code&gt;non_max_suppression&lt;/code&gt; function from the YOLO v7 repo filters the predicted boxes based on their confidence scores and suppresses overlapping boxes with a high Intersection over Union (IoU). It returns a list of detections, with one (n, 6) tensor per image, where n is the number of remaining detections for that image. The 6 columns are the bounding box coordinates in (xmin, ymin, xmax, ymax) format, the detection confidence, and the predicted class index.&lt;/p&gt;
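&lt;p&gt;As an illustration (with made-up numbers, not real model output), a single-image result in that (n, 6) layout can be unpacked like this:&lt;/p&gt;

```python
import torch

# Hypothetical detection tensor for one image: two detections, each row is
# (xmin, ymin, xmax, ymax, confidence, class index).
det = torch.tensor([[ 48.0,  60.0, 320.0, 410.0, 0.91,  0.0],
                    [100.0, 200.0, 260.0, 330.0, 0.57, 16.0]])

for *xyxy, conf, cls in det:
    print(int(cls.item()),
          [int(v.item()) for v in xyxy],
          round(conf.item(), 2))
```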
&lt;p&gt;With the model outputs processed, we can now display the object detection results on the live video stream.&lt;/p&gt;
&lt;h2&gt;6. Adding Labels to the Frame Image&lt;/h2&gt;
&lt;p&gt;In this section, we'll show you how to add labels to the frame image for each detected object. To do this, we'll use two functions: &lt;code&gt;plot_one_box&lt;/code&gt; and &lt;code&gt;plot_boxes&lt;/code&gt;.&lt;/p&gt;
&lt;h3&gt;Drawing Bounding Boxes and Labels&lt;/h3&gt;
&lt;p&gt;&lt;code&gt;plot_one_box&lt;/code&gt; is a function that takes the bounding box coordinates, the image, a label, and a color. It draws a rectangle on the image in the given color and writes the label above the box.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;plot_one_box&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;label&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;color&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="sd"&gt;&amp;quot;&amp;quot;&amp;quot;Draws a rectangle on the input image, adds a label with the given color, and writes it on the image.&amp;quot;&amp;quot;&amp;quot;&lt;/span&gt;
    &lt;span class="n"&gt;c1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;c2&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]),&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;])),&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;]),&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;]))&lt;/span&gt;
    &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;rectangle&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;c1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;c2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;color&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;thickness&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;lineType&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LINE_AA&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;t_size&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;getTextSize&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;label&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;fontScale&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;thickness&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="n"&gt;c2&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;c1&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;t_size&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;c1&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;t_size&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;
    &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;rectangle&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;c1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;c2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;color&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LINE_AA&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;putText&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;label&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;c1&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;c1&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;225&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;thickness&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;lineType&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;LINE_AA&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;&lt;code&gt;plot_boxes&lt;/code&gt; is a function that takes an image, a list of detection predictions, a list of class names, and a list of colors. It draws rectangles and writes labels on the input image for each detection prediction.&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;plot_boxes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;pred&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;names&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;colors&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="sd"&gt;&amp;quot;&amp;quot;&amp;quot;Draws rectangles and writes labels on an input image for each detection prediction from a list.&amp;quot;&amp;quot;&amp;quot;&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;det&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;pred&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="n"&gt;xyxy&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;conf&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="bp"&gt;cls&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nb"&gt;reversed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;det&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="n"&gt;label&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;names&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;cls&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s1"&gt; &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;conf&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="s1"&gt;.2f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;&lt;/span&gt;
            &lt;span class="n"&gt;plot_one_box&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;xyxy&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;label&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;colors&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;cls&lt;/span&gt;&lt;span class="p"&gt;)])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;h3&gt;Integrating the Functions into the OpenCV Loop&lt;/h3&gt;
&lt;p&gt;Add the following line to the OpenCV capture loop to draw the bounding boxes and labels on the image:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="n"&gt;plot_boxes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;pred&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;names&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;colors&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;To get the class names from the model and generate random colors for the bounding boxes, use the following code at initialization, outside the OpenCV loop:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="n"&gt;names&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;names&lt;/span&gt;
&lt;span class="n"&gt;colors&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[[&lt;/span&gt;&lt;span class="n"&gt;random&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;randint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nb"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;names&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
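&lt;p&gt;As a standalone illustration of the comprehension above (using hypothetical class names and a fixed seed so the result is reproducible), it produces one random RGB triple per class:&lt;/p&gt;

```python
import random

random.seed(0)  # fixed seed so this example is reproducible
names = ["person", "bicycle", "car"]  # hypothetical class names for illustration
colors = [[random.randint(0, 255) for _ in range(3)] for _ in names]

# One color per class, each channel an integer in [0, 255]
assert len(colors) == len(names)
assert all(len(c) == 3 and all(0 <= v <= 255 for v in c) for c in colors)
```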

&lt;p&gt;With these additions to the capture loop, the object detection results, including bounding boxes and labels, will be displayed on the live video stream.&lt;/p&gt;
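&lt;p&gt;Under the hood, a drawing helper like &lt;code&gt;plot_boxes&lt;/code&gt; has to turn each detection into integer pixel corners before calling OpenCV's rectangle routine. The following is a simplified sketch of that conversion (not the exact helper used above) for a center-format &lt;code&gt;(x, y, w, h)&lt;/code&gt; box:&lt;/p&gt;

```python
def xywh_to_xyxy(x, y, w, h):
    """Convert a center-format box (x, y, w, h) to integer corner
    coordinates (x1, y1, x2, y2) suitable for cv2.rectangle."""
    x1 = int(x - w / 2)
    y1 = int(y - h / 2)
    x2 = int(x + w / 2)
    y2 = int(y + h / 2)
    return x1, y1, x2, y2

# A 20x10 box centered at (50, 50)
assert xywh_to_xyxy(50, 50, 20, 10) == (40, 45, 60, 55)
```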
&lt;p&gt;In this tutorial, we have shown you how to integrate YOLO v7 with OpenCV for object detection in a live video stream. The final edited code for initialization and the OpenCV loop with object detection is provided below:&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;code&gt;&lt;span class="c1"&gt;# Set the input image size and enable benchmark mode for CuDNN to speed up inference.&lt;/span&gt;
&lt;span class="n"&gt;imgsz&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;640&lt;/span&gt;
&lt;span class="n"&gt;cudnn&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;benchmark&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;True&lt;/span&gt;

&lt;span class="c1"&gt;# Get the class names for the model and generate random colors for drawing boxes on the image.&lt;/span&gt;
&lt;span class="n"&gt;names&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;names&lt;/span&gt;
&lt;span class="n"&gt;colors&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[[&lt;/span&gt;&lt;span class="n"&gt;random&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;randint&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nb"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;names&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;

&lt;span class="c1"&gt;# Open the default camera for capturing video.&lt;/span&gt;
&lt;span class="n"&gt;cap&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;VideoCapture&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Loop until the camera is closed.&lt;/span&gt;
&lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="n"&gt;cap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;isOpened&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="c1"&gt;# Read a frame from the camera and ensure successfully read&lt;/span&gt;
        &lt;span class="n"&gt;ret&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;im0&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;read&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="k"&gt;assert&lt;/span&gt; &lt;span class="n"&gt;ret&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;&amp;quot;Failed to read&amp;quot;&lt;/span&gt;

        &lt;span class="c1"&gt;# Resize and pad the image to the specified size while maintaining the aspect ratio.&lt;/span&gt;
        &lt;span class="n"&gt;img&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;letterbox&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;im0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;imgsz&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;stride&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Run the model on the preprocessed image.&lt;/span&gt;
        &lt;span class="n"&gt;pred&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;run_model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;device&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Perform non-maximum suppression to remove overlapping boxes.&lt;/span&gt;
        &lt;span class="n"&gt;pred&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;non_max_suppression&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pred&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Draw the boxes on the image and display it.&lt;/span&gt;
        &lt;span class="n"&gt;plot_boxes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;pred&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;names&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;colors&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;imshow&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;YOLO v7 Demo&amp;quot;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;img&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Exit the loop if the user presses the &amp;#39;q&amp;#39; key.&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;waitKey&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt; &lt;span class="mh"&gt;0xFF&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="nb"&gt;ord&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;q&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="k"&gt;break&lt;/span&gt;
&lt;span class="k"&gt;finally&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="c1"&gt;# Release the camera and close all windows.&lt;/span&gt;
    &lt;span class="n"&gt;cap&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;release&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;destroyAllWindows&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
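&lt;p&gt;One detail worth noting: &lt;code&gt;cv2.waitKey&lt;/code&gt; returns an integer whose low byte holds the key code, and on some platforms the higher bits may be set, so masking with &lt;code&gt;0xFF&lt;/code&gt; before comparing against &lt;code&gt;ord('q')&lt;/code&gt; keeps the check portable. A small sketch of the masking, using a hypothetical raw return value:&lt;/p&gt;

```python
raw = 0x10071  # hypothetical raw waitKey return value with a high bit set
key = raw & 0xFF  # keep only the low byte, which holds the key code
assert key == ord('q')
```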

&lt;h2&gt;Conclusion&lt;/h2&gt;
&lt;p&gt;In this tutorial, we've covered the following steps:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Installing necessary dependencies&lt;/li&gt;
&lt;li&gt;Capturing video with OpenCV&lt;/li&gt;
&lt;li&gt;Integrating YOLO v7 for object detection&lt;/li&gt;
&lt;li&gt;Preparing the image for object detection&lt;/li&gt;
&lt;li&gt;Processing image data and model outputs&lt;/li&gt;
&lt;li&gt;Adding labels to the frame image&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;The full notebook with all the code can be found at &lt;a href="https://github.com/edparcell/python-quick-wins/blob/main/yolov7/YOLO%20v7%20Hello%20World.ipynb"&gt;this link&lt;/a&gt;. We encourage you to experiment with the notebook and explore further applications of OpenCV and YOLO v7 for object detection in various domains. By doing so, you'll gain a deeper understanding of how to adapt and expand these techniques to fit your specific use cases.&lt;/p&gt;</content><category term="Python-Quick-Wins"/></entry><entry><title>Best Things I've Read</title><link href="https://www.edparcell.com/best-things-ive-read.html" rel="alternate"/><published>2022-09-06T17:00:00-06:00</published><updated>2022-09-06T17:00:00-06:00</updated><author><name>Ed Parcell</name></author><id>tag:www.edparcell.com,2022-09-06:/best-things-ive-read.html</id><summary type="html">&lt;p&gt;List of the best books, articles and blogs I've read&lt;/p&gt;</summary><content type="html">&lt;h1&gt;Best Things I've Read&lt;/h1&gt;
&lt;p&gt;This is a list of the best things I've ever read on various topics. The criteria for inclusion in this list are simple:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Simple&lt;/strong&gt;. It has to be approachable without a deep background in the subject material. Most of us don't have time to become experts &lt;em&gt;before&lt;/em&gt; we read a great work.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Engaging&lt;/strong&gt;. There is plenty of worthy material in the world that I will never be able to plow my way through. I won't include anything on my list that I haven't read completely.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Provokes Change&lt;/strong&gt;. The greatest things I have read irrevocably change the way I think and the way I see the world. Often they provide me with new tools and approaches as I try to understand the world and to create new things.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;Sorry that I haven't provided detailed descriptions at this time. I hope to come back and flesh this out with what each book/article covers, why it is profound, and what you will get out of reading it. In the meantime, trust me I guess? These are all great.&lt;/p&gt;
&lt;h2&gt;Software Engineering&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;README from the original commit of git (&lt;a href="https://github.com/git/git/blob/e83c5163316f89bfbde7d9ab23ca2e25604af290/README"&gt;Text File&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Git Book, Chapter 10: Git Internals (&lt;a href="https://git-scm.com/book/en/v2/Git-Internals-Plumbing-and-Porcelain"&gt;Online Book&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;The Log: What every software engineer should know about real-time data's unifying abstraction (&lt;a href="https://engineering.linkedin.com/distributed-systems/log-what-every-software-engineer-should-know-about-real-time-datas-unifying"&gt;Article&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Machine Learning recommendation: Python Machine Learning by Raschka and Mirjalili (Buy from &lt;a href="https://www.packtpub.com/product/python-machine-learning-third-edition/9781789955750"&gt;Packt&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Machine Learning recommendation: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow by Aurélien Géron (Buy from &lt;a href="https://www.oreilly.com/library/view/hands-on-machine-learning/9781491962282/"&gt;O'Reilly&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Feynman's Appendix to the Rogers Commission Report on the Challenger Disaster (&lt;a href="https://www.refsmmat.com/files/reflections.pdf"&gt;PDF&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;The Mythical Man-Month: Essays on Software Engineering by Fred Brooks (&lt;a href="https://en.wikipedia.org/wiki/The_Mythical_Man-Month"&gt;Wikipedia Page&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;I would love to be able to include Structure and Interpretation of Computer Programs by Abelson and Sussman on this list, but unfortunately I've never read it. I did watch their excellent MIT lecture series covering the same material (&lt;a href="https://www.youtube.com/watch?v=2Op3QLzMgSY"&gt;YouTube&lt;/a&gt;).&lt;/li&gt;
&lt;li&gt;&lt;a href="http://numerical.recipes/"&gt;Numerical Recipes in C++&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Economics&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;The End of the World is Just the Beginning by Peter Zeihan (&lt;a href="https://zeihan.com/end-of-the-world/"&gt;Author's Homepage&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Principles for Dealing with the Changing World Order: Why Nations Succeed and Fail by Ray Dalio (&lt;a href="https://www.principles.com/#get-the-books"&gt;Author's Homepage&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Goliath by Matt Stoller (&lt;a href="https://www.simonandschuster.com/books/Goliath/Matt-Stoller/9781501182891"&gt;Publisher's Page&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;The Fourth Turning by William Strauss and Neil Howe&lt;/li&gt;
&lt;li&gt;Red Plenty by Francis Spufford&lt;/li&gt;
&lt;li&gt;Factfulness by Hans Rosling (&lt;a href="https://www.gapminder.org/factfulness-book/"&gt;Author's Foundation's Page&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;How to Live&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;A Guide to the Good Life: The Ancient Art of Stoic Joy by William B. Irvine (&lt;a href="https://www.barnesandnoble.com/w/a-guide-to-the-good-life-william-b-irvine/1112547265"&gt;Barnes &amp;amp; Noble&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Deep Work by Cal Newport (&lt;a href="https://www.calnewport.com/books/deep-work/"&gt;Author's Page&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Biographies&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;A Man for All Markets - Autobiography by Edward O. Thorp (&lt;a href="http://www.edwardothorp.com/books/a-man-for-all-markets/"&gt;Author's Page&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Math&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;The Pleasures of Counting by T.W. Körner (&lt;a href="https://www.cambridge.org/us/academic/subjects/mathematics/recreational-mathematics/pleasures-counting?format=PB"&gt;Cambridge University Press&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Algebra Volume 1 by P.M. Cohn (&lt;a href="https://www.maths.ed.ac.uk/~v1ranick/papers/cohnalg1.pdf"&gt;PDF&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Elementary Probability by Stirzaker (&lt;a href="https://www.cambridge.org/core/books/elementary-probability/56C52DDC8C3F59615331783E66DB2AC5"&gt;Cambridge University Press&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Stochastic Differential Equations by Øksendal (&lt;a href="https://link.springer.com/book/10.1007/978-3-642-14394-6"&gt;Springer&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Gaussian Processes for Machine Learning by Rasmussen and Williams (&lt;a href="https://gaussianprocess.org/gpml/"&gt;Book's Website&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Business&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Competitive Strategy: Techniques for Analyzing Industries and Competitors by Michael E. Porter (&lt;a href="https://www.hbs.edu/faculty/Pages/item.aspx?num=195"&gt;HBS&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Miscellaneous&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;On Bull*** by Harry G. Frankfurt (&lt;a href="https://press.princeton.edu/books/hardcover/9780691122946/on-bullshit"&gt;Princeton University Press&lt;/a&gt;) (&lt;a href="http://www2.csudh.edu/ccauthen/576f12/frankfurt__harry_-_on_bullshit.pdf"&gt;PDF&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;</content><category term="Recommendations"/></entry><entry><title>Recommended Monitor Setup</title><link href="https://www.edparcell.com/best-monitor-2022.html" rel="alternate"/><published>2022-09-02T17:00:00-06:00</published><updated>2022-09-02T17:00:00-06:00</updated><author><name>Ed Parcell</name></author><id>tag:www.edparcell.com,2022-09-02:/best-monitor-2022.html</id><summary type="html">&lt;p&gt;Recommendations for monitor setup for work in 2022.&lt;/p&gt;</summary><content type="html">&lt;h1&gt;Recommendations&lt;/h1&gt;
&lt;p&gt;My ideal monitor setup is a central 43" 4K monitor, with two 24" 1440p wings in portrait mode.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Recommended 43" monitor for central monitor&lt;/strong&gt;: &lt;a href="https://www.lg.com/us/monitors/lg-43un700-b-4k-uhd-led-monitor"&gt;LG 43UN700-B&lt;/a&gt; ($599 at time of writing)&lt;ul&gt;
&lt;li&gt;Alternative 43" monitor for central monitor: &lt;a href="https://www.dell.com/en-us/shop/dell-ultrasharp-43-4k-usb-c-monitor-u4320q/apd/210-avke/monitors-monitor-accessories"&gt;Dell U4320Q&lt;/a&gt; ($975 at time of writing)&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Recommended 24" monitor for wing monitors&lt;/strong&gt;: &lt;a href="https://www.lg.com/us/monitors/lg-24qp500-b"&gt;LG 24QP500-B&lt;/a&gt; ($200 at time of writing)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Recommended window management software&lt;/strong&gt;: FancyZones, which is part of Microsoft PowerToys. &lt;a href="https://docs.microsoft.com/en-us/windows/powertoys/"&gt;Download here&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Recommended color temperature software&lt;/strong&gt;: &lt;a href="https://justgetflux.com/"&gt;F.lux&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;YMMV monitor arms: I use an &lt;a href="https://www.ergotron.com/en-us/products/product-details/45-475#?color=white"&gt;Ergotron HX&lt;/a&gt; arm for the central monitor and &lt;a href="https://www.ergotron.com/en-us/products/product-details/45-241#?color=white&amp;amp;attachment%20option=2-Piece%20Clamp&amp;amp;buynow=0"&gt;Ergotron LX&lt;/a&gt; arms for the wing monitors. ($329 and $189 respectively at time of writing)&lt;/li&gt;
&lt;/ul&gt;
&lt;h1&gt;What is your recommended monitor setup?&lt;/h1&gt;
&lt;p&gt;My ideal monitor setup is a central 43" 4K monitor, with two 24" 1440p wings in portrait mode.&lt;/p&gt;
&lt;p&gt;I use the &lt;a href="https://docs.microsoft.com/en-us/windows/powertoys/fancyzones"&gt;FancyZones&lt;/a&gt; tool in Microsoft PowerToys to manage windows, so I can set zones on each screen and maximize windows to zones. Microsoft PowerToys is free, and FancyZones is superior to the alternatives because it is simple: there are only two things to remember - shift-drag a window to maximize it to a zone, and Windows key+Backtick to open the zone editor.&lt;/p&gt;
&lt;p&gt;I typically have the central monitor divided into 4 in a 2x2 grid, or into 3 columns, with the central column having two 1920x1080-sized zones, and the side columns having 3 smaller zones each (see below). The wing monitors are almost always divided into 3 zones.&lt;/p&gt;
&lt;p&gt;&lt;img alt="Central monitor layouts" src="https://www.edparcell.com/images/FancyZones Layouts.png"&gt;&lt;/p&gt;
&lt;p&gt;I have my monitors mounted on Ergotron arms, which makes setup and adjustment easier, but re-aligning still takes a little time, so I only make adjustments infrequently. Ergotron arms are high quality but expensive - more so in the last few years. 43" monitors are heavy, so it might be worth looking at cheaper alternatives, provided they can support the weight safely. The wing monitors are almost the same height as the central monitor, and can be angled in slightly so everything points at me from about the same distance.&lt;/p&gt;
&lt;p&gt;I had to play with the brightness settings a bit to get the LG monitors to match - they are not perfect, but good enough that I don't notice the difference. I use &lt;a href="https://justgetflux.com/"&gt;f.lux&lt;/a&gt; to limit blue light at night, which is important to me with so much screen real estate. F.lux is free.&lt;/p&gt;
&lt;h1&gt;Why is it so good?&lt;/h1&gt;
&lt;p&gt;This is the largest amount of screen real estate I can reasonably use without having to move around. The number of zones (areas I can quickly and easily maximize a window to) has always been the most important metric for me. That is why 3 smaller screens have traditionally been more productive than one large screen. A giant screen with one window is less useful than being able to see a code window, an interactive notebook, a spreadsheet or two, some documentation, a git client... &lt;strong&gt;ALL AT THE SAME TIME&lt;/strong&gt;. So FancyZones makes large monitors incredibly usable, as they can be divided up into smaller "screens", and the layout can be changed on the fly for different tasks.&lt;/p&gt;
&lt;p&gt;The wing monitors are great for monitoring markets or system logs, or for throwing windows that you will need to access, but don't want to be distracted by - like multiple Windows Explorers if you are moving around files. Having a central monitor with two wings, rather than two large monitors, say, also means that there is no seam down the middle. Middle seams are incredibly distracting, and tend to drive you to use one monitor while the other sits largely idle. &lt;/p&gt;
&lt;p&gt;This setup is also good when you have a task that needs you to concentrate on a single window. Minimize everything else (Windows key + M), and then put the single window in the center of the central monitor, often expanded fully vertically.&lt;/p&gt;
&lt;h1&gt;Any drawbacks?&lt;/h1&gt;
&lt;p&gt;&lt;strong&gt;Videoconferencing compromised&lt;/strong&gt;. Having so much screen in front of you pushes your webcam out of the way. People will be looking down (or up) at you in Zoom meetings. This is ameliorated by setting a virtual background and putting your Zoom window close to where the camera is, rather than full screen - looking close to the camera, and without the background cues, it becomes less obvious that the camera is high. Alternatively, get a tripod you can quickly set up before calls (Recommended for desk webcams: &lt;a href="https://benrousa.com/bk10-mini-tripod-and-selfie-stick/"&gt;Benro BK10 selfie stick&lt;/a&gt;).&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Flat screens&lt;/strong&gt;. The screens are flat, not curved. Some people prefer curved monitors. I'm ambivalent. The ability to turn the wings in gives me what I need. I'm a little off-axis for the extreme corners of the central monitor, so there is some loss of color fidelity, but this is unimportant to me. Curved monitors seem to be all-or-nothing for glare control, whereas flat monitors pick up everything behind you somewhere on the screen. So if you're going to use this setup in front of windows, you will probably need decent blinds or shutters.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Pixel density&lt;/strong&gt;. These monitors are not the highest pixel density. If you read a lot and are sensitive to this, then you might prefer a smaller 4K monitor. I grew up in the 80s, with CRT monitors, so all modern monitors seem amazing to me.&lt;/p&gt;
&lt;h1&gt;How could it be improved?&lt;/h1&gt;
&lt;p&gt;If I ran LG, I would make a large monitor that was the same size as these three monitors together. It would be curved, and have Retina-level pixel density. I hope that the success of ultrawide monitors for gaming drives the development of large format ultrawide monitors, which would be very similar to what I have described.&lt;/p&gt;
&lt;h1&gt;Final thoughts&lt;/h1&gt;
&lt;p&gt;Since I first used a computer in 1984, I have always wanted more screen space so I could see more code, more information. I think we have now reached a point where this setup provides the maximum amount of practically usable real estate without having to move. I tried to use two 43" monitors, and the distance between the edges is about 80", which is too large for a sitting setup. So until we go to virtual reality, I think this is the setup for me. I hope this post is useful to people trying to optimize their working setups - I've experimented for several years to get to this point. Would be happy to hear any suggestions people have - this forum software doesn't have comments, but you can email me at &lt;a href="&amp;#109;&amp;#97;&amp;#105;&amp;#108;&amp;#116;&amp;#111;&amp;#58;&amp;#101;&amp;#100;&amp;#64;&amp;#101;&amp;#100;&amp;#112;&amp;#97;&amp;#114;&amp;#99;&amp;#101;&amp;#108;&amp;#108;&amp;#46;&amp;#99;&amp;#111;&amp;#109;"&gt;&amp;#101;&amp;#100;&amp;#64;&amp;#101;&amp;#100;&amp;#112;&amp;#97;&amp;#114;&amp;#99;&amp;#101;&amp;#108;&amp;#108;&amp;#46;&amp;#99;&amp;#111;&amp;#109;&lt;/a&gt;.&lt;/p&gt;</content><category term="Recommendations"/></entry><entry><title>Computer Recommendation: $600-$700 Touchscreen Laptop (Q4 2020)</title><link href="https://www.edparcell.com/computer-recommendation-600-700-touchscreen-laptop-q4-2020.html" rel="alternate"/><published>2020-11-06T14:18:00-07:00</published><updated>2020-11-06T14:18:00-07:00</updated><author><name>Ed Parcell</name></author><id>tag:www.edparcell.com,2020-11-06:/computer-recommendation-600-700-touchscreen-laptop-q4-2020.html</id><summary type="html">&lt;p&gt;Answer first: I recommend the &lt;a class="reference external" href="https://deals.dell.com/en-us/mpp/productdetail/5uky"&gt;Dell Inspiron 14 5000 2-in-1 Laptop for $636.99 on Early Black Friday deal&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;If you have a reputation of knowing about computers, sometimes people will ask you what they should get. This is tricky, because there are a raft of PC brands, and each …&lt;/p&gt;</summary><content type="html">&lt;p&gt;Answer first: I recommend the &lt;a class="reference external" href="https://deals.dell.com/en-us/mpp/productdetail/5uky"&gt;Dell Inspiron 14 5000 2-in-1 Laptop for $636.99 on Early Black Friday deal&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;If you have a reputation of knowing about computers, sometimes people will ask you what they should get. This is tricky, because there are a raft of PC brands, and each has a baffling proliferation of product lines. When I get these requests, and have time to research a decent recommendation, I'll try to write them up so more than one person can make use of them.&lt;/p&gt;
&lt;p&gt;In this case, a friend who is a teacher wanted a recommendation for a laptop that she could use for remote teaching. She is currently using a chromebook, but would like a touchscreen to make it easier to highlight things, and an external screen. It needs to be portable around the house, but won't be carried around all day every day, and of course it needs a webcam and mic.&lt;/p&gt;
&lt;p&gt;I looked through some reviews and the Dell, Lenovo, and Asus product line-ups, as those are brands I trust and have experience with. It's a little tricky at the moment because stock is thin. In a normal November I'd advise waiting for Black Friday, but it's not clear whether there will be availability, or how large the savings will be.&lt;/p&gt;
&lt;div class="line-block"&gt;
&lt;div class="line"&gt;My recommendation on specifications in order of importance:&lt;/div&gt;
&lt;div class="line"&gt;Definitely a 14&amp;quot; or 15&amp;quot; 1080p screen. Dell have some models with lower resolution screens. Not a good place to compromise.&lt;/div&gt;
&lt;div class="line"&gt;Strongly prefer 8GB RAM and 256GB SSD. Assuming decent build quality, I think these will determine the longevity of the machine.&lt;/div&gt;
&lt;div class="line"&gt;Recommend a Intel i3 or i5 processor (not Celeron or Pentium), or a Ryzen 5 processor. The revision number is less important, but the Ryzen 5 4500U is significantly more powerful than the 3500U. Anything that starts with a 9, 10 or 11 on the Intel side should be fine.&lt;/div&gt;
&lt;/div&gt;
&lt;p&gt;Weight will be 3-4lbs, which is fine to carry around the house, but maybe a little higher than you'd want to carry on your shoulder all day.&lt;/p&gt;
&lt;p&gt;On recommending specific models, ideally you could see them in person, but I don't think the precise models are in stock in Best Buy right now. As a result, it's probably best to stick to Dell or Lenovo, which are safer brands. I personally have an Asus, and like it fine, but that's because I got it last year so I saw it in person to make sure the screen, mouse pad, build quality were good - their model line-up is weird, so it's hard to recommend buying blind.&lt;/p&gt;
&lt;p&gt;I think most of the built in cameras are 720p. For what it's worth, my webcam recommendations are &lt;a class="reference external" href="https://www.bestbuy.com/site/razer-kiyo-webcam/6289641.p?skuId=6289641"&gt;Razer Kiyo&lt;/a&gt;, &lt;a class="reference external" href="https://www.bestbuy.com/site/logitech-c920s-hd-webcam/6321794.p?skuId=6321794"&gt;Logitech C920S&lt;/a&gt; or &lt;a class="reference external" href="https://www.bestbuy.com/site/avermedia-live-streamer-cam-313-webcam/6410286.p?skuId=6410286"&gt;Avermedia 313&lt;/a&gt;, and my microphone recommendation is the &lt;a class="reference external" href="https://www.amazon.com/Shure-MV5-Condenser-Microphone-Lightning/dp/B010W6W9EQ"&gt;Shure MV5&lt;/a&gt; ($80) or &lt;a class="reference external" href="https://www.amazon.com/AmazonBasics-Desktop-Mini-Condenser-Microphone/dp/B076ZSR6BB"&gt;AmazonBasics Desktop Mini&lt;/a&gt; ($45) if you can find it.&lt;/p&gt;
&lt;p&gt;The Lenovos have the numeric keypads, but I think they are a little overpriced. Amazon will sell you an &lt;a class="reference external" href="https://www.amazon.com/Numeric-Keypads/b?ie=UTF8&amp;amp;node=2998471011"&gt;external numeric keypad starting from $10&lt;/a&gt; if that helps. I'd spend the extra $7 on the &lt;a class="reference external" href="https://www.amazon.com/Mechanical-Numeric-Backlit-Desktop-Computer/dp/B07FFLNF5C/ref=sxin_9?_encoding=UTF8&amp;amp;ascsubtag=amzn1.osa.3654fdc5-e4a7-4eea-a328-b844bdbd5112.ATVPDKIKX0DER.en_US&amp;amp;c=ts&amp;amp;creativeASIN=B07FFLNF5C&amp;amp;cv_ct_cx=Numeric+Keypads&amp;amp;cv_ct_id=amzn1.osa.3654fdc5-e4a7-4eea-a328-b844bdbd5112.ATVPDKIKX0DER.en_US&amp;amp;cv_ct_pg=search&amp;amp;cv_ct_we=asin&amp;amp;cv_ct_wn=osp-single-source-gl-ranking&amp;amp;dchild=1&amp;amp;keywords=Numeric+Keypads&amp;amp;linkCode=oas&amp;amp;pd_rd_i=B07FFLNF5C&amp;amp;pd_rd_r=1b1d8602-8246-4945-bef6-2a84f6460a12&amp;amp;pd_rd_w=lqbUW&amp;amp;pd_rd_wg=psc6f&amp;amp;pf_rd_p=a731ca96-f731-4b16-b32d-9bb548e9542b&amp;amp;pf_rd_r=TQSN80JK6N6VVFMKNZY8&amp;amp;s=pc&amp;amp;sr=1-2-d9dc7690-f7e1-44eb-ad06-aebbef559a37&amp;amp;tag=livescienceoa-20&amp;amp;ts_id=2998471011"&gt;Rottay one with the mechanical switches&lt;/a&gt; because I am a nerd.&lt;/p&gt;
&lt;p&gt;To connect to an external display, ideally you want an HDMI port. I think only one of my options doesn't have it, but check. I think they should support 4k TVs, although maybe only at 30 frames/sec. I think that'll be fine if you're not playing twitch-reaction shooting games. I'll also mention that TVs aren't quite as good as computer monitors for displaying text, they say. I haven't noticed it myself, and I think 4k TVs are magical future technology. But just want to give you all the information I have. I don't think any of these laptops have mini-display port connectors, but if so, it's just a matter of buying a different connector (mini displayport to hdmi rather than hdmi to hdmi). One of the laptops doesn't have an HDMI port. You can get a USB dock to connect to a display, but it's an added cost (&lt;a class="reference external" href="https://www.amazon.com/Hiearcool-USB-Hub-11-Compatiable/dp/B07QNRM45T"&gt;Hiearcool USB C Hub&lt;/a&gt;, for example, is $80).&lt;/p&gt;
&lt;p&gt;Here are my recommendations:&lt;/p&gt;
&lt;ul class="simple"&gt;
&lt;li&gt;&lt;a class="reference external" href="https://deals.dell.com/en-us/mpp/productdetail/5uky"&gt;Dell Inspiron 14 5000 2-in-1 Laptop&lt;/a&gt; (i5-1135G7, 8GB RAM, 256GB SSD, 1920x1080 14&amp;quot; touchscreen - check resolution.. do not get 1366x768 version) - $636.99 on Early Black Friday deal&lt;/li&gt;
&lt;li&gt;&lt;a class="reference external" href="https://www.lenovo.com/us/en/laptops/yoga/yoga-2-in-1-series/Lenovo-Yoga-C740-15/p/88YGC701293"&gt;Lenovo Yoga C740&lt;/a&gt; (i5-10210U, 8GB RAM, 256GB SSD, 1920x1080 15.6&amp;quot; touchscreen, no HDMI port so would need a dock to connect to an external monitor?) - $699.99&lt;/li&gt;
&lt;li&gt;&lt;a class="reference external" href="https://www.lenovo.com/us/en/laptops/ideapad/ideapad-500-series/ideapad-5-15are05/p/81YQ0007US"&gt;Lenovo IdeaPad 5&lt;/a&gt; (Ryzen 5 4500U, 8GB RAM, 512GB SSD, 1920x1080 15.6&amp;quot; touchscreen, BUT NO TABLET-STYLE CONFIGURATION) - $689.99&lt;/li&gt;
&lt;li&gt;Asus Flip 14 TP412FA or Asus Flip 14 TM420IA - not in stock anywhere&lt;/li&gt;
&lt;li&gt;&lt;a class="reference external" href="https://www.microcenter.com/product/628544/asus-q406da-br5t6-14-2-in-1-laptop-computer---silver"&gt;Asus Q406DA&lt;/a&gt; (Ryzen 5 3500U (slower), 8GB RAM, 256GB SSD, 1920x1080 14&amp;quot; touchscreen, DO NOT RECOMMEND WITHOUT SEEING) - $549.99&lt;/li&gt;
&lt;li&gt;Without the touch screen you might look at &lt;a class="reference external" href="https://deals.dell.com/en-us/mpp/productdetail/5uks"&gt;Dell Inspiron 15 5000&lt;/a&gt; (Ryzen 5 4500U, 8GB RAM, 256GB SSD, 1920x1080 15&amp;quot; monitor NO TOUCH SCREEN) - $519.39&lt;/li&gt;
&lt;li&gt;Or you could look at this &lt;a class="reference external" href="https://www.dell.com/en-us/member/shop/2-in-1-laptops/new-inspiron-14-5000-2-in-1-laptop-dune/spd/inspiron-14-5406-2-in-1-laptop/n25406ejuch"&gt;Dell Inspiron 14 5000 2-in-1 Laptop&lt;/a&gt; (i3-1115G4, 4GB RAM, 128GB SSD, 1366x768 14&amp;quot; touchscreen), which compromises on screen resolution, memory and hard drive, but the price is pretty good - $421.39&lt;/li&gt;
&lt;li&gt;I don't rate Acer that highly, but some reviews had good things to say about the Acer Aspire 5. I don't think they do a 1080p touchscreen though.&lt;/li&gt;
&lt;/ul&gt;
</content><category term="Recommendations"/><category term="computer-recommendation"/></entry><entry><title>Sim Racing Rig #3: High-End Rig</title><link href="https://www.edparcell.com/sim-racing-rig-3-high-end-rig.html" rel="alternate"/><published>2020-10-07T02:30:00-06:00</published><updated>2020-10-07T02:30:00-06:00</updated><author><name>Ed Parcell</name></author><id>tag:www.edparcell.com,2020-10-07:/sim-racing-rig-3-high-end-rig.html</id><summary type="html">&lt;p&gt;In previous posts in this series, I have described a set of components that you might want to use as templates for a &lt;a class="reference external" href="https://www.edparcell.com/sim-racing-rig-1-basic-rig.html"&gt;basic rig that you might use to get started sim racing&lt;/a&gt; and a more expensive &lt;a class="reference external" href="https://www.edparcell.com/sim-racing-rig-2-an-intermediate-rig.html"&gt;intermediate rig with higher quality components&lt;/a&gt;. In this final post, I'll describe …&lt;/p&gt;</summary><content type="html">&lt;p&gt;In previous posts in this series, I have described a set of components that you might want to use as templates for a &lt;a class="reference external" href="https://www.edparcell.com/sim-racing-rig-1-basic-rig.html"&gt;basic rig that you might use to get started sim racing&lt;/a&gt; and a more expensive &lt;a class="reference external" href="https://www.edparcell.com/sim-racing-rig-2-an-intermediate-rig.html"&gt;intermediate rig with higher quality components&lt;/a&gt;. In this final post, I'll describe what a top-of-the-line rig might look like, and how much you can expect to spend getting there.&lt;/p&gt;
&lt;div class="section" id="pc-2000"&gt;
&lt;h2&gt;PC - $2000&lt;/h2&gt;
&lt;p&gt;At this level, you'll be looking at an Intel Core i7 or an AMD Ryzen 7 (or maybe even an i9 or Ryzen 9). For the video card, I previously would have recommended a card based on the NVidia RTX 2080 or 2080Ti, but I understand the successor model, the RTX 3080, has some stability issues - these will likely be resolved over time, but at the moment, research carefully (and I will update this section as new information becomes available). Although it won't make much difference at the moment, 32GB of RAM and a 1TB SSD are probably sensible at this price point. If you want to build your own, pcpartpicker's &lt;a class="reference external" href="https://pcpartpicker.com/guide/LRxFf7/magnificent-amd-gamingstreaming-build"&gt;Magnificent AMD&lt;/a&gt; and &lt;a class="reference external" href="https://pcpartpicker.com/guide/GTgXsY/magnificent-intel-gamingstreaming-build"&gt;Magnificent Intel&lt;/a&gt; builds are decent templates, and if you want to purchase, then &lt;a class="reference external" href="https://www.microcenter.com/product/626341/powerspec-g466-gaming-computer"&gt;Microcenter's PowerSpec G466&lt;/a&gt; or &lt;a class="reference external" href="https://www.dell.com/en-us/member/shop/gaming-and-games/alienware-aurora-ryzen-edition-r10-gaming-desktop/spd/alienware-aurora-r10-desktop/wdryzr1040"&gt;Dell's Alienware Aurora Ryzen Edition R10&lt;/a&gt; will be good.&lt;/p&gt;
&lt;div class="section" id="pc-accessories-1000"&gt;
&lt;h3&gt;PC Accessories - $1000&lt;/h3&gt;
&lt;p&gt;As before, for a pure racing rig, there is no reason to go beyond a basic wired keyboard and mouse. The &lt;a class="reference external" href="https://www.amazon.com/Logitech-Desktop-Durable-Comfortable-keyboard/dp/B003NREDC8"&gt;Logitech MK120 Combo&lt;/a&gt; is $15 at Amazon. You can get a crazy-looking keyboard with RGB lighting and mechanical switches if you like, but I couldn't tell you which ones are good.&lt;/p&gt;
&lt;p&gt;Again, with headphones, I don't think there is much point in going beyond the &lt;a class="reference external" href="https://www.amazon.com/HyperX-Cloud-Alpha-Gaming-Headset/dp/B074NBSF9N"&gt;HyperX Cloud Alpha&lt;/a&gt; ($100). I love hi-fi, and I love engine noises, but I just don't think racing simulations justify anything higher end than this.&lt;/p&gt;
&lt;p&gt;However, I would recommend taking a look at adding 3-4 Buttkickers and using the SimVibe software. Each &lt;a class="reference external" href="https://www.amazon.com/ButtKicker-Mini-Subwoofer-Home-Theater/dp/B0052AXFKK/ref=sr_1_1?crid=17198DOJJ8K1Q&amp;amp;dchild=1&amp;amp;keywords=buttkicker+mini+lfe&amp;amp;qid=1601332010&amp;amp;sprefix=buttkickre+m%2Caps%2C180&amp;amp;sr=8-1"&gt;Buttkicker mini LFE&lt;/a&gt; will be $100, and I'd budget the same for amplifiers, although I don't have a specific recommendation. The &lt;a class="reference external" href="https://www.simxperience.com/en-us/products/simvibe/simvibesoftware.aspx"&gt;SimVibe software&lt;/a&gt; is $90.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div class="section" id="wheel-pedals-4800"&gt;
&lt;h2&gt;Wheel &amp;amp; Pedals - $4800&lt;/h2&gt;
&lt;p&gt;Definitely look at direct drive wheel bases. The extra torque they offer won't make a huge amount of difference - in fact, most people turn them down to avoid risking injury in crashes. The big difference as far as driving goes is the speed of the wheel - its ability to suddenly change the amount of torque it is providing, so that the wheel will tend to move very quickly in oversteer conditions, for example, making it far more feasible to correct when you're over the limit. They also provide far better detail - with no slack in gears or belts they can convey very fine details, even under high continuous torques, and the more information you have about the track and the car's dynamics, the better you can drive.&lt;/p&gt;
&lt;p&gt;We live in a golden age of direct drive wheel bases with offerings from &lt;a class="reference external" href="https://fanatec.com/us-en/racing-wheels-wheel-bases/wheel-bases/podium-wheel-base-dd1"&gt;Fanatec&lt;/a&gt;, &lt;a class="reference external" href="https://simxperience.com/en-us/products/accessories/accuforcesteering.aspx"&gt;Accuforce&lt;/a&gt;, &lt;a class="reference external" href="https://virtualracingschool.com/academy/hardware/vrs-directforce-pro-wheel-base-pricing-update/"&gt;VRS&lt;/a&gt; and &lt;a class="reference external" href="http://www.leobodnar.com/shop/index.php?main_page=index&amp;amp;cPath=101&amp;amp;zenid=99b9148f3a2cdac2a7fead591c29d346"&gt;Bodnar&lt;/a&gt;. My &lt;a class="reference external" href="https://www.simucu.be/sc2pro-direct-drive-wheel-base"&gt;SimuCube 2 Pro&lt;/a&gt; was $1500 from &lt;a class="reference external" href="https://www.simcraft.com/simucube-2-directdrive-forcefeedback-technology-pro-ff-simracing-racing-simulators-simucube2/"&gt;SimCraft&lt;/a&gt; in the US. From what I've read and seen, they are all excellent, with the SimuCube 2 Pro hitting a sweet spot and beating similarly priced competition - it's noticeably better than cheaper alternatives, but above this level diminishing returns start to kick in. That said, I haven't seen any reviewer unhappy with any direct drive wheel base, regardless of what they are used to.&lt;/p&gt;
&lt;p&gt;With your direct drive wheel base, you are going to want either a formula wheel rim, a GT wheel rim, or more likely, both. One thing to note is that if you have a Fanatec wheel base, you will be stuck using Fanatec wheel rims for the most part. Otherwise, the world is your oyster. You can repurpose real racing wheels for your rig, but it is easier to look at purpose-made sim racing wheels based on real wheels. Your choice will come down to personal taste (and budget). For me, I'd be tempted by the Cube Controls &lt;a class="reference external" href="https://cubecontrols.com/product/formula-pro-classic/"&gt;Formula Pro&lt;/a&gt; (778 EUR) and &lt;a class="reference external" href="https://cubecontrols.com/product/gt-pro-omp-classic/"&gt;GT Pro OMP&lt;/a&gt; (688 EUR), and Ascher Racing have the &lt;a class="reference external" href="https://www.ascher-racing.com/shop/f64-usb/"&gt;F64-USB&lt;/a&gt; (974 EUR).&lt;/p&gt;
&lt;p&gt;Finally, you'll need some pedals. Heusinkveld pedals are excellent, and their top-of-the-line pedals are the &lt;a class="reference external" href="https://heusinkveld.com/products/sim-pedals/sim-pedals-ultimate/?q=%2Fproducts%2Fsim-pedals%2Fsim-pedals-ultimate%2F&amp;amp;v=7516fd43adaa"&gt;Sim Pedals Ultimate&lt;/a&gt;, at 1100 EUR for a three pedal set. If you want something slightly cheaper, the &lt;a class="reference external" href="https://heusinkveld.com/products/sim-pedals/sim-pedals-sprint/?q=%2Fproducts%2Fsim-pedals%2Fsim-pedals-sprint%2F&amp;amp;v=7516fd43adaa"&gt;Sim Pedals Sprint&lt;/a&gt; are 578 EUR for a three pedal set, with the main difference being the 65kg maximum brake force rather than 136kg. Unless you are training to drive real F1 cars, you won't need that. Both are fully adjustable.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="monitor-1200"&gt;
&lt;h2&gt;Monitor - $1200&lt;/h2&gt;
&lt;p&gt;Ultrawides are possible, but I think triples still beat them at this time, as they fill in slightly more of your peripheral vision, and can give you slightly more information about what the people beside you are doing. VR is also possible - my experience with an Oculus Rift was that it was very immersive, but that after 20 minutes or so of hard racing, my eyes were very sweaty, so it's not ideal for 30+ minute races. Maybe higher-end VR equipment deals with this better, but I haven't experienced it, so I can't make that recommendation at this time.&lt;/p&gt;
&lt;p&gt;In terms of triples, you'll want to look for a 144Hz refresh rate with Freesync or G-Sync, 1440p resolution, and 27&amp;quot; size. I believe NVidia and Radeon cards both support Freesync these days. I don't have a specific recommendation. I have Dell S2716DGs, which are great. Reviewers didn't like the color fidelity, but I haven't found it an issue for sim racing. Unfortunately, those seem to be discontinued. I've budgeted $400 per monitor, but you may want to spend more. &lt;a class="reference external" href="https://www.tomshardware.com/reviews/best-gaming-monitors,4533.html"&gt;Tom's Hardware's &amp;quot;Best 1440p Gaming Monitor&amp;quot;&lt;/a&gt;, the Asus ROG Strix XG279Q 27&amp;quot;, is $600, for example.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="rig-2200"&gt;
&lt;h2&gt;Rig - $2200&lt;/h2&gt;
&lt;p&gt;Rigs made of 80-20 aluminium extrusion are the best rigs. They are not the prettiest, but they are infinitely expandable, and super-rigid. You can design and build your own, but easier is to buy the pre-designed &lt;a class="reference external" href="https://sim-lab.eu/shop/product/p1-x-sim-racing-cockpit-black-533#attr=323,498,434"&gt;Simlab P1-X&lt;/a&gt; for 750 EUR. You will also need to add a &lt;a class="reference external" href="https://sim-lab.eu/shop/product/triple-monitor-mount-19-42-vesa-675#attr=360,580"&gt;Triple Monitor Mount&lt;/a&gt; (200 EUR), &lt;a class="reference external" href="https://sim-lab.eu/shop/product/bucket-seat-bracket-set-410?page=4#attr="&gt;bucket seat bracket&lt;/a&gt; (40 EUR), seat slider (40 EUR) and &lt;a class="reference external" href="https://sim-lab.eu/shop/product/pedal-slider-baseplate-623#attr=584,439,692"&gt;pedal slider baseplate&lt;/a&gt; (150 EUR). You should take a look through the entire Simlab list when you are ordering to make sure you get everything that you might need.&lt;/p&gt;
&lt;p&gt;You will also need a seat. I recommend getting a real racing seat for immersion, although it is overkill, as it will never be subjected to the real crash it is designed for. You should go to a dealer in person to try out seats, as you will want to make sure that you get something that is comfortable and suitably sized for you. If you're near Denver, Sonoma or Sebring, I recommend speaking to &lt;a class="reference external" href="https://winecountrymotorsports.com/"&gt;Wine Country Motorsports&lt;/a&gt; - super-helpful.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="total-11200"&gt;
&lt;h2&gt;Total - $11200&lt;/h2&gt;
&lt;p&gt;Woof. Over $11K on what people will definitely call &amp;quot;a game&amp;quot;. It seems like a lot when you put it that way. On the other hand, it's a lot cheaper than even a basic track car, and you'll be able to go drive for an hour in the evening, rather than having to go to the track for entire days. And the equipment on this list will take a long time to go out of style - PCs evolve more slowly than they used to, and the mechanical parts will give great service for years and likely won't become outdated. So if you want a top-of-the-line sim experience, this would be how to get it.&lt;/p&gt;
&lt;div class="section" id="note-on-currencies-and-customs"&gt;
&lt;h3&gt;Note on Currencies and Customs&lt;/h3&gt;
&lt;p&gt;A few of the items listed above are priced in EUR and ship from Europe. Be aware that currencies do fluctuate. Also, you will be liable for import taxes on these items. Depending on the shipper, they may invoice you separately for this.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div class="section" id="taking-it-further"&gt;
&lt;h2&gt;Taking it Further&lt;/h2&gt;
&lt;p&gt;One thing I haven't covered is motion systems. Frankly, I don't know much about them. If you want a good one, I think they start in the low 5 digits. I will never have one because my sim-rig is on the upper story of my house, which has a wooden structure, and I fear that a motion rig would slowly start disassembling it.&lt;/p&gt;
&lt;p&gt;An unfortunate trend recently, with the growth in popularity of e-sports, is that there are starting to emerge companies that make very expensive products just because there are some people willing to pay those amounts. Aston Martin, for example, charges $74,000 for their &lt;a class="reference external" href="https://www.caranddriver.com/news/a34018316/aston-martin-driving-simulator-revealed-expensive/"&gt;AMR-CO1&lt;/a&gt;. Look, it's your money, I love carbon fiber too, and I'm sure those things will look very pretty sitting in Aston Martin showrooms. But that rig does not offer any functional advantage over the rig I've outlined here, and in fact, offers less flexibility. I don't fault Aston Martin for making the thing if there are buyers, but the lesson here is that paying more is not automatically better.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="related-posts"&gt;
&lt;h2&gt;Related Posts&lt;/h2&gt;
&lt;ul class="simple"&gt;
&lt;li&gt;&lt;a class="reference external" href="https://www.edparcell.com/sim-racing-rig-1-basic-rig.html"&gt;Sim Racing Rig #1: Basic Rig&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="reference external" href="https://www.edparcell.com/sim-racing-rig-2-an-intermediate-rig.html"&gt;Sim Racing Rig #2: Intermediate Rig&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Sim Racing Rig #3: High-End Rig&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
</content><category term="Sim Racing"/></entry><entry><title>Sim Racing Rig #2: Intermediate Rig</title><link href="https://www.edparcell.com/sim-racing-rig-2-an-intermediate-rig.html" rel="alternate"/><published>2020-09-30T14:18:00-06:00</published><updated>2020-09-30T14:18:00-06:00</updated><author><name>Ed Parcell</name></author><id>tag:www.edparcell.com,2020-09-30:/sim-racing-rig-2-an-intermediate-rig.html</id><summary type="html">&lt;p&gt;In the &lt;a class="reference external" href="https://www.edparcell.com/sim-racing-rig-1-basic-rig.html"&gt;first post in this series&lt;/a&gt;, I went through a set of components that you might want to consider when you are first starting out in sim-racing, so that you can get started as cheaply as possible. In this post, I'll look at some higher quality equipment that comes …&lt;/p&gt;</summary><content type="html">&lt;p&gt;In the &lt;a class="reference external" href="https://www.edparcell.com/sim-racing-rig-1-basic-rig.html"&gt;first post in this series&lt;/a&gt;, I went through a set of components that you might want to consider when you are first starting out in sim-racing, so that you can get started as cheaply as possible. In this post, I'll look at some higher quality equipment that comes with a higher price tag.&lt;/p&gt;
&lt;p&gt;There are a couple of reasons you might want to look at this price bracket. If you're using your rig as a training tool, then you will find equipment at this level starts to replicate the feel of a real car better, and the experience can be more immersive. That can be important if you're trying to develop specific techniques that you want to carry across to the real world. Alternatively, if your main focus is sim-racing, your enjoyment will still benefit from improved immersion, while a steering wheel with more faithful force feedback and higher quality pedals will allow you to have a better level of control, and better results.&lt;/p&gt;
&lt;p&gt;Finally, you might want to look at individual components on this list as upgrades to the Basic Rig, or whatever you currently have. At the end I give some thoughts on which you might want to prioritize and why.&lt;/p&gt;
&lt;div class="section" id="pc-900"&gt;
&lt;h2&gt;PC - $900&lt;/h2&gt;
&lt;p&gt;You'll be looking at an AMD Ryzen 5 or Intel Core i5 processor, and an NVidia GTX1660 GPU or similar. Again, RAM is 16GB (there isn't much benefit in more) and storage is a 500GB SSD (you can always add more later if necessary). If you want to build your own PC, &lt;a class="reference external" href="https://pcpartpicker.com/guide/2cLrxr/modest-amd-gaming-build"&gt;Pcpartpicker.com's Modest AMD build&lt;/a&gt; is a pretty good template here (don't forget to budget for a copy of Microsoft Windows). If you want to purchase a machine, &lt;a class="reference external" href="https://www.microcenter.com/product/624825/powerspec-g228-gaming-computer"&gt;Microcenter's PowerSpec G228&lt;/a&gt;, or &lt;a class="reference external" href="https://www.dell.com/en-us/shop/cty/pdp/spd/alienware-aurora-r11-desktop"&gt;Dell's Alienware Aurora R11&lt;/a&gt; might be reasonable options.&lt;/p&gt;
&lt;div class="section" id="pc-accessories-120"&gt;
&lt;h3&gt;PC Accessories - $120&lt;/h3&gt;
&lt;p&gt;There is no point in spending more on a keyboard and mouse, so look to the same $15 &lt;a class="reference external" href="https://www.amazon.com/Logitech-Desktop-Durable-Comfortable-keyboard/dp/B003NREDC8"&gt;Logitech MK120 Combo&lt;/a&gt; at Amazon that we recommended for the Basic Rig.&lt;/p&gt;
&lt;p&gt;If you want to spend a little more on headphones, I've used the &lt;a class="reference external" href="https://www.amazon.com/HyperX-Cloud-Alpha-Gaming-Headset/dp/B074NBSF9N"&gt;HyperX Cloud Alpha&lt;/a&gt; gaming headphones ($100) quite happily for several years. For sim-racing, they are decently immersive, and I can hear the others around me fine. The mic is also good, and so they can hear me.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div class="section" id="wheel-700"&gt;
&lt;h2&gt;Wheel - $700+&lt;/h2&gt;
&lt;p&gt;At this price point, I'd look to Fanatec. They make good quality equipment - it feels solid and works reliably. Decently high force feedback levels will be available, so you should experience little unrealistic clipping, and better pedals and higher fidelity feedback through the wheel will help you control the car better and more consistently. Immersion will also be better - the construction quality and materials used are more reminiscent of an actual car than a piece of consumer electronics.&lt;/p&gt;
&lt;p&gt;One good option is the &lt;a class="reference external" href="https://fanatec.com/us-en/bundles/csl-elite-ps4-starter-kit"&gt;Fanatec CSL Elite PS4 Starter Kit&lt;/a&gt;, which bundles the CSL Elite Racing Wheel and CSL Elite Pedals for $570. I'd strongly recommend considering the optional &lt;a class="reference external" href="https://fanatec.com/us-en/accessories/pedal-accessories/csl-elite-pedale-loadcell-kit"&gt;Loadcell Kit&lt;/a&gt; for an additional $140. This means that the brake sensor will be based on force, rather than position, which is a more realistic representation of the brake in a real car, and it will greatly help with braking consistency and immersion.&lt;/p&gt;
&lt;p&gt;If you want to push the budget out further, the &lt;a class="reference external" href="https://fanatec.com/us-en/racing-wheels-wheel-bases/wheel-bases/clubsport-wheel-base-v2.5"&gt;Fanatec CSW 2.5 base&lt;/a&gt; offers higher forces and faster feedback for $550. You will need to buy pedals and wheel rims separately. The &lt;a class="reference external" href="https://fanatec.com/us-en/pedals/clubsport-pedals-v3"&gt;Fanatec ClubSport Pedals V3&lt;/a&gt; are decent by all accounts for $360. For wheel rims, the &lt;a class="reference external" href="https://fanatec.com/us-en/steering-wheels/clubsport-lenkrad-porsche-918-rsr"&gt;ClubSport Porsche 918 RSR&lt;/a&gt; ($400) and &lt;a class="reference external" href="https://fanatec.com/us-en/steering-wheels/clubsport-lenkrad-formula-v2"&gt;ClubSport Formula V2&lt;/a&gt; ($370) are both nice, depending on whether you want to look at closed- or open-wheel cars. All together that'd be $1680.&lt;/p&gt;
&lt;p&gt;If you want to go even further, replace any of the pedals above with the excellent &lt;a class="reference external" href="https://heusinkveld.com/products/sim-pedals/sim-pedals-sprint/?q=%2Fproducts%2Fsim-pedals%2Fsim-pedals-sprint%2F&amp;amp;v=7516fd43adaa"&gt;Heusinkveld Sim Pedals Sprint&lt;/a&gt; for 500 EUR (~$580).&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="monitor-400"&gt;
&lt;h2&gt;Monitor - $400&lt;/h2&gt;
&lt;p&gt;If you can, it's definitely a good idea to see a monitor in person before you buy it. After all, you're going to spend a lot of time looking at it.&lt;/p&gt;
&lt;p&gt;At this price point, you can consider a low-end ultrawide monitor, which will give a decent field of view, allowing you to spot apexes and competitors that aren't directly in front of you. Filling your peripheral vision will also provide a great sense of speed.&lt;/p&gt;
&lt;p&gt;As before, the main things to care about are field of vision and latency. Look for gaming monitors, which will have better latency than regular monitors or TVs, and look for higher refresh rates (75Hz is good, 100+Hz is better).&lt;/p&gt;
&lt;p&gt;Most monitors of this type will come with AMD Freesync, which allows the monitor to adaptively lower the refresh rate without tearing the image if the graphics card can't supply frames fast enough. I'm still not entirely clear on the situation, but it seems like modern AMD and NVidia cards both now support Freesync. G-Sync, NVidia's similar technology, seems to be on its way to joining Betamax in the place where marginally better, but less supported, technologies go to die.&lt;/p&gt;
&lt;p&gt;I haven't used it, but if I was in the market for one, I'd try to find a showroom where I could look at the snappily named &lt;a class="reference external" href="https://www.amazon.com/LG-34WK650-W-34-UltraWide-21/dp/B078GSH1LV/ref=sr_1_1?dchild=1&amp;amp;keywords=LG+34BK650-W&amp;amp;qid=1601329688&amp;amp;sr=8-1"&gt;LG 34BK650-W&lt;/a&gt;, a 34&amp;quot; 1080p ultrawide. It supports AMD Freesync and has a refresh rate of 75Hz.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="rig-1150"&gt;
&lt;h2&gt;Rig - $1150&lt;/h2&gt;
&lt;p&gt;Lots of choices here. I'm a strong proponent of rigs that are built with 80-20 aluminium extrusion. They tend to be very strong, and they also give you a lot of flexibility to add components to your rig from a variety of suppliers, or even to create your own custom designs.&lt;/p&gt;
&lt;p&gt;There are a couple of downsides. To some eyes it's less attractive than all-in-one game seats. For me, my driving room is not my living room, and no-one is going in there to behold the beauty of the furniture.&lt;/p&gt;
&lt;p&gt;The other downside is the cost. I think it's safe to say that if you are spending this much on sim racing equipment, it's something you intend to do for a while, and it'll be a lot cheaper to buy a rig that you can expand as you go, than buying something else that looks nice, and then later buying an 80-20 rig anyway. Trust me.&lt;/p&gt;
&lt;p&gt;You can build your own, but there are several suppliers that provide everything you need for standard designs. Simlab is excellent, and their &lt;a class="reference external" href="https://sim-lab.eu/shop/product/gt1-evo-sim-racing-cockpit-black-446#attr="&gt;GT1-Evo&lt;/a&gt; at 400 EUR looks great. You will also need their &lt;a class="reference external" href="https://sim-lab.eu/shop/product/single-monitor-tv-stand-654#attr=364"&gt;TV stand&lt;/a&gt; (150 EUR) and &lt;a class="reference external" href="https://sim-lab.eu/shop/product/bucket-seat-bracket-set-410?page=4#attr="&gt;bucket seat bracket&lt;/a&gt; (40 EUR), and you will probably want to add a seat slider (40 EUR) and maybe a &lt;a class="reference external" href="https://sim-lab.eu/shop/product/pedal-slider-baseplate-623#attr=584,439,692"&gt;pedal slider baseplate&lt;/a&gt; (150 EUR). The sliders are unnecessary, as you can move the parts anyway, but they make it easier to make adjustments, particularly if more than one person will use the rig. That totals 780 EUR (around $910).&lt;/p&gt;
&lt;p&gt;It's worth taking a look through Simlab's full list to see what else you might want - mounting points for shifters, button boxes, speakers, buttkickers and so on.&lt;/p&gt;
&lt;div class="section" id="seats"&gt;
&lt;h3&gt;Seats&lt;/h3&gt;
&lt;p&gt;You will also need a seat. Simlab sell them if you are in Europe, but do not ship them to the States. A lot of racing seats are pretty constrictive, so it is definitely worth going to a dealer in person, so you can try out options and see what works for you. That said, the OMP classic seat ($300) or Sparco R100 ($330) seem like good starting points. I have an Evo XL QRT, and if anything, it's a bit too large, but there's no real downside. If you want to save money, you might look to a regular car seat from a salvage yard - the G-forces in a sim crash are substantially lower than in a real one, so as long as the seat is sufficiently rigid, this will be fine, and probably more comfortable.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div class="section" id="total-3000"&gt;
&lt;h2&gt;Total - $3000&lt;/h2&gt;
&lt;p&gt;For this amount of money, you're getting something pretty special. All the equipment will be good quality and last for many years. Your experience driving in sim will be pretty great, and while there is nicer equipment available, it's probably not going to make you significantly faster above this price point.&lt;/p&gt;
&lt;div class="section" id="note-on-currencies-and-customs"&gt;
&lt;h3&gt;Note on Currencies and Customs&lt;/h3&gt;
&lt;p&gt;A few of the items listed above are priced in EUR and ship from Europe. Be aware that currencies do fluctuate. Also, you will be liable for import taxes on these items. Depending on the shipper, they may invoice you separately for this.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div class="section" id="optional-items"&gt;
&lt;h2&gt;Optional Items&lt;/h2&gt;
&lt;p&gt;The items below are not strictly necessary, but you might consider adding them to improve your experience.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="buttkickers"&gt;
&lt;h2&gt;Buttkickers&lt;/h2&gt;
&lt;p&gt;Buttkickers are transducers that put low frequency sounds directly into your seat, kind of like a super-effective subwoofer. For sim racing, they really bring engines to life, and they also make running over kerbs quite visceral. That improves immersion. In my case, it also makes the sim-car seem more physical and I find I spin on kerbs less as I'm more respectful of the limits of the car under load in those situations... I'm skeptical myself, reading that, but others have reported similar things.&lt;/p&gt;
&lt;p&gt;You should be aware that the sound will travel between rooms and floors, particularly in houses with a wood structure. I turn the Buttkicker off if anyone is sleeping in the house.&lt;/p&gt;
&lt;p&gt;If you have an 80-20 rig you will probably need a &lt;a class="reference external" href="https://www.amazon.com/ButtKicker-Mini-Subwoofer-Home-Theater/dp/B0052AXFKK/ref=sr_1_1?crid=17198DOJJ8K1Q&amp;amp;dchild=1&amp;amp;keywords=buttkicker+mini+lfe&amp;amp;qid=1601332010&amp;amp;sprefix=buttkickre+m%2Caps%2C180&amp;amp;sr=8-1"&gt;Buttkicker mini LFE&lt;/a&gt; ($100), an amplifier (&lt;a class="reference external" href="https://www.amazon.com/ButtKicker-BKA-130-C-Transducer-Amplifier-Remote/dp/B00A4V22EM/ref=sr_1_3?dchild=1&amp;amp;keywords=buttkicker+amp&amp;amp;qid=1601332073&amp;amp;sr=8-3"&gt;Buttkicker make one for $160&lt;/a&gt;, but other options are available), and &lt;a class="reference external" href="https://sim-lab.eu/shop/product/buttkicker-mounting-plate-411#attr=594"&gt;Simlab mounting plate&lt;/a&gt; (14 EUR). You will also need some speaker cable, and you'll need to splice it onto the Buttkicker mini LFE's cables. If you have a tubular steel rig then you can use the &lt;a class="reference external" href="https://www.amazon.com/Buttkicker-BK-GR2-ButtKicker-BK-GR-Gamer/dp/B000AOTLP6"&gt;Buttkicker Gamer 2&lt;/a&gt; for $160.&lt;/p&gt;
&lt;p&gt;You can use a 3.5mm splitter to split the audio signal between headphones and the buttkicker amplifier. It is worth making sure that both left and right signals are being fed to the amplifier - otherwise you only get kerb sensations from one side of the car. I can't remember my solution for this - if it's critical to you, ping me and I'll rewrite this section.&lt;/p&gt;
&lt;p&gt;You can also use the &lt;a class="reference external" href="https://www.simxperience.com/en-us/products/simvibe/simvibesoftware.aspx"&gt;SimVibe software&lt;/a&gt; which provides an entirely separate bass channel to one or more Buttkickers. You'll need a separate sound card to drive that. Some people swear by it. I'm interested, and when I have impressions to give, I'll give them.&lt;/p&gt;
&lt;div class="section" id="sequential-h-pattern-gear-shifter-250"&gt;
&lt;h3&gt;Sequential/H-Pattern Gear Shifter - $250&lt;/h3&gt;
&lt;p&gt;These don't work as well as the other equipment - most race cars these days use paddle shifters, and most sims focus on these. The big problem with H-pattern gear shifters and sims is that if you shift too fast, the physical gear lever will be in gear, but the simulator will make gear-crunching sounds at you and will be in neutral. It's pretty unintuitive, and sorting it out breaks immersion horribly, usually in the middle of corner entry when you are quite busy steering. That said, if you're going to be driving a classic Lotus F1 car, you're not going to want to use paddle shifters, are you?&lt;/p&gt;
&lt;p&gt;The shifter to get is &lt;a class="reference external" href="https://fanatec.com/us-en/shifters-others/clubsport-shifter-sq-v-1.5"&gt;Fanatec's Clubsport Shifter SQ V1.5&lt;/a&gt; for $250. By all accounts it has a good feel, and can be switched between sequential and H-pattern with a single switch.&lt;/p&gt;
&lt;p&gt;Another option is the &lt;a class="reference external" href="https://www.amazon.com/Thrustmaster-PC-PS3-PS4-Xbox-One/dp/B005L0Z2BQ"&gt;Thrustmaster TH8A&lt;/a&gt; for $170. I have one. I don't love it, but it serves its purpose.&lt;/p&gt;
&lt;p&gt;Don't forget to make sure your pedal option has a clutch pedal!&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="button-box-160"&gt;
&lt;h3&gt;Button Box - $160&lt;/h3&gt;
&lt;p&gt;As a sim racer, you always want more buttons available. Tweak brake balance. Mute the current speaker. Check the lap times of the car ahead. There are always more things you could want to access than buttons available. Derek Speare Designs make good-quality button boxes, and I'd recommend the &lt;a class="reference external" href="http://derekspearedesigns.com/dsd-race-king.html"&gt;DSD Race King&lt;/a&gt; for $160.&lt;/p&gt;
&lt;p&gt;I definitely would not recommend their button boxes that attach directly to wheels, because I cackhandedly removed the wrong screws from my CSW 2.5 and had to send it for warranty repair a few weeks after I got it. Yes, that's on me. But it made me sad, and I want you to be happy.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div class="section" id="upgrading-from-a-basic-rig"&gt;
&lt;h2&gt;Upgrading from a Basic Rig&lt;/h2&gt;
&lt;p&gt;You might also want to consider individual items on this list as upgrades from a &lt;a class="reference external" href="https://edparcell.wordpress.com/2020/09/27/sim-racing-rig-1-basic-rig/"&gt;Basic Rig&lt;/a&gt;. In this case, it's partly a matter of taste which you would focus on first, but for me, I'd prioritize the things that are going to improve my driving, and then those that improve immersion. I'd say the order of importance is:&lt;/p&gt;
&lt;ul class="simple"&gt;
&lt;li&gt;Pedals. Particularly, a load cell brake pedal. Modulation of braking force in the braking zone for a corner can be quite subtle. Being able to do it consistently, lap after lap, will make a huge difference to your lap time, and you'll feel more comfortable in the car also.&lt;/li&gt;
&lt;li&gt;Wheel. A mid-level wheel will have enough headroom in torque that it won't clip often. But more importantly, it will respond to torque requests from the sim's physics engine a lot faster. This means you will be able to consistently drive closer to the limit, and have better odds of saving the car when you go over the limit. It will probably feel less &amp;quot;grainy&amp;quot; or &amp;quot;notchy&amp;quot; also.&lt;/li&gt;
&lt;li&gt;Rig. A good solid rig will improve immersion - it's the difference between feeling like you are driving a car, or feeling like you are controlling a car from your desk.&lt;/li&gt;
&lt;li&gt;Screen. The main thing to focus on is latency. PC gaming monitors will typically have better latency than TVs or monitors focused on office or design work. Higher refresh rate screens will typically have lower latency (otherwise, what would be the point of higher refresh rates?). Having a 144 Hz monitor can make braking points and other marks slightly easier to hit, but the effect is marginal. Resolution makes very little difference.&lt;/li&gt;
&lt;li&gt;PC. I don't think this is hugely important unless it's a limiting factor for something else in your setup. Even the budget PC will support reasonable graphics settings in VR, and when you're flying through the Italian mountains or wheel-to-wheel racing with this season's foe at Suzuka, you won't notice the difference between pretty good graphics and really good graphics. If you do upgrade PC, the main things to look at are GPU (for higher graphical settings, triple screens, higher refresh rates, higher resolutions) and CPU (if you want to stream).&lt;/li&gt;
&lt;/ul&gt;
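&lt;p&gt;For a sense of how marginal the refresh-rate gain on the screen is, the time between frames is just 1000 / Hz milliseconds:&lt;/p&gt;

```python
# Milliseconds between frames at common monitor refresh rates.
for hz in (60, 120, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms between frames")
```

&lt;p&gt;Going from 60 Hz to 144 Hz saves just under 10 ms per frame - real, but small next to the gains from better pedals or a better wheel.&lt;/p&gt;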
&lt;/div&gt;
&lt;div class="section" id="related-posts"&gt;
&lt;h2&gt;Related Posts&lt;/h2&gt;
&lt;ul class="simple"&gt;
&lt;li&gt;&lt;a class="reference external" href="https://www.edparcell.com/sim-racing-rig-1-basic-rig.html"&gt;Sim Racing Rig #1: Basic Rig&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Sim Racing Rig #2: Intermediate Rig&lt;/li&gt;
&lt;li&gt;&lt;a class="reference external" href="https://www.edparcell.com/sim-racing-rig-3-high-end-rig.html"&gt;Sim Racing Rig #3: High-End Rig&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
</content><category term="Sim Racing"/><category term="equipment"/><category term="simracing"/></entry><entry><title>Sim Racing Rig #1: Basic Rig</title><link href="https://www.edparcell.com/sim-racing-rig-1-basic-rig.html" rel="alternate"/><published>2020-09-27T03:27:00-06:00</published><updated>2020-09-27T03:27:00-06:00</updated><author><name>Ed Parcell</name></author><id>tag:www.edparcell.com,2020-09-27:/sim-racing-rig-1-basic-rig.html</id><summary type="html">&lt;p&gt;In the last year or so, a lot of people have asked me how to get started sim racing. Some of the interest has come from gamers, who want to take their immersion to the next level, and hope to improve their times moving up from Xbox and PlayStation controllers …&lt;/p&gt;</summary><content type="html">&lt;p&gt;In the last year or so, a lot of people have asked me how to get started sim racing. Some of the interest has come from gamers, who want to take their immersion to the next level, and hope to improve their times moving up from Xbox and PlayStation controllers. And some has come from people who see sim racing as a training tool for the real world, or a way to keep themselves sharp during the winter season.&lt;/p&gt;
&lt;p&gt;I wanted to gather a set of advice in one place for people to refer to, hence this series of blog posts, where I will outline the options available at various price points. You should think of these as jumping off points. If you hear some brand is better than another, or you have specific aims or needs, nothing in these lists is sacrosanct. But if you have no idea where to start, hopefully they are useful to you.&lt;/p&gt;
&lt;p&gt;The first rig I'll describe is a basic rig. If you're coming from a controller, this will feel great, help you drive more consistently, and be a lot more immersive. If you're looking to use your sim rig as a training tool for driving real cars on track, you might want to look at higher budget options for wheels and pedals. We live in a golden age for sim racing, and these pieces of equipment are fine, but there is a limit to how faithfully they can replicate the controls of real cars at this price point, and hence how useful they are as training tools.&lt;/p&gt;
&lt;p&gt;The aim here is to get started as cheaply as possible so you can decide whether sim racing is something you actually enjoy. If you can salvage or re-use things you already have (especially a PC), then so much the better.&lt;/p&gt;
&lt;div class="section" id="pc-600"&gt;
&lt;h2&gt;PC - $600&lt;/h2&gt;
&lt;p&gt;You will need a PC. Console racing games are fun, but they are games, and not generally trying to simulate actual racing. If you have an existing PC you can use, definitely do it. iRacing particularly is undemanding at its lower graphical settings, so even an older or non-gaming machine may be adequate.&lt;/p&gt;
&lt;p&gt;At this price point (as of September 2020), you should be able to build a system with an Intel Core i3 or AMD Ryzen 3 processor, an AMD Radeon RX 570 GPU, 16GB of RAM and around 500GB of SSD. This will be sufficient to run most racing or driving simulators at medium to high graphical settings.&lt;/p&gt;
&lt;p&gt;I like to build my own PCs. That way you know exactly what has gone into your PC, and if you want to upgrade components later, then you'll know exactly which areas to focus on. I think it's also slightly more cost effective, if you don't mind spending the time to assemble the components. If you want to go this direction, definitely take a look at &lt;a class="reference external" href="https://pcpartpicker.com/"&gt;pcpartpicker.com&lt;/a&gt;. Particularly their entry level &lt;a class="reference external" href="https://pcpartpicker.com/guide/zTgXsY/entry-level-amd-gaming-build"&gt;AMD&lt;/a&gt; and &lt;a class="reference external" href="https://pcpartpicker.com/guide/BnBD4D/entry-level-intel-gaming-build"&gt;Intel&lt;/a&gt; gaming builds will be useful. You will also need to budget $100 for a copy of Microsoft Windows.&lt;/p&gt;
&lt;p&gt;If not, &lt;a class="reference external" href="https://www.microcenter.com/"&gt;Microcenter&lt;/a&gt; offers gaming desktops under their PowerSpec brand, as does &lt;a class="reference external" href="https://www.dell.com/en-us/member/shop/desktop-computers/sr/desktops/g-series-desktops?~ck=bt"&gt;Dell&lt;/a&gt; under their G5 line.&lt;/p&gt;
&lt;h3&gt;PC Accessories - $50&lt;/h3&gt;
&lt;p&gt;You will need a mouse and keyboard. You won't be using these to write essays, so no need to spend much. Amazon have a Logitech MK120 Combo for $15.&lt;/p&gt;
&lt;p&gt;You'll also need some speakers or headphones. I've always used headphones for racing, so I don't bother people around me too much. The &lt;a class="reference external" href="https://www.amazon.com/HyperX-Cloud-Stinger-Core-PlayStation/dp/B083Q6Q41G"&gt;HyperX Cloud Stinger&lt;/a&gt; has good reviews, and I've had good experiences with other headphones HyperX produce. Currently $40, but I've seen them for $30.&lt;/p&gt;
&lt;p&gt;One note on accessories: Wires are your friend. A lot of people get wireless headphones and then have the frustrating experience of losing sound during a race - makes it difficult to know where to shift. You're going to be sitting in one place, so there is very little to be gained by paying more for wireless, and only downsides. Wires are your friend.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="wheel-300"&gt;
&lt;h2&gt;Wheel - $300&lt;/h2&gt;
&lt;p&gt;At this price point, Thrustmaster and Logitech are the best options. I used a Thrustmaster TX Leather when I was getting started, and enjoyed it a lot. But I think if I was starting now, I'd be most interested in the &lt;a class="reference external" href="https://www.amazon.com/Logitech-Dual-motor-Feedback-Responsive-PlayStation/dp/B00Z0UWWYC"&gt;Logitech G29&lt;/a&gt;, used by one of my favorite sim racing streamers, &lt;a class="reference external" href="https://www.youtube.com/user/TheWorldofKinduci/about"&gt;Kinduci&lt;/a&gt;. He's a competitive driver, and also, from the amount of racing content he publishes, loves driving. I don't think there could be a higher endorsement for a wheel.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="tv-monitor-200"&gt;
&lt;h2&gt;TV/Monitor - $200&lt;/h2&gt;
&lt;p&gt;As a sim racer, you will care about slightly different qualities in your monitor than other people.&lt;/p&gt;
&lt;p&gt;First, you'll care about field of view (FOV) - how wide an angle you can see around you. You will need to see the apexes of hairpins that are not directly in front of you. You will also need to see what your competitors are doing in front of you, and beside you too. So it makes sense to prioritize size.&lt;/p&gt;
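&lt;p&gt;To make "prioritize size" concrete, the horizontal angle a flat screen covers follows from basic trigonometry on its width and your viewing distance. The 40" screen and 30" seating distance below are illustrative numbers, not recommendations - a quick sketch:&lt;/p&gt;

```python
import math

def horizontal_fov_degrees(screen_width, viewing_distance):
    """Horizontal angle (in degrees) a flat screen subtends at the given distance."""
    return math.degrees(2 * math.atan((screen_width / 2) / viewing_distance))

# A 40" 16:9 TV is about 34.9" wide (diagonal * 16 / sqrt(16^2 + 9^2)).
width = 40 * 16 / math.sqrt(16**2 + 9**2)
print(round(horizontal_fov_degrees(width, 30)))  # roughly 60 degrees at 30" away
```

&lt;p&gt;Sitting closer or going bigger both widen that angle, which is the whole case for prioritizing size at this budget.&lt;/p&gt;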
&lt;p&gt;The other main thing you'll care about is latency. Modern TVs and monitors do a lot of sophisticated digital processing on the images they show. This means there can be a delay of tenths of a second before the image from your PC is shown on screen, which can make the difference between controlling an oversteering car and having an accident. &lt;a class="reference external" href="https://displaylag.com/"&gt;Displaylag.com&lt;/a&gt; does a great job measuring this aspect of monitors and TVs. Refresh rate is definitely related to latency, but at this price point, you won't have much choice of refresh rate. I'll cover it in a later post, when it's more relevant.&lt;/p&gt;
&lt;p&gt;Things you don't particularly care about are resolution (1080p is fine; more is gravy, but unnecessary - I believe Max Verstappen uses 1080p monitors) and color fidelity (maybe marginal for immersion, but all modern monitors are great).&lt;/p&gt;
&lt;p&gt;Probably the best option, for the money, is to get a cheap but low latency TV. Something around 40&amp;quot; should be great. Or, to save money, maybe just use your current living room TV, assuming that you have space to push your rig out of the way when you're not using it.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="rig-150"&gt;
&lt;h2&gt;Rig - $150&lt;/h2&gt;
&lt;p&gt;If you want to save money, or at least spend it progressively, a good first option is just to bolt your wheel to a desk.&lt;/p&gt;
&lt;p&gt;I freely admit that I don't know a huge amount about rigs at this price point. I looked around a little, and the &lt;a class="reference external" href="https://www.amazon.com/RACING-STEERING-SUITABLE-LOGITECH-PlayStation-4/dp/B07HM6DR81"&gt;GT Omega APEX Racing Wheel Stand&lt;/a&gt; seems like a decent pick. It's a wheel stand, rather than a full rig, so you'll need to find a table to put your TV, and a chair to sit on. The base itself should provide a solid support for your wheel and pedals. If you have a wheely chair, I understand a good trick is to use tethers to tie it to the base so you don't fly away under hard braking.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="total-cost-1300"&gt;
&lt;h2&gt;Total Cost - $1300&lt;/h2&gt;
&lt;p&gt;The total cost of the components here is $1300. Hopefully you've been able to re-use a TV and a PC to get down towards $500, or even lower if you are using an existing desk for now. For that, you get a system that you can go racing on happily for many years. You'll also find out how much you like sim-racing. If it's not your jam after all, then at least you found out as cheaply as possible. If you do love sim-racing, you'll probably want to go further with your setup over time, and I'll outline some options for that in my next post, covering a mid-range rig.&lt;/p&gt;
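&lt;p&gt;For anyone sanity-checking the total, the line items in this post add up like so:&lt;/p&gt;

```python
# Prices in USD, as quoted in this post (September 2020).
budget = {
    "PC": 600,
    "PC accessories": 50,
    "Wheel": 300,
    "TV/Monitor": 200,
    "Rig": 150,
}
print(sum(budget.values()))  # 1300
```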
&lt;/div&gt;
&lt;div class="section" id="final-thoughts"&gt;
&lt;h2&gt;Final thoughts&lt;/h2&gt;
&lt;p&gt;The equipment listed here is perfectly functional, but there are a few compromises compared to more expensive equipment. The physical quality and durability is going to be lower. It's not going to be as faithful a substitute for a real car - drivers looking to train for the real world might want more. And the pedals and wheel aren't going to be able to relay feedback as well, so it will be hard to have Senna-like car control with these setups.&lt;/p&gt;
&lt;p&gt;Nonetheless, it should be possible to be decently competitive. You will be doing the same things that you'd be doing at the track - spotting marks, controlling the car, and dealing with the traffic around you. And most important of all, it should be pretty easy to have a great time with this equipment as you get started sim racing.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="related-posts"&gt;
&lt;h2&gt;Related Posts&lt;/h2&gt;
&lt;ul class="simple"&gt;
&lt;li&gt;Sim Racing Rig #1: Basic Rig&lt;/li&gt;
&lt;li&gt;&lt;a class="reference external" href="https://www.edparcell.com/sim-racing-rig-2-an-intermediate-rig.html"&gt;Sim Racing Rig #2: Intermediate Rig&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a class="reference external" href="https://www.edparcell.com/sim-racing-rig-3-high-end-rig.html"&gt;Sim Racing Rig #3: High-End Rig&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
</content><category term="Sim Racing"/><category term="equipment"/><category term="simracing"/></entry></feed>