I won't lie: conducting an in-depth SEO audit is a big deal. And, as an SEO consultant, there are few sweeter words than: "Your audit looks great! When can we bring you on board?"
Even if you haven't been actively looking for a new gig, knowing that your SEO audit nailed it is a huge ego boost.
But are you scared to start? Is this your first SEO audit? Or do you just not know where to begin? Sending a great SEO audit to a potential client puts you in the best possible position.
This is a rare opportunity for you to organize your processes and rid your potential client of bad habits (cough *publishing pages without 301 redirects* cough) that build up like lint in your dryer.
So take your time. Remember: your primary goal is to add value to your client with your site recommendations, both in the short term and the long term.
Below, I've put together the need-to-know steps for conducting an SEO audit and a little insight into the first phase of my process when I get a new client. It's broken down into sections. If you feel you have a good grasp of a particular section, feel free to jump to the next.
This is a series, so stay tuned for more SEO audit love.
Jump to:
• When should I perform an SEO audit?
• What you need from a client before an SEO audit
• Tools for SEO auditing
• Technical > DeepCrawl
• Technical > Screaming Frog
• Technical > Google Search Console and Bing Webmaster Tools
• Technical > Google Analytics
When Should I Perform an SEO Audit?
After a prospective client sends me an email expressing interest in working together and answers my survey, we set up an introductory call (Skype or Google Hangouts is preferred).
Before the call, I do my own quick SEO mini-audit based on their survey responses to familiarize myself with their market landscape (I spend at least an hour researching on my own). It's like dating someone you've never met.
Obviously you're going to check them out on Facebook, Twitter, Instagram, and every other channel that's public.
Here is an example of how my survey looked:
Here are some key questions you'll want to ask the client during the first meeting:
1. What are your overall business goals? What are your channel goals (public relations, social media, etc.)?
2. Who is your target audience?
3. Do you have any business partnerships?
4. How often is the website updated? Do you have a web developer or an IT department?
5. Have you ever worked with an SEO consultant before? Or has any SEO work been done previously?
Sujan Patel also has some great recommendations on questions to ask a new SEO client.
After the call, if I feel we're a good match, I'll send over my formal proposal and contract (thank you HelloSign for making this an easy process for me!).
To start, I always like to offer my clients the first month as a trial period to make sure we vibe.
It gives the client and me a chance to get to know each other before committing long term. During this month, I'll take my time to conduct an in-depth SEO audit.
These SEO audits can take me anywhere from 40 to 60 hours, depending on the size of the website. The audits are bucketed into three separate parts and presented with Google Slides:
• Technical: Crawl errors, indexing, hosting, etc.
• Content: Keyword research, competitor analysis, content maps, metadata, etc.
• Links: Backlink profile analysis, growth tactics, etc.
After that first month, if the client likes my work, we'll begin implementing the recommendations from the SEO audit. And going forward, I'll perform a monthly mini-audit and an in-depth quarterly audit.
To recap, I perform an SEO audit for my clients:
• First month
• Monthly (mini-audit)
• Quarterly (in-depth audit)
What You Need From a Client Before an SEO Audit
When a client and I start working together, I'll share a Google Doc with them requesting a list of passwords and vendors.
These include:
• Google Analytics access and any third-party analytics tools
• Google and Bing Ads
• Webmaster tools
• Backend access to the website
• Social media accounts
• List of vendors
• List of internal team members (including any work they outsource)
SEO Audit Tools
Before starting your SEO audit, here's a summary of the tools I use:
• Screaming Frog
• Integrity (for Mac users) and Xenu Sleuth (for PC users)
• SEO Browser
• Wayback Machine
Step 1: Add Your Site to DeepCrawl and Screaming Frog
Tools:
• DeepCrawl
• Copyscape
• Screaming Frog
• Google Analytics
• Integrity
• Google Tag Manager
• Google Analytics code
What to Look for When Using DeepCrawl
The first thing I do is add my client's site to DeepCrawl. Depending on the size of your client's site, the crawl may take a day or two to return results.
Once I get the DeepCrawl results back, these are the things I look for:
Duplicate Content
Check the "Duplicate Pages" report to identify duplicate content.
If duplicate content is identified, I'll make rewriting these pages a top priority in my recommendations to the client, and in the meantime I'll add the <meta name="robots" content="noindex, nofollow"> tag to the duplicate pages.
Common duplicate content errors you'll discover:
• Duplicate meta titles and meta descriptions
• Duplicate body content on tag pages (I'll use Copyscape to determine if anything is plagiarized)
• Two domains (e.g., yourwebsite.co, yourwebsite.com)
• Subdomains (e.g., jobs.yourwebsite.com)
• Similar content on a different domain
• Improperly implemented pagination pages (see below)
How to fix:
• Add canonical tags to your pages so Google knows your preferred URL (see the sketch after this list).
• Disallow the incorrect URLs in the robots.txt file.
• Rewrite the content (including the body copy and metadata).
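For reference, here's a minimal sketch of a canonical tag. It goes in the <head> of the duplicate page, and the URL is a made-up example:
<!-- In the <head> of the duplicate page; URL is a hypothetical example -->
<link rel="canonical" href="https://www.yourwebsite.com/preferred-page/" />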
Here's an example of a duplicate content issue I had with one of my clients. As you can see below, they had URL parameters without canonical tags.
Here are the steps I took to fix the issue:
• I fixed any 301 redirect issues.
• I added a canonical tag to the page I want Google to crawl.
• I updated the Google Search Console parameter settings to exclude any parameters that don't generate unique content.
• I added the disallow function to the robots.txt for the incorrect URLs to improve crawl budget (sketched below).
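As a rough sketch of that last step, a robots.txt disallow for parameter URLs can look like this (the parameter name is a made-up example):
User-agent: *
# Hypothetical example: block crawling of sort-order parameter URLs
Disallow: /*?dir=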
Pagination
There are two reports to review:
• First Pages: To find out which pages use pagination, review the "First Pages" report. Then you can manually review the pages on the site that use it to discover whether pagination was implemented correctly.
• Unlinked Pagination Pages: To find out if pagination is working correctly, the "Unlinked Pagination Pages" report will tell you whether rel="next" and rel="prev" are linking to the previous and next pages.
In the example below, I was able to find that a client had reciprocal pagination tags using DeepCrawl:
How to fix:
• If you have a "View All" or "Load More" page, add the rel="canonical" tag. Here's an example from Crutchfield.
• If you have all your content on separate paginated pages, add the standard rel="next" and rel="prev" markup (see the sketch after this list). Here's an example from Macy's.
• If you're using infinite scrolling, add the equivalent paginated page URL to your JavaScript. Here's an example from American Eagle.
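As a minimal sketch of the standard markup (URLs are made-up examples), the rel="next" and rel="prev" tags sit in the <head> of each paginated page:
<!-- In the <head> of page 2 of a hypothetical paginated series -->
<link rel="prev" href="https://www.yourwebsite.com/blog/page/1/" />
<link rel="next" href="https://www.yourwebsite.com/blog/page/3/" />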
Max Redirections
Review the "Max Redirections" report to see all the pages that redirect more than 4 times. John Mueller mentioned in 2015 that Google can stop following redirects if there are more than five.
Some people refer to these crawl errors as eating up the "crawl budget," while Gary Illyes refers to this as "host load." It's important to make sure your pages render properly because you want your host load to be used efficiently.
Here's an overview of the response codes you might see:
• 301 – These are the majority of the codes you'll see during your research. 301 redirects are fine as long as there's only one redirect and no redirect loop.
• 302 – These codes are OK, but if left in place for 3 months or more, I'll manually change them to 301s so they're permanent. This is an error code I often see with e-commerce sites when a product is out of stock.
• 400 – Users can't get to the page.
• 403 – Users are denied access to the page.
• 404 – The page is not found (usually meaning the client deleted a page without a 301 redirect).
• 500 – Internal server error; you'll need to connect with the web development team to determine the cause.
How to fix:
• Remove any internal links pointing to old 404 pages and update them with the internal link of the redirected page.
• Undo redirect chains by removing the middle redirects (sketched below). For example, if redirect A goes to redirect B, C, and D, you'll want to undo redirects B and C. The final result will be a redirect from A to D.
• There's also a way to do this in Screaming Frog and Google Search Console, covered below, if you're using those tools.
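As a sketch of undoing a chain on an Apache server (the paths are hypothetical, and your client's stack may handle redirects elsewhere):
# .htaccess sketch: send A straight to D instead of chaining through B and C
Redirect 301 /page-a /page-d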
What to Look for When Using Screaming Frog
The second thing I do when I get a new client's site is add their URL to Screaming Frog.
Depending on the size of your client's site, I may configure the settings to crawl specific areas of the site at a time.
Here's a look at my Screaming Frog spider configurations:
You can do this in your spider settings or by excluding areas of the site.
Once you've got your Screaming Frog results back, here are the things I'll look for:
Google Analytics Code
Screaming Frog can help you identify which pages are missing the Google Analytics code (UA-1234568-9). To find the missing Google Analytics code, follow these steps:
• Go to 'Configuration' in the navigation bar, then 'Custom.'
• Add analytics\.js to Filter 1, then change the drop-down to 'Does not contain.'
How to fix:
Ent Contact
your client's developers and ask them to add the code to specific pages that
are missing the code.
Visit the
Google Analytics section below for more Google Analytics information.
Google Tag Manager
Screaming Frog can also help you find which pages are missing the Google Tag Manager snippet with similar steps:
• Go to the 'Configuration' tab in the navigation bar, then 'Custom.'
• Add <iframe src="//www.googletagmanager.com/ to the filter with 'Does not contain' selected.
How to fix:
• Go to Google Tag Manager to see if there are any errors and update where needed.
• Share the code with your client's developers to see if they can add it back to the site.
Schema
You'll also want to check if your client's site is using schema markup. Schema, or structured data, helps search engines understand what a page on the site is about.
To check for schema markup in Screaming Frog, follow these steps:
• Go to the 'Configuration' tab in the navigation bar, then 'Custom.'
• Add itemtype="http://schema.org/ to the filter with 'Contains' selected.
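For context, here's a minimal sketch of the microdata that filter would match (the organization name is made up):
<!-- Hypothetical schema.org microdata example -->
<div itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Buy a Unicorn</span>
</div>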
Indexing
If you want to determine how many pages are being indexed for your client, follow these steps in Screaming Frog:
• After the site finishes loading in Screaming Frog, go to 'Directives' > 'Filter' > 'Index' to review whether any pieces of code are missing.
How to fix:
• If the site is new, Google may not have indexed it yet.
• Check the robots.txt file to make sure you're not disallowing anything you want Google to crawl.
• Make sure you've submitted your client's sitemap to Google Search Console and Bing Webmaster Tools.
• Conduct manual research (see below).
Flash
Google announced in 2016 that Chrome will start blocking Flash due to slow page load times. So, if you're doing an audit, you want to find out if your new client is using Flash.
To do this in Screaming Frog, try the following:
• Go to 'Spider Configuration' in the navigation.
• Click 'Check SWF.'
• After the crawl is done, filter the 'Internal' tab by 'Flash.'
How to fix:
• Embed videos from YouTube, or opt for HTML5 standards when adding a video.
Here's an example of HTML5 code for adding a video:
<video controls="controls" width="320" height="240">
  <source src="/tutorials/media/Anna-Teaches-SEO-To-Small-Businesses.mp4" type="video/mp4" />
  <source src="/tutorials/media/Anna-Teaches-SEO-To-Small-Businesses.ogg" type="video/ogg" />
  Your browser does not support the video tag.
</video>
JavaScript
According
to Google’s announcement in 2015, JavaScript is okay to use for your website as
long as you’re not blocking anything in your robots.txt (we’ll dig into this
deeper in a bit!). But, you still want to take a peek at how the Javascript is
being delivered to your site.
How
to fix:
•Review
Javascript to make sure it’s not being blocked by robots.txt
•Make
sure Javascript is running on the server (this helps produce plain text data vs
dynamic).
•If
you’re running Angular JavaScript, check out this article by Ben Oren on why it
might be killing your SEO efforts.
•In
Screaming Frog, go to the Spider Configuration in the navigation bar and click
‘Check JavaScript.’ After the crawl is done, filter your results on the
‘Internal’ tab by ‘JavaScript.’
Robots.txt
When
you’re reviewing a robots.txt for the first time, you want to look to see if
anything important is being blocked or disallowed.
For
example, if you see this code:
User-agent: *
Disallow: /
Your client’s website is blocked from all web crawlers.
But,
if you have something like Zappos robots.txt file, you should be good to go.
#
Global robots.txt as of 2012-06-19
User-agent: *
Disallow: /bin/
Disallow: /multiview/
Disallow: /product/review/add/
Disallow: /cart
Disallow: /login
Disallow: /logout
Disallow: /register
Disallow: /account
They
are only blocking what they do not want web crawlers to locate. This content
that is being blocked is not relevant or useful to the web crawler.
How
to fix:
•Your
robots.txt is case-sensitive so update this to be all lowercase.
•Remove
any pages listed as Disallow that you want the search engines to crawl.
•Screaming
Frog by default will not be able to load any URLs disallowed by robots.txt. If
you choose to switch up the default settings in Screaming Frog, it will ignore
all the robots.txt.
•You
can also view blocked pages in Screaming Frog under the ‘Response Codes’ tab,
then filtered by ‘Blocked by Robots.txt’ filter after you’ve completed your
crawl.
•If
you have a site with multiple subdomains, you should have a separate robots.txt
for each.
•Make
sure the sitemap is listed in the robots.txt.
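That sitemap reference is a single line anywhere in the robots.txt (the URL is a made-up example):
# Hypothetical example: point crawlers to the XML sitemap
Sitemap: https://www.yourwebsite.com/sitemap.xml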
Crawl
Errors
I
use DeepCrawl, Screaming Frog, and Google and Bing webmaster tools to find and
cross-check my client’s crawl errors.
To
find your crawl errors in Screaming Frog, follow these steps:
•After
the crawl is complete, go to ‘Bulk Reports.’
•Scroll
down to ‘Response Codes,’ then export the server side error report and the
client error report.
How
to fix:
•With the client error report, you should be able to 301 redirect the majority of the 404 errors in the backend of the site yourself.
•With the server error report, collaborate with the development team to determine the cause. Before fixing these errors on the root directory, be sure to back up the site. You may simply need to create a new .htaccess file or increase the PHP memory limit.
•You’ll
also want to remove any of these permanent redirects from the sitemap and any
internal or external links.
•You
can also use ‘404’ in your URL to help track in Google Analytics.
Redirect Chains
Redirect chains not only cause a poor user experience; they also slow down page speed, drop conversion rates, and lose any link love you may have received before.
Fixing redirect chains is a quick win for any company.
How to fix:
•In Screaming Frog, after you've completed your crawl, go to 'Reports' > 'Redirect Chains' to view the crawl path of your redirects. In an Excel spreadsheet, you can track to make sure your 301 redirects remain 301 redirects. If you see a 404 error, you'll want to clean this up.
Internal
& External Links
When
a user clicks on a link to your site and gets a 404 error, it’s not a good user
experience.
And, it doesn't make the search engines like you any better either.
To
find my broken internal and external links I use Integrity for Mac. You can
also use Xenu Sleuth if you’re a PC user.
I’ll
also show you how to find these internal and external links in Screaming Frog
and DeepCrawl if you’re using that software.
How
to fix:
•If
you’re using Integrity or Xenu Sleuth, run your client’s site URL and you’ll
get a full list of broken URLs. You can either manually update these yourself
or if you’re working with a dev team, ask them for help.
•If
you’re using Screaming Frog, after the crawl is completed, go to ‘Bulk Export’
in the navigation bar, then ‘All Outlinks.’ You can sort by URLs and see which
pages are sending a 404 signal. Repeat the same step with ‘All Inlinks.’
•If
you’re using DeepCrawl, go to the ‘Unique Broken Links’ tab under the ‘Internal
Links’ section.
URLs
Every
time you take on a new client, you want to review their URL format. What am I
looking for in the URLs?
•Parameters – If the URL has weird characters like ?, =, or +, it's a dynamic URL, which can cause duplicate content if not optimized.
•User-friendly
– I like to keep the URLs short and simple while also removing any extra
slashes.
How
to fix:
•You
can search for parameter URLs in Google by doing site:www.buyaunicorn.com/
inurl: “?” or whatever you think the parameter might include.
•After
you’ve run the crawl on Screaming Frog, take a look at URLs. If you see
parameters listed that are creating duplicates of your content, you need to
suggest the following:
• Add a canonical tag to the main URL page. For example, if www.buyaunicorn.com/magical-headbands is the main page and I see www.buyaunicorn.com/magical-headbands/?dir=mode123$, then the canonical tag would need to be added to www.buyaunicorn.com/magical-headbands.
• Update your parameters in Google Search Console under 'Crawl' > 'URL Parameters.'
•Disallow
the duplicate URLs in the robots.txt.
Step
2: Review Google Search Console and Bing Webmaster Tools.
Tools:
•Google
Search Console
•Bing
Webmaster Tools
•Sublime
Text (or any text editor tool)
Set
a Preferred Domain
Since
the Panda update, it’s beneficial to clarify to the search engines the
preferred domain. It also helps make sure all your links are giving one site
the extra love instead of being spread across two sites.
How
to fix:
•In
Google Search Console, click the gear icon in the upper right corner.
•Choose
which of the URLs is the preferred domain.
•You
don’t need to set the preferred domain in Bing Webmaster Tools, just submit your
sitemap to help Bing determine your preferred domain.
Backlinks
With
the announcement that Penguin is real-time, it’s vital that your client’s
backlinks meet Google’s standards.
If
you notice a large chunk of backlinks coming to your client’s site from one
page on a website, you’ll want to take the necessary steps to clean it up, and
FAST!
How
to fix:
•In
Google Search Console, go to ‘Links’ > then sort your ‘Top linking sites.’
•Contact
the companies that are linking to you from one page to have them remove the
links.
•Or,
add them to your disavow list. When adding companies to your disavow list, be
very careful how and why you do this. You don’t want to remove valuable links.
Here’s
an example of what my disavow file looks like:
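In general, a disavow file is a plain text list, one domain or URL per line, with comments marked by #. A minimal sketch with made-up domains:
# Hypothetical disavow file sketch
# Spammy directory linking sitewide
domain:spammy-directory-example.com
http://sketchy-blog-example.com/bad-link-page/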
Keywords
As
an SEO consultant, it’s my job to start to learn the market landscape of my
client. I need to know who their target audience is, what they are searching
for, and how they are searching. To start, I take a look at the keyword search
terms they are already getting traffic from.
•In Google Search Console, 'Search Traffic' > 'Search Analytics' will show you which keywords are already sending your client clicks.
Crawl
Crawl errors are important to check because they're not only bad for the user, they're bad for your website rankings. And John Mueller stated that a low crawl rate may be a sign of a low-quality site.
To
check this in Google Search Console, go to ‘Coverage’ > ‘Details.’
To
check this in Bing Webmaster Tools, go to ‘Reports & Data’ > ‘Crawl
Information.’
How
to fix:
•Manually
check your crawl errors to determine if there are crawl errors coming from old
products that don’t exist anymore or if you see crawl errors that should be disallowed
in the robots.txt file.
•Once
you’ve determined where they are coming from, you can implement 301 redirects
to similar pages that link to the dead pages.
•You’ll
also want to cross-check the crawl stats in Google Search Console with average
load time in Google Analytics to see if there is a correlation between time
spent downloading and the pages crawled per day.
Structured
Data
As
mentioned above in the schema section of Screaming Frog, you can review your
client’s schema markup in Google Search Console.
Use
the individual rich results status report in Google Search Console. (Note: The
structured data report is no longer available).
This
will help you determine what pages have structured data errors that you’ll need
to fix down the road.
How
to fix:
•Google
Search Console will tell you what is missing in the schema when you test the
live version.
•Based
on your error codes, rewrite the schema in a text editor and send to the web
development team to update. I use Sublime Text for my text editing. Mac users
have one built-in and PC users can use TextPad.
Step
3: Review Google Analytics
Tools:
•Google
Analytics
•Google
Tag Manager Assistant Chrome Extension
•Annie
Cushing Campaign Tagging Guide
Views
When
I first get a new client, I set up 3 different views in Google Analytics.
•Reporting
view
•Master
view
•Test
view
These
different views give me the flexibility to make changes without affecting the
data.
How
to fix:
•In
Google Analytics, go to ‘Admin’ > ‘View’ > ‘View Settings’ to create the
three different views above.
•Make
sure to check the ‘Bot Filtering’ section to exclude all hits from bots and
spiders.
•Link
Google Ads and Google Search Console.
•Lastly,
make sure the ‘Site search Tracking’ is turned on.
Filter
You
want to make sure you add your IP address and your client’s IP address to the
filters in Google Analytics so you don’t get any false traffic.
How
to fix:
•Go
to ‘Admin’> ’View’ > ‘Filters’
•Then, the settings should be set to 'Exclude' > 'traffic from the IP addresses' > 'that are equal to.'
Tracking
Code
You
can manually check the source code, or you can use my Screaming Frog technique
from above.
If
the code is there, you’ll want to track that it’s firing real-time.
•To
check this, go to your client’s website and click around a bit on the site.
•Then
go to Google Analytics > ‘Real-Time’ > ‘Locations,’ your location should
populate.
•If
you’re using Google Tag Manager, you can also check this with the Google Tag
Assistant Chrome extension.
How
to fix:
•If
the code isn’t firing, you’ll want to check the code snippet to make sure it’s
the correct one. If you’re managing multiple sites, you may have added a
different site’s code.
•Before copying the code onto the website, use a text editor, not a word processor; a word processor can add extra characters or whitespace to the snippet.
•The
functions are case-sensitive so check to make sure everything is lowercase in
code.
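For comparison, the standard analytics.js snippet looks like this (the UA property ID is a placeholder); whatever is on your client's site should match it apart from the ID:
<!-- Standard Google Analytics (analytics.js) snippet; UA ID is a placeholder -->
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-XXXXXXXX-X', 'auto');
ga('send', 'pageview');
</script>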
Indexing
If
you had a chance to play around in Google Search Console, you probably noticed
the ‘Coverage’ section.
When
I’m auditing a client, I’ll review their indexing in Google Search Console
compared to Google Analytics. Here’s how:
•In
Google Search Console, go to ‘Coverage’
•In
Google Analytics, go to ‘Acquisition’ > ‘Channels’ > ‘Organic Search’
> ‘Landing Page.’
•Once
you’re here, go to ‘Advanced’ > ‘Site Usage’ > ‘Sessions’ > ‘9.’
How
to fix:
•Compare
the numbers from Google Search Console with the numbers from Google Analytics,
if the numbers are widely different, then you know that even though the pages
are getting indexed only a fraction are getting organic traffic.
Campaign
Tagging
The last thing you'll want to check in Google Analytics is whether your client is using campaign tagging correctly. You don't want to miss out on credit for the work you're doing because you forgot about campaign tagging.
How
to fix:
•Set
up a campaign tagging strategy for Google Analytics and share it with your
client. Annie Cushing put together an awesome campaign tagging guide.
•Set
up Event Tracking, if your client is using mobile ads or video.
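As a quick sketch, a campaign-tagged URL is just the landing page URL with utm_ parameters appended (all values here are made-up examples):
https://www.yourwebsite.com/landing-page/?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale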
Keywords
You
can use Google Analytics to gain insight into potential keyword gems for your
client. To find keywords in Google Analytics, follow these steps:
•Go
to Google Analytics > ‘Behavior’ > ‘Site Search’ > ‘Search Terms.’
This will give you a view of what customers are searching for on the website.
•Next,
I’ll use those search terms to create a ‘New Segment’ in Google Analytics to see
what pages on the site are already ranking for that particular keyword term.
Step
4: Manual Check
Tools:
•Google
Analytics
•Access
to client’s server and host
•You
Get Signal
•Pingdom
•PageSpeed
Tools
•Wayback
Machine
Only One Version of Your Client's Site Is Searchable
Check
all the different ways you could search for a website. For example:
•http://annaisaunicorn.com
•https://annaisaunicorn.com
•http://www.annaisaunicorn.com
As
Highlander would say, “there can be only one” website that is searchable.
How
to fix:
•Use
a 301 redirect for all URLs that are not the primary site to the canonical
site.
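As a sketch of what that looks like on an Apache server (assuming https://www.annaisaunicorn.com is the primary version; your client's server may handle this differently):
# .htaccess sketch: force the https://www version of the site
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.annaisaunicorn.com/$1 [L,R=301]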
Indexing
Conduct
a manual search in Google and Bing to determine how many pages are being
indexed by Google. This number isn’t always accurate with your Google Analytics
and Google Search Console data, but it should give you a rough estimate.
To
check, do the following:
•Perform
a site search in the search engines.
•When
you search, manually scan to make sure only your client’s brand is appearing.
•Check
to make sure the homepage is on the first page. John Mueller said it isn’t
necessary for the homepage to appear as the first result.
How
to fix:
•If
another brand is appearing in the search results, you have a bigger issue on your
hands. You’ll want to dive into the analytics to diagnose the problem.
•If
the homepage isn’t appearing as the first result, perform a manual check of the
website to see what it’s missing. This could also mean the site has a penalty
or poor site architecture which is a bigger site redesign issue.
•Cross-check
the number of organic landing pages in Google Analytics to see if it matches
the number of search results you saw in the search engine. This can help you
determine what pages the search engines see as valuable.
Caching
I’ll
run a quick check to see if the top pages are being cached by Google. Google
uses these cached pages to connect your content with search queries.
To
check if Google is caching your client’s pages, do this:
http://webcache.googleusercontent.com/search?q=cache:https://www.searchenginejournal.com/pubcon-day-3-women-in-digital-amazon-analytics/176005/
Make
sure to toggle over to the ‘Text-only version.’
You
can also check this in Wayback Machine.
How
to fix:
•Check
the client’s server to see if it’s down or operating slower than usual. There
might be an internal server error or a database connection failure. This can
happen if multiple users are attempting to access the server at once.
•Check
to see who else is on your server with a reverse IP address check. You can use
You Get Signal website for this phase. You may need to upgrade your client’s
server or start using a CDN if you have sketchy domains sharing the server.
•Check
to see if the client is removing specific pages from the site.
Hosting
While
this may get a little technical for some, it’s vital to your SEO success to
check the hosting software associated to your client’s website. Hosting can
harm SEO and all your hard work will be for nothing.
You’ll
need access to your client’s server to manually check any issues. The most
common hosting issues I see are having the wrong TLD and slow site speed.
How
to fix:
•If
your client has the wrong TLD, you need to make sure the country IP address is
associated with the country your client is operating in the most. If your
client has a .co domain and also a .com domain, then you’ll want to redirect
the .co to your client’s primary domain on the .com.
•If
your client has slow site speed, you’ll want to address this quickly because
site speed is a ranking factor. Find out what is making the site slow with
tools like PageSpeed Tools and Pingdom. Here's a look at some of the common page speed issues:
• Host
• Large images
• Embedded videos
• Plugins
• Ads
• Theme
• Widgets
• Repetitive script or dense code