Palash Kumar Daw (পলাশ কুমার দাঁ)
Founder At WebCully



Trying #typography with a mouse cursor. #Calligraphy #lettering #typemetters

Digital #advertising plays an important role in making the #web what it is today…

So you got your staging site indexed? Happens to everyone. Here's a rough guide on fixing it, and suggestions for preventing it.
(thought I'd write this up somewhere)

The fastest way to get the staging site removed from search is to remove it via Search Console. For that, you need to verify ownership via Search Console [1] (ironically, this means you'll likely have to make it accessible to search engines again, or figure out DNS verification, which isn't that common but also not that hard). From there, you can submit a site-removal request [2], which will take the whole hostname out of Google's search for about 90 days. During this time, you can figure out and implement your general plan to block the staging site from search.

My recommendation for staging sites is to block access on the server side, either with server-side / HTTP authentication [3] or IP address whitelisting (IP addresses can change, and this would block you from using tools from home, etc, so it's worth being cautious there and whitelisting rather than blacklisting).
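As a rough sketch of that server-side blocking (assuming nginx here, since the post doesn't name a server; the hostname, IP range, and file paths are illustrative placeholders):

```nginx
# Staging vhost: restrict access by IP and/or HTTP Basic auth.
# staging.example.com, 203.0.113.0/24, and the htpasswd path are placeholders.
server {
    listen 80;
    server_name staging.example.com;

    # IP whitelisting: deny everyone except known addresses.
    allow 203.0.113.0/24;   # e.g. your office range
    deny  all;

    # HTTP Basic authentication.
    auth_basic           "Staging";
    auth_basic_user_file /etc/nginx/.htpasswd;

    # satisfy any;  # uncomment to accept EITHER a whitelisted IP
                    # OR valid credentials, instead of requiring both

    location / {
        proxy_pass http://127.0.0.1:8080;
    }
}
```

With `satisfy any;` uncommented, a valid login still gets you in from an unlisted address, which softens the "IP addresses can change" problem mentioned above.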

I don't like the alternatives. Using a page-level or HTTP response noindex [4] means the pages need to be accessible (open to competitors, scrapers, etc.). Using robots.txt [5] [6] means you need to remember to change the robots.txt when moving from staging to production (another common source of problems), and can result in URLs being indexed without their content (URLs blocked by robots.txt may still be indexed, even if their content is never seen).
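For reference, the HTTP response noindex [4] boils down to a single response header. A minimal sketch, assuming a plain WSGI app (the app and page body here are hypothetical, not from the post):

```python
def app(environ, start_response):
    """Hypothetical WSGI app for a staging site: every response carries
    an HTTP-level noindex, so pages stay reachable but get dropped from
    search indexes once recrawled."""
    body = b"<html><body>Staging page</body></html>"
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        # HTTP equivalent of <meta name="robots" content="noindex">,
        # and it also works for non-HTML resources like PDFs.
        ("X-Robots-Tag", "noindex, nofollow"),
        ("Content-Length", str(len(body))),
    ]
    start_response("200 OK", headers)
    return [body]
```

The tradeoff is exactly the one above: the pages must remain crawlable for the header to be seen at all.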

Regardless of the method, if you're not using the site-removal request, keep in mind that crawling happens at the page level and can take time, especially if we're not sure about the importance of your staging site (which is usually the case). It's normal for URLs not to be recrawled for months, so if you add any block at the URL level, it can easily take half a year or longer to be fully processed for all URLs. The site-removal request covers you for most of that time, and you can submit another one should you need to extend it.

What did I miss? Which is your favorite setup?


Machine learning (#ML) has become an increasingly powerful #tool, one that can be applied to a wide variety of areas spanning object recognition, language translation, health and more. However, the #development of ML systems is often restricted to those with computational resources and the #technical expertise to work with commonly available ML libraries.


The new #SearchConsole: a sneak peek at two experimental features

Search Console was initially launched with just four reports more than a decade ago. Today, the product includes more than two dozen reports and tools covering AMP, structured data, and live testing, all designed to help improve your site's performance on Google Search. Now we have decided to embark on an extensive redesign to better serve you, our users. Our hope is that this redesign will provide you with:

More actionable insights - We will now group the identified issues by what we suspect is the common “root-cause” to help you find where you should fix your code. We organize these issues into tasks that have a state (similar to bug tracking systems) so you can easily see whether the issue is still open, whether Google has detected your fix, and track the progress of re-processing the affected pages.
Better support of your organizational workflow - In talking to many organizations, we've learned that multiple people are typically involved in implementing, diagnosing, and fixing issues. This is why we are introducing sharing functionality that allows you to pick up an action item and share it with other people in your group, like developers who will get references to the code in question.
Faster feedback loops between you and Google - We've built a mechanism to allow you to iterate quickly on your fixes, rather than wasting time waiting for Google to recrawl your site only to tell you later that it's not fixed yet. Instead, we'll provide on-the-spot testing of fixes and automatically speed up crawling once we see things are OK. Similarly, the testing tools will include code snippets and a search preview, so you can quickly see where your issues are, confirm you've fixed them, and see how the pages will look on Search.

In the next few weeks, we're releasing two exciting BETA features from the new Search Console to a small set of users — Index Coverage report and AMP fixing flow.
The new Index Coverage report shows the count of indexed pages and information about why some pages could not be indexed, along with example pages and tips on how to fix indexing issues. It also enables a simple sitemap submission flow, and the ability to filter all Index Coverage data to any of the submitted sitemaps. Here's a peek at our new Index Coverage report:
The new AMP fixing flow
The new #AMP fixing experience starts with the AMP Issues report. This report shows the current AMP issues affecting your site, grouped by the underlying error. Drill down into an issue to get more details, including sample affected pages. After you fix the underlying issue, click a button to verify your fix, and have Google recrawl the pages affected by that issue. Google will notify you of the progress of the recrawl, and will update the report as your fixes are validated.
As we start to experiment with these new features, some users will be introduced to the new redesign over the coming weeks.
