Microsoft Flow (delete files and folders from a document library)

Do you know how to delete files and folders from a SharePoint Online document library using Microsoft Flow?

When I received the task to delete files and folders older than 30 days from SharePoint Online with Microsoft Flow, I thought it would be an easy one, but I was wrong 😒

I used a Recurrence trigger (run every 30 days) and the Get items, Apply to each and Delete item actions to select and delete all files and folders older than 30 days:


But there is a problem with this configuration if the document library contains folders with files in them. Get items collects all folders and files, and each of them has its own internal ID. The flow deletes a folder together with the files inside it, and later it still tries to delete those files individually, even though they disappeared when the folder was deleted, so it throws an error: 404 item not found. The flow fails, although all the files and folders do get deleted.

How to avoid this error? We need to handle files and folders separately: first get all the files older than 30 days, then get the folders.


Get items (only files) has the following configuration (first you need to enable the Filter Query from Advanced parameters):
  • Created should be less than addDays(utcNow(), -30, 'yyyy-MM-dd') and FSObjType should equal 0 (0 is for files, 1 is for folders) — see the example filter query below
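For reference, the resulting OData filter looks roughly like this; treat it as a sketch, because the addDays(...) part has to be inserted as a Flow expression rather than typed as literal text, and the exact interpolation syntax depends on how you build the query:

    Created lt '@{addDays(utcNow(), -30, 'yyyy-MM-dd')}' and FSObjType eq 0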

After you have all the files created more than 30 days ago, you run an Apply to each loop over them and delete them.

The last step is to select only the folders older than 30 days and delete them; the setup looks like the screenshot below, and the filter query changes only in the FSObjType value:
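As a sketch, the folders-only filter is the same as for the files, just with FSObjType eq 1:

    Created lt '@{addDays(utcNow(), -30, 'yyyy-MM-dd')}' and FSObjType eq 1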


Hope this article helps you solve this issue. Please share your feedback.

[ATTENTION] Be careful with Copy-SPSite and Colligo!

Analysis of Copy-SPSite and Colligo

Folks, if you use Colligo in your organization, be careful with Copy-SPSite. Otherwise you may end up with Colligo being unable to sync some sites.
According to Microsoft's description of Copy-SPSite:

Use the Copy-SPSite cmdlet to make a copy of a site collection from an implied source content database to a specified destination content database. The copy of the site collection has a new URL and a new SiteID. When you have database snapshot capabilities on a computer running SQL Server, a temporary snapshot of the source database is created for the duration of the copy to prevent any data changes during the copy process. If you do not have database snapshot capabilities on the server running SQL Server, you can back up the source and restore it to the destination to get the same result.

Copy-SPSite creates a new site with a new URL and a new SiteID, but the WebID stays the same: the original site and the cloned one share the same WebID. This is exactly what confuses Colligo, because Colligo relies on the WebID as a unique identifier.
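For reference, a Copy-SPSite call looks like this (the URLs and the content database name below are just placeholders):

    Copy-SPSite http://sharepoint/sites/original -DestinationDatabase WSS_Content_Copy -TargetUrl http://sharepoint/sites/copy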

How to verify that? Here are the steps:

1. Find out the "Storage Location" for Colligo Briefcase: go to View -> Options.
2. Go to that location; you will see the "Sites.db" file:

3. Close Colligo, open the file with "DB Browser for SQLite" and open the Webs table.
Under Server Name you will find the WebID of the site, and if you go to the SharePoint server you will see it is the same GUID returned by Get-SPWeb. Compare the WebIDs of the original site and the one created with Copy-SPSite ... they are the same. The logic used by Colligo therefore does not allow you to synchronize sites created with Copy-SPSite.
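You can do the same comparison on the SharePoint side with PowerShell (the URLs below are hypothetical):

    # Compare the WebID of the original site and the copy created with Copy-SPSite
    (Get-SPWeb http://sharepoint/sites/original).ID
    (Get-SPWeb http://sharepoint/sites/copy).ID
    # Both commands return the same GUID, even though the SiteID and the URL differ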

Interestingly, OneDrive for Business does not have this problem and synchronization works fine. Why? Because Microsoft uses a different logic for the unique identifier. Normally the OneDrive for Business local cache Access database is located at C:\Users\Pavlov Aleksandar\AppData\Local\Microsoft\Office\15.0\OfficeFileCache\CentralTable.accdb. Open the database in Access and then open the MasterFile table.


As you can see, the logic Microsoft uses in OneDrive for Business for storing the site information is GUID/relative path, in this case STID${CB935F8C-1D73-4BEC-B54E-D2E23D013CDC}/sites/loadtest22.

At first it was too strange for me to believe that the WebID of two or more sites could be the same; in my opinion Microsoft changed this logic a while ago. But this may become a huge problem for companies using Colligo to synchronize files between SharePoint and local PCs if nobody at Colligo pays attention to it.

Hope you will find this post helpful.

If so please share it.
-----------------------------------------------------------------------------

Deploying new solutions in SharePoint 2013 takes too much time

One of my customers has TEST, QA and PROD environments, where QA and PROD are identical: the same number of WFE servers (in this case 2), the same configuration and settings, so if PROD goes down we can switch to QA in minutes and it will behave the same.

What happened: for a while now, deploying a new version of the solutions on PROD had been taking far too long, around 1.5 hours, compared to roughly 15 minutes in the past and in QA. It is horrible to sit in front of the monitor and wait for a slow deployment to finish. Yes, it finished without errors, but the waiting was killing me, and I wasn't aware of any infrastructure-level changes that could cause this slowness.

I have to mention that there wasn't any slowness on the sites, and locally on the servers everything was working fine: no high memory or CPU consumption, nothing that could tell you why the deployment was so slow.

In the QA environment the deployment time was normal, around 15-20 minutes, so this also excluded the .wsp files from the list of suspects.

I started looking for a solution on the internet and was surprised how many articles and posts there are about SharePoint slowness, sites loading too slowly and so on, but nothing related to my case. That's why I decided to write this post, because I was able to find the root cause.

Incidentally, I opened RDP sessions to both WFE servers next to each other, and guess what:

Can you see where the problem is in the picture above?

Yes, that is right. 

The two servers had different times: the second server was 5 minutes behind the real time.

OK, I thought this could potentially be the reason and went to check the time zones, but on both servers the time zones were correct; only the time itself was off. After a while I checked the services and found out that the "Windows Time" service on both machines was set to Disabled. I checked QA, and there the service was running. Enabling the service and setting it to Automatic was the fix I needed.

In the future, if you have a slow deployment, make sure you check the "Windows Time" service, set it to Automatic and start it.
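If you prefer to script the check instead of going through services.msc, something along these lines on each WFE should do it:

    # Check the state of the Windows Time service
    Get-Service w32time

    # Set it to Automatic, start it and force a time resync
    Set-Service w32time -StartupType Automatic
    Start-Service w32time
    w32tm /resync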

Hope you will find this post helpful, if so please share it.

Enjoy!

Proactively increase the max site storage quota

With this post I would like to show you my way of proactively increasing the site storage quota. My task was to eliminate the need for users to request a site storage quota increase. I wrote a small PowerShell script which you can download from here

The script checks the current site size, compares it with the assigned max site storage quota, and if more than 80% of the storage is occupied the site is flagged as a potential candidate for a quota increase. Of course you can change this value to whatever suits you. You will also see the assigned quota template, either a custom quota you created or "No Template Applied", which means an "Individual" quota has been assigned; in that case you still need to increase the assigned max storage size. Read-only sites are displayed in the report only if the warning quota is reached (in my case the warning quota is 1024 MB), so you have to switch them back to Read/Write before you change the quota. If necessary you can change the warning quota limit inside the script (please read the hints/comments in the script).
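The core of the check is quite simple. Here is a minimal sketch of the idea, not the script itself (the downloadable script also handles the quota templates, the read-only sites and the email report):

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    $threshold = 0.8   # flag sites using more than 80% of their max quota

    Get-SPSite -Limit All | ForEach-Object {
        $maxBytes  = $_.Quota.StorageMaximumLevel
        $usedBytes = $_.Usage.Storage

        if ($maxBytes -gt 0 -and ($usedBytes / $maxBytes) -gt $threshold) {
            [PSCustomObject]@{
                Url         = $_.Url
                UsedMB      = [math]::Round($usedBytes / 1MB)
                MaxMB       = [math]::Round($maxBytes / 1MB)
                PercentUsed = [math]::Round(($usedBytes / $maxBytes) * 100, 1)
            }
        }
    }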

The output looks like:

The output is stored in a file and sent via email. Please check the settings in the script to define who receives the report. Reports older than 2 days are removed from the folder.

So give it a try and let me know if it works or not.

Enjoy!

Could not retrieve profile schema from server

How many of you have faced this strange behavior of SharePoint Designer 2013?


It happens when trying to use User Profile as a data source in a workflow, something similar to the screenshot below:

Searching Google did not give me the expected solution. A lot of people suggested granting the web application pool account permissions on the search service database. But what does that actually mean? I granted read permissions on all search-related databases, but as you can guess, the problem persisted.

So many articles referred to Search that I started to think the problem really was there, yet none of the explanations was exactly correct. The right way to make it work is:
  1. Go to Central Admin
  2. Go to Manage Service Applications
  3. Select Search Service Application, but don't click on it
  4. Select Administrators
  5. Add the application pool account of the web application
  6. Grant "Read (Diagnostic Pages Only)" permission and click OK


Afterwards everything should work normally.
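If you prefer to script it, the same permission can be granted with PowerShell along these lines. The account name is a placeholder and I have not verified the exact rights string, so compare it with what $security.NamedAccessRights returns on your farm:

    $ssa = Get-SPEnterpriseSearchServiceApplication
    $security = Get-SPServiceApplicationSecurity $ssa -Admin
    $principal = New-SPClaimsPrincipal "DOMAIN\WebAppPoolAccount" -IdentityType WindowsSamAccountName
    # Rights string assumed to match "Read (Diagnostic Pages Only)" in the UI; verify before use
    Grant-SPObjectSecurity $security $principal "Read Diagnostic Pages"
    Set-SPServiceApplicationSecurity $ssa $security -Admin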

For now I don't have an exact explanation of why these settings are necessary. I will try to find out and post it here as a comment.

[HOW TO] understand who deleted a SharePoint 2013 site

Very interesting task: how to find out who deleted a site/site collection in SharePoint 2013. A few days ago I was asked to restore a site which had been deleted 1 or 2 days earlier. Easy task: I found the site with Get-SPDeletedSite and restored it with Restore-SPDeletedSite. But the next day I received a request to restore the same site again; it had been deleted once more, and Get-SPDeletedSite does not tell you who performed the deletion.
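For reference, the restore part is just a couple of cmdlets (the GUID below is a placeholder):

    # List the deleted site collections still kept in the farm
    Get-SPDeletedSite | Select-Object SiteId, Path, DeletionTime

    # Restore the one you need by its SiteId
    Restore-SPDeletedSite -Identity 610857cb-8414-4a89-8bf3-ad3628f6c86c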
I decided I had to find out who deleted the site, so I downloaded the IIS logs for the corresponding day (the SPDeletedSite object tells you the date the site was deleted) and started analyzing them. But the logs are huge.
Opening them with Notepad or Notepad++ is not going to work. So what next? Fortunately there is LogParser, the best IIS log analysis tool for any SharePoint administrator. Spare some time and play with LogParser; it will definitely save you time, help you a lot in your day-to-day work and, of course, impress your manager with its capabilities.
Then I had to find out how the site was deleted, in order to build a LogParser query to run over the IIS log files. There are two different ways to delete a site:
  1. From the Site Collection settings menu (deleteweb.aspx), or
  2. From Central Administration (delsite.aspx)
Since I didn't know which way the site had been deleted, I built two LogParser queries:
  • LogParser.exe -i:W3C -o:W3C "select date,time, s-sitename, s-computername, s-ip, cs-method, cs-uri-stem, cs-username into 'C:\Tools\deletedSite.txt' from 'C:\Tools\u_ex*.log' where cs-uri-stem like '%delsite%'"
  • LogParser.exe -i:W3C -o:W3C "select date,time, s-sitename, s-computername, s-ip, cs-method, cs-uri-stem, cs-username into 'C:\Tools\deletedSite.txt' from 'C:\Tools\u_ex*.log' where cs-uri-stem like '%deleteweb%'"
In my case the site had been deleted from the Site Settings menu (deleteweb.aspx), and the result is as follows:

If you compare the DeletionTime (from SPDeletedSite) with the Date and Time from the IIS logs, you will see an exact match, and there are no other entries for deleteweb.aspx in the IIS logs. This proves that this user deleted the site, and that he/she did it from the Site Settings menu (because of the deleteweb.aspx).
Hope this will help someone; if so, share this post or leave a comment.
And this is definitely my last post for this year.
Enjoy the holidays and see you in 2017.

Windows 10 Data Deduplication after the Anniversary Update

This could be one of my last posts for this year.

This post is outside the SharePoint world, but closely related. I'm using Windows 10 on my laptop (with 32 GB of RAM), where my virtual test machines (SharePoint 2013 and 2016) run, a pretty usual environment for every SharePoint enthusiast. Most of you know the performance problem when 3 VMs are running on a laptop. That's why I bought 2 additional SSDs (120 GB each) and put them in my laptop. So my OS runs on one SSD, and I have another SSD (120 GB) and an HDD (1 TB) as a playground for the SharePoint VMs, but the space on the SSD is insufficient.

Most of you already know about Data Deduplication, a really good feature which allows you to store more data than you normally could. I'm not going to explain here what Deduplication is and how it works; if you are interested, you can look it up on Google.

I'm very happy with this feature/technology even though it is not released for home use. Officially it is available only in the Microsoft server products, but there is a way to install/activate it on your local Windows 10.

One day my machine needed to install some updates, and afterwards I was shocked to find that my SSD drive, where Deduplication was enabled, was not accessible anymore. OK, I accepted the fact that I would lose the data on the disk and had to format it. But I definitely needed this feature and started searching for a solution. Unfortunately several days passed without success; I tried a lot of suggestions, but none of them worked. Pretty sad.

Fortunately I posted a question on the MSDN forum and, with a lot of reading, found a solution. You just need to download these files from here (not working any more) and install them on your machine; then you will be able to enable Deduplication on the drive you want. My machine had Windows 10 Pro version 1607, OS build 14393.222.
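Once the packages are installed, enabling Deduplication on a drive is a matter of a few cmdlets. D: is just my data drive here; adjust the drive letter and the usage type to your setup:

    Import-Module Deduplication

    # Enable deduplication on the VM drive and kick off a first optimization run
    Enable-DedupVolume -Volume "D:" -UsageType Default
    Start-DedupJob -Volume "D:" -Type Optimization

    # Check the savings afterwards
    Get-DedupStatus -Volume "D:"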

So give it a try if you want the Data Deduplication feature on your Windows 10, let me know whether it works for you, and share this post if you think it could be helpful for others.

Enjoy the holidays; I wish all the best to you and your family.
Hope to see you soon in 2017!