Comparing two slices - what am I doing wrong?

I have two slices:
unique: [dynamic_content segmentation ai]
relations: [dynamic_content segmentation ai ai segmentation]

I want to print the difference between them. When I run this:

{{ $unique | symdiff $relations }}
{{ $relations | symdiff $unique }}
{{ $unique | complement $relations }}
{{ $relations | complement $unique }}

All I get is 4 empty slices. Trying to run intersect returns the correct values - a collection of common elements.

I also tried this syntax:
{{ collections.SymDiff $unique $relations}}
Without any success.

What am I missing? Which function should I use and how?

Can you share more of your code, or a link to your GitHub repo?

How many items do you have in $unique and $relations?

I used something similar with paging:

{{ $r1 := where .Data.Pages "ExpiryDate" ">=" now }}
{{ $r2 := where .Data.Pages "Params.ExpiryDate" "!=" nil }}
{{ $paginator := .Paginate ( intersect $r1 $r2) }} 
{{ range $paginator.Pages -}}
	{{ .Render "summary" }}
{{- end }}
{{ partial "pagination" . }}

My beginning is a file with my own Param in the metadata:
features: [dynamic_content, segmentation, ai, ai, segmentation]

This can be much longer and include many repetitions of each item.
What I want to achieve is that Hugo lets me know when there are duplicates and what they are exactly, as part of a bigger partial that parses this metadata and creates links to other files.

{{ $relations := .Page.Param "features" }}
{{ $unique := $relations | uniq }}
<!-- Locate duplicate values in param array -->
{{ if not (eq $relations $unique)}}
    {{ $duplicates := (collections.SymDiff $unique $relations) }}
    {{ printf "Duplicate feature(s) %s found in %s" $duplicates .Page.File }}
{{ end }}

The output is
Duplicate feature(s) [] found in article-test

Given the following sets:

{{ $a := slice "zero" "one" "one" "two" "three" "five" "eight"}}
{{ $b := slice "two" "four" "six" "eight"  }}

{{ $unique := slice "dynamic_content" "segmentation" "ai" }}
{{ $relations := slice "dynamic_content" "segmentation" "ai" "ai" "segmentation" }}

And looking at the docs:

complement: gives the elements of a collection that are not in any of the others.

{{ $a | complement $b }} : [zero one one three five]

Both of these below return [] because each element in each set is also found in the other set.

{{ $unique | complement $relations }}
{{ $relations | complement $unique }}

symdiff: returns the symmetric difference of two collections.

{{ $a | symdiff $b }} : [zero one one three five four six]

These two below will return [] because again, "dynamic_content" "segmentation" and "ai" are in both sets. Once you remove instances of these from each set, you are left without any other elements.

{{ $unique | symdiff $relations }}
{{ $relations | symdiff $unique }}

Is this not then the function you are looking for?


Ok, I get why it doesn’t do what I thought it would. Thanks!

Intersect is not what I’m looking for, I just ran it to see if maybe there’s something completely wrong with my code.

I’ll have to figure out some other way to list the duplicates.

Given your two sample arrays, what result would you want/expect to be displayed?

Take a look at $relations - I need a list of items that occur more than once in that array.

$unique is there only because I thought I could use it for symdiff comparison, it’s a by-product of $relations. It starts to feel like a dead-end, though.

I have another idea to test tomorrow at work: range over $unique, and on each iteration delete the current value from $relations. If I’m thinking correctly, it will only delete the first occurrence it finds, which should leave me with an array of items that occur more than once. If an item is repeated three times, two of those will remain in my final array, and that’s fine for my usage scenario.
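For the record, here is a rough sketch of that idea in template code (assuming Hugo 0.48+ for the `=` assignment operator; variable names are my own):

```go-html-template
{{ $relations := slice "dynamic_content" "segmentation" "ai" "ai" "segmentation" }}
{{ $unique := $relations | uniq }}
{{ $remaining := $relations }}
{{ range $v := $unique }}
  {{/* rebuild $remaining without the first occurrence of $v */}}
  {{ $removed := false }}
  {{ $next := slice }}
  {{ range $remaining }}
    {{ if and (not $removed) (eq . $v) }}
      {{ $removed = true }}
    {{ else }}
      {{ $next = $next | append . }}
    {{ end }}
  {{ end }}
  {{ $remaining = $next }}
{{ end }}
{{/* $remaining should now hold only the extra occurrences, e.g. [ai segmentation] */}}
```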

Another approach might be to iterate over $relations, count the occurrences of each value, save the value-to-count pairs into a map, and then output only the values with more than one occurrence from that map.
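The counting idea could look something like this (a sketch, assuming a recent Hugo with `merge`, `dict`, and `default` available):

```go-html-template
{{ $relations := slice "dynamic_content" "segmentation" "ai" "ai" "segmentation" }}
{{/* build a map of value -> occurrence count */}}
{{ $counts := dict }}
{{ range $relations }}
  {{ $counts = merge $counts (dict . (add (index $counts . | default 0) 1)) }}
{{ end }}
{{/* keep only the values that occur more than once */}}
{{ $duplicates := slice }}
{{ range $k, $n := $counts }}
  {{ if gt $n 1 }}{{ $duplicates = $duplicates | append $k }}{{ end }}
{{ end }}
{{/* $duplicates should be e.g. [ai segmentation] */}}
```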

I’ll post here when I check these ideas.