A common pastime for poker players at a poker table is to shuffle stacks of chips. Shuffling chips is performed by starting with two stacks of poker chips, S1 and S2, each stack containing C chips. Each stack may contain chips of several different colors.
The actual shuffle operation is performed by interleaving a chip from S1 with a chip from S2, as described below for C = 5.
The single resultant stack, S12, contains 2 * C chips. The bottommost chip of S12 is the bottommost chip from S2. On top of that chip is the bottommost chip from S1. The interleaving process continues, taking the 2nd chip from the bottom of S2 and placing it on S12, followed by the 2nd chip from the bottom of S1, and so on, until the topmost chip from S1 is placed on top of S12.
After the shuffle operation, S12 is split into 2 new stacks by taking the bottommost C chips from S12 to form a new S1 and the topmost C chips from S12 to form a new S2. The shuffle operation may then be repeated to form a new S12.
For this problem, you will write a program to determine if a particular resultant stack S12 can be formed by shuffling two stacks some number of times.
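As a concrete illustration of one shuffle, here is a minimal C++ sketch of the interleave-and-split step (the function name shuffleOnce and the driver in main are my own; it uses the first sample dataset shown further below):

#include <iostream>
#include <string>
using namespace std;

// One shuffle: interleave the two stacks bottom-up (S2 chip first, then S1 chip),
// then split the 2*C result back into a new S1 (bottom half) and S2 (top half).
void shuffleOnce(string &S1, string &S2)
{
    int C = S1.size();
    string S12;
    for (int i = 0; i < C; i++)
    {
        S12 += S2[i];   // bottommost remaining chip of S2 goes down first
        S12 += S1[i];   // then the corresponding chip of S1
    }
    S1 = S12.substr(0, C);   // bottommost C chips become the new S1
    S2 = S12.substr(C);      // topmost C chips become the new S2
}

int main()
{
    string S1 = "AHAH", S2 = "HAHA";   // first sample dataset
    shuffleOnce(S1, S2);
    cout << S1 + S2 << endl;           // HAAHHAAH after one shuffle
    shuffleOnce(S1, S2);
    cout << S1 + S2 << endl;           // HHAAAAHH, the target, after two shuffles
    return 0;
}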
Input
The first line of input contains a single integer N, (1 ≤ N ≤ 1000) which is the number of datasets that follow.
Each dataset consists of four lines of input. The first line of a dataset specifies an integer C, (1 ≤ C ≤ 100) which is the number of chips in each initial stack (S1 and S2). The second line of each dataset specifies the colors of each of the C chips in stack S1, starting with the bottommost chip. The third line of each dataset specifies the colors of each of the C chips in stack S2 starting with the bottommost chip. Colors are expressed as a single uppercase letter (A through H). There are no blanks or separators between the chip colors. The fourth line of each dataset contains 2 * C uppercase letters (A through H), representing the colors of the desired result of the shuffling of S1 and S2 zero or more times. The bottommost chip’s color is specified first.
Output
Output for each dataset consists of a single line that displays the dataset number (1 through N), a space, and an integer value which is the minimum number of shuffle operations required to get the desired resultant stack. If the desired result cannot be reached using the input for the dataset, display the value negative 1 (−1) for the number of shuffle operations.
Sample Input
2
4
AHAH
HAHA
HHAAAAHH
3
CDE
CDE
EEDDCC
Sample Output
1 2
2 -1
Problem summary: we are given two strings s1 and s2, each of length len, together with a target string s12 of length len*2.
The task is to turn s1 and s2 into s12 by repeatedly applying a fixed transformation and to find the number of transformations needed.
The transformation rule is as follows:
suppose s1 = 12345 and s2 = 67890;
after one transformation the sequence is s = 6172839405.
If s is exactly equal to s12, output the number of transformations performed.
Otherwise, the first half of s becomes the new s1, the second half becomes the new s2, and the process is repeated.
Input
The first line gives T (1 ≤ T ≤ 1000), the number of datasets. Each dataset first gives len (1 ≤ len ≤ 100), then the two strings s1 and s2 of length len, and finally the string s12 of length len*2.
Output
First print the number of the dataset being processed (numbering starts at 1),
then print the number of transformations, followed by a newline.
Note that the two numbers are separated by a space.
For the number of transformations: if s12 is obtained directly with no transformation, output 0; if s12 can never be obtained no matter how many transformations are applied, output -1.
My path to the solution: this is clearly a search that keeps repeating the shuffle process. The question is when to stop.
At first I assumed the -1 case was when the process produced a string identical to the starting one, and that got WA.
Convinced that my overall approach was right, I looked at an editorial and found that the -1 case is when the process produces any state that has already appeared, not necessarily the starting one. Also note that every dataset has to go through the dfs.
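That observation suggests the following loop: keep shuffling, and return -1 as soon as a shuffled configuration repeats, because from that point on the states only cycle. Below is a minimal iterative sketch of that idea using a set of seen states (the function name solveOne and the set name seen are my own; the AC code that follows does the same thing recursively with a map):

#include <iostream>
#include <set>
#include <string>
using namespace std;

// Returns the minimum number of shuffles that turns (s1, s2) into target,
// or -1 if a shuffled state repeats before the target is reached.
int solveOne(string s1, string s2, const string &target)
{
    int len = s1.size();
    set<string> seen;                 // shuffled strings seen so far
    for (int step = 1; ; step++)
    {
        string s;
        for (int i = 0; i < len; i++)
        {
            s += s2[i];
            s += s1[i];
        }
        if (s == target)
            return step;
        if (!seen.insert(s).second)   // insertion fails -> state repeated -> cycle
            return -1;
        s1 = s.substr(0, len);        // bottom half becomes the new s1
        s2 = s.substr(len);           // top half becomes the new s2
    }
}

int main()
{
    cout << solveOne("AHAH", "HAHA", "HHAAAAHH") << endl;  // 2 for the first sample
    cout << solveOne("CDE", "CDE", "EEDDCC") << endl;      // -1 for the second sample
    return 0;
}

Because one shuffle is a fixed permutation of the 2*len positions, the sequence of states must eventually repeat, so this loop always terminates.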
AC code:
#include <cstdio>
#include <cmath>
#include <string>
#include <cstring>
#include <map>
#include <queue>
#include <iostream>
#include <algorithm>
using namespace std;

map<string, int> m;   // how many times each shuffled string has been seen
string s1, s2, ans;   // current stacks and the target stack
string s;             // result of the current shuffle
int h;                // number of chips in each stack
int step;             // number of shuffles performed so far

int dfs()
{
    // Interleave: bottom chip of s2 first, then bottom chip of s1, and so on.
    s = "";
    for (int i = 0; i < h; i++)
    {
        s += s2[i];
        s += s1[i];
    }
    step++;
    m[s]++;
    if (s == ans)
        return step;    // reached the target stack
    if (m[s] > 1)
        return -1;      // a state repeated, so the target is unreachable
    // Split s back into two stacks of h chips each.
    s1 = "";
    s2 = "";
    // s1 = s.substr(0, h);      // I got wrong answers when I switched to these two lines, not sure why - any ideas?
    // s2 = s.substr(h, 2 * h);
    for (int i = 0; i < h; i++)
        s1 += s[i];
    for (int i = h; i < 2 * h; i++)
        s2 += s[i];
    return dfs();               // the recursive result must be returned
}

int main()
{
    int t;
    int tt = 1;
    cin >> t;
    while (t--)
    {
        step = 0;
        m.clear();              // reset the visited map for each dataset
        cin >> h;
        cin >> s1 >> s2;
        cin >> ans;
        cout << tt++ << " " << dfs() << endl;
    }
    return 0;
}